936 results for errors-in-variables model
Abstract:
There is a family of models with physical capital, human capital and R&D for which convergence properties have been discussed (Arnold, 2000a; Gómez, 2005). However, spillovers in R&D have been ignored in this context. We introduce spillovers into this model and derive its steady-state and stability properties. This new feature implies that the model is characterized by a system of four differential equations. A unique balanced growth path along with a two-dimensional stable manifold is obtained under simple and reasonable conditions. The transition toward the steady state is oscillatory for plausible parameter values.
Abstract:
Dissertation submitted to obtain the degree of Doctor in Informatics Engineering
Visual function and reading performance in children in the first cycle of basic education in the municipality of Lisbon
Abstract:
ABSTRACT - This thesis aims to contribute to the study of visual function anomalies and their influence on reading performance. Its objectives were: (1) to identify the prevalence of visual function anomalies; (2) to characterise reading performance in children with and without visual function anomalies; (3) to identify how visual function anomalies influence reading performance; and (4) to identify the impact of the variables that determine reading performance. A convenience sample was collected of 672 children in the first cycle of basic education from 11 schools in the municipality of Lisbon, aged 6 to 11 years (7.69±1.19), together with 670 parents/guardians and 34 teachers. Three instruments were used for data collection: two closed-question questionnaires, a visual function assessment, and a reading test of 34 words. After examination, the children were classified into two groups: normal visual function (FVN=562) and impaired visual function (FVA=110). A prevalence of 16.4% of children with impaired visual function was identified. In the reading test, these children read fewer words correctly (FVA=31.00; FVN=33.00; p<0.001) and with lower accuracy (FVA=91.18%; FVN=97.06%; p<0.001). The same tendency was observed when comparing the four school years. Children with impaired visual function showed a tendency to omit letters and to confuse graphemes. Reading fluency (FVA=24.71; FVN=27.39; p=0.007) was lower in children with impaired visual function in every school year except the 3rd. Children with uncorrected hyperopia (p=0.003) and astigmatism (p=0.019) read fewer words correctly (30.00; 31.00) and with lower accuracy (88.24%; 91.18%) than children without a significant refractive error (32.00; 94.12%). School performance as rated by teachers was lower in children with impaired visual function, and more than a quarter of them required special support measures at school.
No significant differences were found in the reading performance of children with impaired visual function across groups of parents'/guardians' educational attainment. The risk of impaired reading performance was found to be higher [OR=4.29; 95% CI (2.49; 7.38)] in children with impaired visual function. Relative to the 1st school year, the 2nd, 3rd and 4th years showed a lower risk of impaired reading performance. The variables teaching method, parents'/guardians' educational attainment, type of school (public/private), teacher's age and teacher's years of experience were not statistically significant factors in explaining impaired reading performance once the effect of visual function was included in the model. Poor reading performance was defined as accuracy below 90%. This indicator can be used to identify at-risk children who need an orthoptic/ophthalmological examination to confirm or rule out visual function anomalies. This work contributes to the identification of children at an educational disadvantage due to treatable visual function anomalies, and proposes a model intended to guide teachers in identifying children with poor reading performance.
Abstract:
New therapeutic alternatives against leishmaniasis remain a priority. The activity of azithromycin against Leishmania (Leishmania) major has been previously demonstrated. Different responses among species of Leishmania make species-specific drug screening necessary. The activity of azithromycin against Leishmania (Viannia) braziliensis and Leishmania (Leishmania) amazonensis was evaluated in golden hamsters infected through footpad injections of metacyclic promastigotes, and compared with untreated controls and animals treated with meglumine antimoniate. Footpad thickness, lesion cultures and dissemination sites were analyzed. Treatment of golden hamsters with oral azithromycin at 450 mg/kg had no activity against infections with Leishmania (Leishmania) amazonensis. For infections due to Leishmania (Viannia) braziliensis, azithromycin demonstrated significant activity relative to untreated controls, but inferior to meglumine antimoniate, for controlling lesion size. Neither drug was able to totally eliminate parasites from the lesions. It was concluded that azithromycin has activity against Leishmania (Viannia) braziliensis but not against Leishmania (Leishmania) amazonensis in this model.
Abstract:
The corporate world is becoming more and more competitive, which leads organisations to adapt by adopting more efficient processes that reduce costs and increase product quality. One of these processes consists in making proposals to clients, which necessarily include a cost estimate for the project. This estimation is the main focus of this work. In particular, one of the goals is to evaluate which estimation models best fit the Altran Portugal software factory, the organisation where the fieldwork of this thesis was carried out. There is no broad agreement about which type of estimation model is most suitable for software projects. In contexts where plenty of objective information is available as input to an estimation model, model-based methods usually yield better results than expert judgement. More frequently, however, this volume and quality of information is not available, which degrades the performance of model-based methods and favours the use of expert judgement. In practice, most organisations use expert judgement, making themselves dependent on the expert. A common problem is that the accuracy of an expert's estimate depends on his or her previous experience with similar projects, so when new types of projects arrive, the estimate will have unpredictable accuracy. Moreover, different experts will make different estimates based on their individual experience. As a result, the company does not directly accumulate knowledge about how estimates should be carried out. Estimation models depend on the input information collected from previous projects, the size of the project database and the resources available. Altran currently does not store input information from previous projects in a systematic way; it has a small project database and a team of experts.
Our work is targeted at companies that operate in similar contexts. We start by gathering information from the organisation in order to identify which estimation approaches can be applied in its context. A gap analysis is used to understand what type of information the company would have to collect for other approaches to become available. Based on our assessment, expert judgement is, in our opinion, the most adequate approach for Altran Portugal in the current context. We analysed past development and evolution projects from Altran Portugal and assessed their estimates. This resulted in the identification of common estimation deviations, errors and patterns, which led to the proposal of metrics that help estimators produce estimates leveraging quantitative and qualitative information from past projects in a convenient way. This dissertation aims to contribute to more realistic estimates by identifying shortcomings in the current estimation process and by supporting the self-improvement of the process through gathering as much relevant information as possible from each finished project.
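One family of deviation metrics commonly used when assessing past software estimates is the magnitude of relative error (MRE) and its mean over a portfolio (MMRE). This is offered only as an illustration of the kind of metric involved, not as the dissertation's actual proposal; the project figures below are hypothetical.

```python
# MRE/MMRE: standard deviation metrics in software effort estimation.
def mre(actual, estimate):
    """Magnitude of relative error for one project."""
    return abs(actual - estimate) / actual

def mmre(actuals, estimates):
    """Mean MRE across a portfolio of finished projects."""
    return sum(mre(a, e) for a, e in zip(actuals, estimates)) / len(actuals)

# Three finished projects (person-days) -- illustrative numbers only.
actuals = [100.0, 250.0, 80.0]
estimates = [120.0, 200.0, 80.0]
print(round(mmre(actuals, estimates), 3))  # -> 0.133
```

A low MMRE indicates that, on average, past estimates tracked actual effort closely; tracking it per project type is one simple way to make estimation knowledge survive individual experts.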
Abstract:
This work models the competitive behaviour of individuals who maximize their own utility by managing their network of connections with other individuals. Utility is taken as a synonym of reputation in this model. Each agent decides over two variables: the quality of connections and the number of connections. Hence, the reputation of an individual is a function of the number and quality of connections within the network. On the other hand, individuals incur a cost when they improve their network of contacts. The initial quality and number of connections of each individual are distributed according to a given initial distribution. Competition occurs in continuous time among a continuum of agents. A mean field game approach is adopted to solve the model, leading to an optimal trajectory for the number and quality of connections for each individual.
Abstract:
This paper considers model spaces in an H^p setting. The existence of unbounded functions and the characterisation of maximal functions in a model space are studied, and decomposition results for Toeplitz kernels, in terms of model spaces, are established.
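For background, the classical Hilbert-space (p = 2) definitions of the objects named above are standard textbook material, stated here only as a point of reference and not taken from the paper:

```latex
% Model space attached to an inner function \theta (p = 2 case):
K_\theta = H^2 \ominus \theta H^2 .
% Toeplitz operator with symbol g, P_+ the Riesz projection onto H^2,
% and its kernel:
T_g f = P_+(g f), \qquad \ker T_g = \{ f \in H^2 : P_+(g f) = 0 \}.
```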
Abstract:
OBJECTIVE: To investigate preoperative predictive factors of severe perioperative intercurrent events and in-hospital mortality in coronary artery bypass graft (CABG) surgery and to develop specific risk-prediction models for these events, mainly those that can be modified in the preoperative period. METHODS: We prospectively studied 453 patients who had undergone CABG. Factors independently associated with the events of interest were determined with multiple logistic regression and the Cox proportional hazards regression model. RESULTS: The mortality rate was 11.3% (51/453), and 21.2% of the patients had 1 or more perioperative intercurrent events. In the final model, the following variables remained associated with the risk of intercurrent events: age ≥ 70 years, female sex, hospitalization via SUS (Sistema Único de Saúde, the Brazilian public health system), cardiogenic shock, ischemia, and dependence on dialysis. Using multiple logistic regression for in-hospital mortality, the following variables participated in the risk-prediction model: age ≥ 70 years, female sex, hospitalization via SUS, diabetes, renal dysfunction, and cardiogenic shock. According to the Cox regression model for death within 7 days after surgery, the following variables remained associated with mortality: age ≥ 70 years, female sex, cardiogenic shock, and hospitalization via SUS. CONCLUSION: Aspects linked to the structure of the Brazilian health system appeared as factors of great impact on the results obtained, indicating that the events investigated also depend on factors unrelated to the patient's intrinsic condition.
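Risk factors from a multiple logistic regression like the one above are typically reported as odds ratios with Wald confidence intervals derived from the fitted coefficient and its standard error. A minimal sketch of that conversion; the `beta` and `se` values are hypothetical, not results from this study.

```python
import math

# Odds ratio and 95% Wald CI from a logistic-regression coefficient.
def odds_ratio_ci(beta, se, z=1.96):
    or_ = math.exp(beta)             # point estimate: OR = exp(beta)
    lo = math.exp(beta - z * se)     # lower 95% bound
    hi = math.exp(beta + z * se)     # upper 95% bound
    return or_, lo, hi

# Illustrative coefficient only (not from the CABG study).
or_, lo, hi = odds_ratio_ci(beta=0.8, se=0.3)
print(f"OR={or_:.2f}, 95% CI ({lo:.2f}; {hi:.2f})")
```

An OR above 1 with a CI excluding 1 indicates a statistically significant risk factor at the 5% level.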
Abstract:
This comment corrects errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function; at the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may lie outside the interval [0,1]. We solve the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
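The problem the comment identifies arises because a Nadaraya-Watson-type probability estimate is only guaranteed to be a convex combination of the binary outcomes when all kernel weights are nonnegative; a kernel that takes negative values (e.g. a higher-order kernel) breaks that guarantee. A toy demonstration, with data points and kernels chosen purely for illustration, not from Martins (2001):

```python
import math

def gauss(u):                        # standard Gaussian kernel, K >= 0
    return math.exp(-u * u / 2) / math.sqrt(2 * math.pi)

def gauss4(u):                       # fourth-order kernel: negative for |u| > sqrt(3)
    return 0.5 * (3 - u * u) * gauss(u)

def nw(x0, xs, ys, kernel, h=1.0):
    """Nadaraya-Watson estimate of E[y | x = x0]."""
    w = [kernel((xi - x0) / h) for xi in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# Binary outcomes: one participant at x=0, two non-participants at x=2.
xs, ys = [0.0, 2.0, 2.0], [1, 0, 0]
p_pos = nw(0.0, xs, ys, gauss)       # nonnegative weights: stays in [0, 1]
p_neg = nw(0.0, xs, ys, gauss4)      # negative weights: exceeds 1
print(p_pos, p_neg)
```

With the nonnegative kernel the estimate is a valid probability; with the fourth-order kernel the negative weights on the non-participants push the estimate above 1, which is exactly what local smoothing is meant to repair.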
Abstract:
INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD of all women, and one measuring BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost effective than "screen women at risk." For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratio of these two strategies compared with the reference was 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. 
Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on cost-effectiveness ratios of the usual screening strategies for osteoporosis.
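Incremental cost-effectiveness ratios such as the 4,235 and 8,290 euro figures quoted above follow the standard definition: extra cost divided by extra effect relative to the reference strategy. A minimal sketch with hypothetical numbers, not the study's actual inputs:

```python
# ICER = (cost of strategy - cost of reference) / (effect of strategy - effect of reference)
def icer(cost, effect, cost_ref, effect_ref):
    return (cost - cost_ref) / (effect - effect_ref)

# Hypothetical: screening costs 400 EUR more per woman and yields 0.10 extra
# hip-fracture-free years versus "no screening" over the 10-year horizon.
ratio = icer(cost=700.0, effect=9.60, cost_ref=300.0, effect_ref=9.50)
print(ratio)
```

The resulting cost per hip-fracture-free year gained is then compared against a willingness-to-pay threshold to decide whether the screening strategy is worth adopting.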
Abstract:
A 3D in vitro model of rat organotypic brain cell cultures in aggregates was used to investigate neurotoxicity mechanisms in glutaric aciduria type I (GA-I). 1 mM glutarate (GA) or 3-hydroxyglutarate (3OHGA) was repeatedly added to the culture media at two different time points. In cultures treated with 3OHGA, we observed an increase in lactate in the medium, pointing to a possible inhibition of the Krebs cycle and respiratory chain. We further observed that 3OHGA, and to a lesser extent GA, induced an increase in ammonia production with a concomitant decrease in glutamine concentrations, which may suggest an inhibition of the astrocytic enzyme glutamine synthetase. These previously unreported findings may uncover a pathogenic mechanism in this disease, which has deleterious effects on early stages of brain development. By immunohistochemistry we showed that 3OHGA increased non-apoptotic cell death. At the cellular level, 3OHGA, and to a lesser extent GA, led to cell swelling and loss of astrocytic fibers, whereas a loss of oligodendrocytes was observed only for 3OHGA. We conclude that 3OHGA was the most toxic metabolite in our model of GA-I. 3OHGA induced deleterious effects on glial cells and an increase in ammonia production, and resulted in accentuated cell death of non-apoptotic origin.
Abstract:
This paper shows that tourism specialisation can help to explain the observed high growth rates of small countries. For this purpose, two models of growth and trade are constructed to represent the trade relations between two countries. One country is large and rich, has its own source of sustained growth, and produces a tradable capital good. The other is a small, poor economy that lacks its own engine of growth and produces tradable tourism services. The poor country exports tourism services to, and imports capital goods from, the rich economy. In one model tourism is a luxury good, while in the other the expenditure elasticity of tourism imports is unitary. Two main results are obtained. In the long run, the tourism country overcomes decreasing returns and grows permanently because its terms of trade continuously improve. Since the tourism sector is relatively less productive than the capital good sector, tourism services become relatively scarcer and hence more expensive than the capital good. Moreover, along the transition the growth rate of the tourism economy remains well above that of the rich country for a long time. The growth rate differential between countries is particularly high when tourism is a luxury good, since tourism demand then rises faster. As a result, investment in the small economy is boosted and its terms of trade improve strongly.
Abstract:
Expectations about the future are central to the determination of current macroeconomic outcomes and the formulation of monetary policy. Recent literature has explored ways of supplementing the benchmark of rational expectations with explicit models of expectations formation that rely on econometric learning. Some apparently natural policy rules turn out to imply expectational instability of private agents' learning. We use the standard New Keynesian model to illustrate this problem and survey the key results on interest-rate rules that deliver both uniqueness and stability of equilibrium under econometric learning. We then consider some practical concerns, such as measurement errors in private expectations, observability of variables, and learning of the structural parameters required for policy. We also discuss some recent applications, including policy design under perpetual learning, estimated models with learning, recurrent hyperinflations, and macroeconomic policy to combat liquidity traps and deflation.
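The usual benchmark behind "econometric learning" is recursive least squares: each period, agents update their estimated forecasting rule with a decreasing gain. A minimal scalar sketch of that recursion on synthetic, noiseless data; it illustrates only the updating scheme, not the New Keynesian model itself.

```python
# Recursive least squares with decreasing gain 1/t (scalar regressor).
def rls(xs, ys, phi0=0.0, r0=1.0):
    phi, r = phi0, r0
    for t, (x, y) in enumerate(zip(xs, ys), start=1):
        gain = 1.0 / t
        r = r + gain * (x * x - r)                  # second-moment estimate
        phi = phi + gain * (x / r) * (y - phi * x)  # belief (coefficient) update
    return phi

# Synthetic data with true coefficient 0.7 and no noise -- illustrative only.
xs = [1.0, 2.0, 1.5, 0.5, 2.5, 1.0, 3.0, 2.0] * 50
ys = [0.7 * x for x in xs]
print(rls(xs, ys))  # converges to the true coefficient 0.7
```

Expectational stability then asks whether beliefs updated this way converge to the rational-expectations equilibrium; under "bad" policy rules, the same recursion can drive beliefs away from it.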
Abstract:
Untreated wastewater discharged directly into rivers is a very harmful environmental hazard that needs to be tackled urgently in many countries. In order to safeguard the river ecosystem and reduce water pollution, it is important to have an effluent charge policy that promotes investment in wastewater treatment technology by domestic firms. This paper considers the strategic interaction between the government and domestic firms regarding investment in wastewater treatment technology and the design of the optimal effluent charge policy to be implemented. In this model, the higher the proportion of non-investing firms, the higher the probability of having to incur an effluent charge and the higher that charge. On the one hand, the government needs to impose a sufficiently strict policy to ensure that firms have a strong incentive to invest. On the other hand, the policy cannot be so strict that it drives out firms which cannot afford to invest in such expensive technology. The paper analyses the factors that affect the probability of investment in this technology. It also explains the difficulty of imposing a strict environmental policy in countries that have too many small firms which cannot afford to invest unless subsidised.
Abstract:
Empirical researchers interested in how governance shapes various aspects of economic development frequently use the Worldwide Governance Indicators (WGI). These variables come in the form of an estimate along with a standard error reflecting the uncertainty of that estimate. Existing empirical work simply uses the estimates as an explanatory variable and discards the information provided by the standard errors. In this paper, we argue that the appropriate practice is to take into account the uncertainty around the WGI estimates through the use of multiple imputation. We investigate the importance of our proposed approach by revisiting, in three applications, the results of recently published studies. These applications cover the impact of governance on (i) capital flows; (ii) international trade; and (iii) income levels around the world. We generally find that the estimated effects of governance are highly sensitive to the use of multiple imputation. We also show that model misspecification is a concern for the results of our reference studies. We conclude that the effects of governance are hard to establish once we take into account the uncertainty around both the WGI estimates and the correct model specification.
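One way to implement the multiple-imputation idea is to draw several plausible values of the governance score from a normal distribution centred on the WGI estimate with the reported standard error, re-run the regression for each draw, and pool the results. The sketch below assumes that normal measurement model; the data, the one-regressor OLS, and the sample size are all synthetic and illustrative, not taken from the paper's applications.

```python
import random
import statistics

random.seed(0)  # deterministic draws for the illustration

def ols_slope(xs, ys):
    """Slope of a simple one-regressor OLS fit."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# WGI-style data per country: point estimate and standard error (synthetic).
gov_est = [0.2, 0.8, -0.5, 1.1, 0.0, -0.9]
gov_se  = [0.3, 0.2,  0.4, 0.2, 0.3,  0.4]
income  = [1.0, 2.1, -0.4, 2.6, 0.5, -1.2]   # outcome, synthetic

M = 50                 # number of imputations
slopes = []
for _ in range(M):
    draw = [random.gauss(e, s) for e, s in zip(gov_est, gov_se)]
    slopes.append(ols_slope(draw, income))

pooled  = statistics.mean(slopes)      # Rubin's rules: pooled point estimate
between = statistics.variance(slopes)  # between-imputation variance B
# (The full Rubin total variance is W + (1 + 1/M) * B, with W the mean
# within-fit sampling variance, omitted here for brevity.)
print(pooled, between)
```

The between-imputation variance is exactly the component that vanishes when the point estimates are treated as error-free, which is why ignoring it understates the uncertainty of the governance effect.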