1000 results for RECURSIVE APPROACH
Abstract:
The stock market is subject to uncertain relations throughout the trading process, with different variables exerting direct and indirect influence on stock prices. This study analyses certain aspects that may influence the values offered by the capital market, based on the Brazil Index of the Sao Paulo Stock Exchange (Bovespa), which selects 100 stocks among the most traded on Bovespa in terms of number of trades and financial volume. The selected variables are characterized by the companies' area of activity and the business volume in the month of data collection, i.e., April 2007. This article proposes an analysis that combines the accounting view of the variables that can influence stock prices with multivariate qualitative data analysis. Data were explored through Correspondence Analysis (Anacor) and Homogeneity Analysis (Homals). According to the research, the selected variables are associated with the values presented by the stocks, which makes them an internal control instrument and a decision-making tool when choosing investments.
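For readers unfamiliar with the technique, correspondence analysis can be carried out as a singular value decomposition of the standardized residuals of a contingency table. The sketch below uses a small made-up table (rows standing in for activity sectors, columns for trading-volume bands), not the paper's Bovespa data:

```python
import numpy as np

# Toy contingency table: rows = hypothetical activity sectors,
# columns = trading-volume bands (illustrative counts only).
N = np.array([[20.0,  5.0,  2.0],
              [ 6.0, 18.0,  4.0],
              [ 3.0,  7.0, 15.0]])

P = N / N.sum()                 # correspondence matrix
r = P.sum(axis=1)               # row masses
c = P.sum(axis=0)               # column masses
# Standardized residuals from the independence model
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S)
# Principal row coordinates on the first two dimensions
row_coords = (U[:, :2] * sv[:2]) / np.sqrt(r)[:, None]
inertia = (sv ** 2).sum()       # total inertia = chi-square / n
```

Rows that plot close together in `row_coords` have similar column profiles; total inertia equals the table's chi-square statistic divided by the grand total.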
Abstract:
This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior in the marketing area at some higher education institutions from 1997 to 2006. This paper focuses on regression and conjoint analysis, two techniques with great potential for use in marketing studies. The objective of this study was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate the level of success in meeting their premises. Overall, the results suggest the need for more involvement of researchers in verifying all the theoretical precepts for applying the techniques classified in the category of investigation of dependence among variables.
Abstract:
This paper offers some preliminary steps in the marriage of some of the theoretical foundations of new economic geography with spatial computable general equilibrium models. Modelling the spatial economy of Colombia using the traditional assumptions of computable general equilibrium (CGE) models makes little sense when one territorial unit, Bogota, accounts for over one quarter of GDP and where transportation costs are high and accessibility low compared to European or North American standards. Hence, handling market imperfections becomes imperative, as does the need to address internal spatial issues from the perspective of Colombia's increasing involvement with external markets. The paper builds on the Centro de Estudios de Economia Regional (CEER) model, a spatial CGE model of the Colombian economy; non-constant returns and non-iceberg transportation costs are introduced and some simulation exercises are carried out. The results confirm the asymmetric impacts that trade liberalization has on a spatial economy in which one region, Bogota, is able to more fully exploit scale economies vis-à-vis the rest of Colombia. The analysis also reveals the importance of different hypotheses on factor mobility and the role of price effects to better understand the consequences of trade opening in a developing economy.
Abstract:
This article intends to rationally reconstruct Locke's theory of knowledge as incorporated in a research program concerning the nature and structure of the theories and models of rationality. In previous articles we argued that the rationalist program can be subdivided into the classical rationalistic subprogram, which includes the knowledge theories of Descartes, Locke, Hume and Kant, the neoclassical subprogram, which includes the approaches of Duhem, Poincaré and Mach, and the critical subprogram of Popper. The subdivision results from the different views of rationality proposed by each one of these subprograms, as well as from the tools made available by each one of them, containing theoretical instruments used to arrange, organize and develop the discussion on rationality, the main one of which is the structure of solution of problems. In this essay we intend to reconstruct the assumptions of Locke's theory of knowledge, which in our view belongs to the classical rationalistic subprogram because it shares with it the thesis of the identity of (scientific) knowledge and certain knowledge.
Abstract:
This paper addresses the investment decisions of 373 large Brazilian firms from 1997 to 2004 in the presence of financial constraints, using panel data. A Bayesian econometric model with ridge regression was used to handle multicollinearity problems among the variables in the model. Prior distributions are assumed for the parameters, classifying the model into random or fixed effects. We used a Bayesian approach to estimate the parameters, considering normal and Student's t distributions for the error, and assumed that the initial values of the lagged dependent variable are not fixed but generated by a random process. The recursive predictive density criterion was used for model comparison. Twenty models were tested, and the results indicated that multicollinearity does influence the values of the estimated parameters. Controlling for capital intensity, financial constraints are found to be more important for capital-intensive firms, probably due to their lower profitability indexes, higher fixed costs and higher degree of property diversification.
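Ridge regression, the remedy for multicollinearity mentioned above, shrinks coefficient estimates by adding a penalty term to the normal equations. A minimal numpy sketch with simulated near-collinear data (not the paper's Bayesian panel model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)   # nearly collinear regressor
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + rng.normal(size=n)

def ridge(X, y, lam):
    """Solve (X'X + lam*I) beta = X'y; lam = 0 recovers OLS."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

ols = ridge(X, y, 0.0)       # unstable under multicollinearity
shrunk = ridge(X, y, 10.0)   # the penalty stabilizes the estimates
```

Under near-collinearity the OLS coefficients are individually erratic even though their sum is well identified; the ridge penalty trades a little bias for a large variance reduction.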
Abstract:
Valuation of projects for the preservation of water resources provides important information to policy makers and funding institutions. Standard contingent valuation models rely on distributional assumptions to provide welfare measures. Deviations between the assumed and the actual distribution of benefits are important when designing policies in developing countries, where inequality is a concern. This article applies semiparametric methods to obtain estimates of the benefit from a project for the preservation of an important Brazilian river basin. These estimates differ significantly from those obtained using the standard parametric approach.
Abstract:
This paper develops a multi-regional general equilibrium model for climate policy analysis based on the latest version of the MIT Emissions Prediction and Policy Analysis (EPPA) model. We develop two versions so that we can solve the model either as a fully inter-temporal optimization problem (forward-looking, perfect foresight) or recursively. The standard EPPA model on which these models are based is solved recursively, and it is necessary to simplify some aspects of it to make inter-temporal solution possible. The forward-looking capability allows one to better address economic and policy issues such as borrowing and banking of GHG allowances, efficiency implications of environmental tax recycling, endogenous depletion of fossil resources, international capital flows, and optimal emissions abatement paths, among others. To evaluate the solution approaches, we benchmark each version to the same macroeconomic path, and then compare the behavior of the two versions under a climate policy that restricts greenhouse gas emissions. We find that the energy sector and CO2 price behavior are similar in both versions (in the recursive version of the model we impose the inter-temporal theoretical efficiency result that abatement through time should be allocated such that the CO2 price rises at the interest rate). The main difference that arises is that the macroeconomic costs are substantially lower in the forward-looking version of the model, since it allows consumption shifting as an additional avenue of adjustment to the policy. On the other hand, the simplifications required for solving the model as an optimization problem, such as dropping the full vintaging of the capital stock and fewer explicit technological options, likely have effects on the results.
Moreover, inter-temporal optimization with perfect foresight poorly represents the real economy where agents face high levels of uncertainty that likely lead to higher costs than if they knew the future with certainty. We conclude that while the forward-looking model has value for some problems, the recursive model produces similar behavior in the energy sector and provides greater flexibility in the details of the system that can be represented.
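The efficiency result imposed on the recursive version, that abatement be allocated over time so the CO2 price rises at the interest rate, can be illustrated with a toy calculation. The linear marginal abatement cost curve below is an assumption for illustration only, not the EPPA model's cost structure:

```python
# Allocate a cumulative abatement target across periods so that the
# carbon price grows at the interest rate (Hotelling-style rule).
r = 0.05          # interest rate
c = 2.0           # slope of the assumed linear marginal cost MC(a) = c*a
T = 10            # number of periods
target = 100.0    # cumulative abatement required over all periods

growth = [(1 + r) ** t for t in range(T)]
p0 = target * c / sum(growth)          # initial price clearing the budget
prices = [p0 * g for g in growth]      # price path rising at rate r
abatement = [p / c for p in prices]    # each period equates MC to price
```

Because marginal cost equals the price in every period, no reallocation of abatement across periods can lower the discounted total cost; this is the condition the recursive model enforces by construction.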
Abstract:
Microbial xylanolytic enzymes have a promising biotechnological potential, and are extensively applied in industries. In this study, induction of xylanolytic activity was examined in Aspergillus phoenicis. Xylanase activity induced by xylan, xylose or beta-methylxyloside was predominantly extracellular (93-97%). Addition of 1% glucose to media supplemented with xylan or xylose repressed xylanase production. Glucose repression was alleviated by addition of cAMP or dibutyryl-cAMP. These physiological observations were supported by a Northern analysis using part of the xylanase gene ApXLN as a probe. Gene transcription was shown to be induced by xylan, xylose, and beta-methylxyloside, and was repressed by the addition of 1% glucose. Glucose repression was partially relieved by addition of cAMP or dibutyryl cAMP.
Abstract:
A two-component survival mixture model is proposed to analyse a set of ischaemic stroke-specific mortality data. The survival experience of stroke patients after index stroke may be described by a subpopulation of patients in the acute condition and another subpopulation of patients in the chronic phase. To adjust for the inherent correlation of observations due to random hospital effects, a mixture model of two survival functions with random effects is formulated. Assuming a Weibull hazard in both components, an EM algorithm is developed for the estimation of fixed effect parameters and variance components. A simulation study is conducted to assess the performance of the two-component survival mixture model estimators. Simulation results confirm the applicability of the proposed model in a small sample setting.
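The EM algorithm for a two-component survival mixture alternates a posterior-membership E-step with parameter updates. The sketch below simplifies to exponential components (the shape-1 special case of the Weibull, chosen because the M-step then stays in closed form) and uses simulated survival times, not the stroke data or the paper's random-effects formulation:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated times: a fast-failing "acute" group (mean 1) and a
# slow "chronic" group (mean 10), mixed half and half.
x = np.concatenate([rng.exponential(1.0, 300), rng.exponential(10.0, 300)])

pi, lam1, lam2 = 0.5, 2.0, 0.2   # initial guesses
for _ in range(200):
    # E-step: posterior probability each time came from component 1
    f1 = pi * lam1 * np.exp(-lam1 * x)
    f2 = (1 - pi) * lam2 * np.exp(-lam2 * x)
    w = f1 / (f1 + f2)
    # M-step: closed-form updates for the mixing weight and rates
    pi = w.mean()
    lam1 = w.sum() / (w * x).sum()
    lam2 = (1 - w).sum() / ((1 - w) * x).sum()
```

With a free Weibull shape parameter the M-step has no closed form and requires a numerical inner maximization, which is where the paper's EM machinery (and the random hospital effects) adds real complexity.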
Abstract:
We report a simple one-pot process for the preparation of lead sulfide (PbS) nanocrystals in the conjugated polymer poly(2-methoxy-5-(2'-ethylhexyloxy)-p-phenylene vinylene) (MEH-PPV), and we demonstrate electronic coupling between the two components.
Abstract:
This study is part of a larger project on the measurement of effective health consumers in the context of musculoskeletal illness. This complex issue involves the progressive nature of the disease, invisibility of the illness and attendant impairments, complexity of decision-making and negotiation, and urgent need to translate emergent evidence about treatment and management to patients and health professionals. We conducted in-depth interviews with patients, family members, general practitioners, specialist clinicians, and health consumer advocates (N = 84) about effective consumers in this context, using a process of convergent interviewing, with convergence conducted within and across groups and countries. The initial set of themes included information seeking and adaptation, decision-making, the roles of patients, GPs, and specialists and the communication between them, the importance of pain and the impact of depression, the impact of the social environment (including the invisibility of the disease and the need for a normal life), and coping strategies.
Abstract:
Teen Triple P is a multilevel system of intervention that is designed to provide parents with specific strategies to promote the positive development of their teenage children as they make the transition into high school and through puberty. The program is based on a combination of education about the developmental needs of adolescents, skills training to improve communication and problem-solving, plus specific modules to deal with common problems encountered by parents and adolescents that can escalate into major conflict and violence. It is designed to increase the engagement of parents of adolescent and pre-adolescent children by providing them with easy access to evidence-based parenting advice and support. This paper presents data collected as part of a survey of over 1400 students in their first year of high school at nine Brisbane schools. The survey instrument was constructed to obtain students' reports about behaviour that is known to be associated with their health and wellbeing, and also on the extent to which their parents promoted or discouraged such behaviour at home, at school, and in their social and recreational activities in the wider community. Selected data from the survey were extracted and presented to parents at a series of parenting seminars held at the schools to promote appropriate parenting of teenagers. The objectives were to provide parents with accurate data about teenagers' behaviour, and about teenagers' reports of how they perceived their parents' behaviour. Normative data on parent and teenager behaviour will be presented from the survey, as well as psychometric data relating to the reliability and validity of this new measure. Implications of this strategy for increasing parent engagement in parenting programs that aim to reduce behavioural and emotional problems in adolescents will be discussed.
Abstract:
We prove that, once an algorithm of perfect simulation for a stationary and ergodic random field F taking values in S^(Z^d), where S is a bounded subset of R^n, is provided, convergence in the mean ergodic theorem is exponentially fast for F. Applications from (non-equilibrium) statistical mechanics and interacting particle systems are presented.
Abstract:
The core structure of the natural sesquiterpene lactones furanoheliangolides, an 11-oxabicyclo[6.2.1]undecane system, was synthesized through a pathway involving two Diels-Alder reactions.
Abstract:
Sound application of molecular epidemiological principles requires working knowledge of both molecular biological and epidemiological methods. Molecular tools have become an increasingly important part of studying the epidemiology of infectious agents. Molecular tools have allowed the aetiological agent within a population to be diagnosed with a greater degree of efficiency and accuracy than conventional diagnostic tools. They have increased the understanding of the pathogenicity, virulence, and host-parasite relationships of the aetiological agent, provided information on the genetic structure and taxonomy of the parasite, and allowed the zoonotic potential of previously unidentified agents to be determined. This review describes the concept of epidemiology and proper study design, describes the array of currently available molecular biological tools, and provides examples of studies that have integrated both disciplines to successfully unravel zoonotic relationships that would otherwise be impossible to resolve using conventional diagnostic tools. The current limitations of applying these tools, including cautions that need to be observed during their application, are also discussed.