154 results for Kinetic Approach


Relevance: 20.00%
Publisher:
Abstract:

Stock prices are subject to uncertain relations throughout the entire negotiation process, with different variables exerting direct and indirect influence on them. This study analyzes some of the aspects that may influence the values offered by the capital market, based on the Brazil Index of the São Paulo Stock Exchange (Bovespa), which selects the 100 stocks most traded on Bovespa in terms of number of trades and financial volume. The selected variables characterize the companies' area of activity and the business volume in the month of data collection, i.e. April 2007. The article proposes an analysis that joins the accounting view of the variables that can influence stock prices with multivariate qualitative data analysis: the data were explored through Correspondence Analysis (Anacor) and Homogeneity Analysis (Homals). According to the research, the selected variables are associated with the values presented by the stocks, making them an internal control instrument and a decision-making tool when choosing investments.
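
Correspondence analysis of a contingency table reduces, at its core, to a singular value decomposition of standardized residuals, and the total inertia it decomposes equals the chi-square statistic divided by the sample size. A minimal NumPy sketch, using a made-up sector-by-price-range table (the study's actual data and its Anacor/Homals outputs are not reproduced here):

```python
import numpy as np

# Hypothetical contingency table: 4 activity sectors x 3 price ranges
# (illustrative counts only, not the study's data)
N = np.array([[20, 10,  5],
              [ 8, 15, 12],
              [ 5, 12, 18],
              [10,  8,  7]], dtype=float)

n = N.sum()
P = N / n                      # correspondence matrix
r = P.sum(axis=1)              # row masses
c = P.sum(axis=0)              # column masses

# Standardized residuals: S = Dr^{-1/2} (P - r c^T) Dc^{-1/2}
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))

U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of the rows (sectors)
row_coords = (U * sv) / np.sqrt(r)[:, None]

total_inertia = (sv ** 2).sum()    # equals chi-square statistic / n
print("total inertia:", round(total_inertia, 4))
print("row coordinates (first two axes):")
print(np.round(row_coords[:, :2], 3))
```

Categories that attract each other plot close together on these axes, which is the kind of association the study reads off the Anacor map.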

Relevance: 20.00%
Publisher:
Abstract:

This paper is part of a larger study assessing the adequacy of the use of multivariate statistical techniques in theses and dissertations on consumer behavior produced in the marketing area of selected higher education institutions from 1997 to 2006. The focus here is on regression and conjoint analysis, two techniques with great potential for use in marketing studies. The objective was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate how successfully their premises were met. Overall, the results suggest the need for greater involvement of researchers in verifying all the theoretical precepts governing application of the techniques classified in the category of investigation of dependence among variables.
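
The "premises" referred to are the classical assumptions behind regression-type techniques. A hedged sketch of three routine checks (residual normality, homoscedasticity, multicollinearity) on simulated data standing in for a consumer-behavior survey:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated survey data (illustrative only, not from the theses reviewed)
n = 200
X = rng.normal(size=(n, 2))
y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.7, size=n)

# OLS fit
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
resid = y - A @ beta
fitted = A @ beta

# Premise 1: normality of residuals (Shapiro-Wilk)
sw_stat, sw_p = stats.shapiro(resid)

# Premise 2: homoscedasticity -- |residuals| should not track fitted values
_, het_p = stats.spearmanr(np.abs(resid), fitted)

# Premise 3: multicollinearity -- variance inflation factor for predictor 0
r2 = np.corrcoef(X[:, 0], X[:, 1])[0, 1] ** 2
vif = 1.0 / (1.0 - r2)

print(f"Shapiro-Wilk p = {sw_p:.3f}  (p > 0.05 is consistent with normality)")
print(f"heteroscedasticity p = {het_p:.3f}")
print(f"VIF(x1) = {vif:.2f}  (values above ~10 signal multicollinearity)")
```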

Relevance: 20.00%
Publisher:
Abstract:

This paper offers some preliminary steps in the marriage of some of the theoretical foundations of new economic geography with spatial computable general equilibrium models. Modelling the spatial economy of Colombia using the traditional assumptions of computable general equilibrium (CGE) models makes little sense when one territorial unit, Bogotá, accounts for over one quarter of GDP and where transportation costs are high and accessibility low compared to European or North American standards. Hence, handling market imperfections becomes imperative, as does the need to address internal spatial issues from the perspective of Colombia's increasing involvement with external markets. The paper builds on the Centro de Estudios de Economía Regional (CEER) model, a spatial CGE model of the Colombian economy; non-constant returns and non-iceberg transportation costs are introduced and some simulation exercises carried out. The results confirm the asymmetric impacts that trade liberalization has on a spatial economy in which one region, Bogotá, is able to more fully exploit scale economies vis-à-vis the rest of Colombia. The analysis also reveals the importance of different hypotheses on factor mobility and the role of price effects to better understand the consequences of trade opening in a developing economy.
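
The distinction between iceberg and non-iceberg transportation costs can be made concrete with two toy delivered-price calculations (the numbers are illustrative assumptions, not CEER model parameters):

```python
# Iceberg: tau > 1 units must be shipped for 1 unit to arrive, so the
# delivered price is multiplicative in the factory price.
# Non-iceberg: transport is produced by a separate sector at a per-unit
# freight charge, so the delivered price is additive.
factory_price = 100.0
tau = 1.25          # iceberg factor (assumed)
freight = 25.0      # explicit per-unit freight (assumed)

iceberg_delivered = factory_price * tau
additive_delivered = factory_price + freight
print(iceberg_delivered, additive_delivered)   # identical at this price...

cheap_price = 40.0
print(cheap_price * tau, cheap_price + freight)  # ...but not in general
```

Under iceberg costs any factory-price change is amplified proportionally in the delivered price; an explicit transport sector breaks that proportionality, which matters for the price effects the paper studies.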

Relevance: 20.00%
Publisher:
Abstract:

This article intends to rationally reconstruct Locke's theory of knowledge as incorporated in a research program concerning the nature and structure of the theories and models of rationality. In previous articles we argued that the rationalist program can be subdivided into the classical rationalistic subprogram, which includes the knowledge theories of Descartes, Locke, Hume and Kant, the neoclassical subprogram, which includes the approaches of Duhem, Poincaré and Mach, and the critical subprogram of Popper. The subdivision results from the different views of rationality proposed by each one of these subprograms, as well as from the tools made available by each one of them, containing theoretical instruments used to arrange, organize and develop the discussion on rationality, the main one of which is the structure of solution of problems. In this essay we intend to reconstruct the assumptions of Locke's theory of knowledge, which in our view belongs to the classical rationalistic subprogram because it shares with it the thesis of the identity of (scientific) knowledge and certain knowledge.

Relevance: 20.00%
Publisher:
Abstract:

Valuation of projects for the preservation of water resources provides important information to policy makers and funding institutions. Standard contingent valuation models rely on distributional assumptions to provide welfare measures. Deviations from assumed and actual distribution of benefits are important when designing policies in developing countries, where inequality is a concern. This article applies semiparametric methods to obtain estimates of the benefit from a project for the preservation of an important Brazilian river basin. These estimates lead to significant differences from those obtained using the standard parametric approach.
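
A common distribution-free estimator in contingent valuation is the Turnbull lower bound, which requires no parametric assumption on the benefit distribution. A small sketch with hypothetical single-bounded survey shares (not the study's data, and not necessarily its exact semiparametric estimator):

```python
import numpy as np

# Hypothetical bids (R$) and share of "no" answers at each bid
# (illustrative numbers, not the river-basin survey results)
bids = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
p_no = np.array([0.20, 0.35, 0.30, 0.70, 0.90])   # empirical CDF estimates

# Pool adjacent violators so the estimated CDF is non-decreasing
# (simple carry-forward pooling, assuming equal sample sizes per bid)
F = p_no.copy()
for j in range(1, len(F)):
    if F[j] < F[j - 1]:
        F[j] = F[j - 1]

# Turnbull lower bound: mass between consecutive bids valued at the lower bid
t = np.concatenate([[0.0], bids])
Fext = np.concatenate([F, [1.0]])
mass = np.diff(np.concatenate([[0.0], Fext]))
wtp_lower = (t * mass).sum()
print(f"lower-bound mean WTP: R$ {wtp_lower:.2f}")
```

Because each probability mass is valued at the lowest bid consistent with it, the estimate is a conservative welfare measure regardless of the true benefit distribution.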

Relevance: 20.00%
Publisher:
Abstract:

Microbial xylanolytic enzymes have a promising biotechnological potential, and are extensively applied in industries. In this study, induction of xylanolytic activity was examined in Aspergillus phoenicis. Xylanase activity induced by xylan, xylose or beta-methylxyloside was predominantly extracellular (93-97%). Addition of 1% glucose to media supplemented with xylan or xylose repressed xylanase production. Glucose repression was alleviated by addition of cAMP or dibutyryl-cAMP. These physiological observations were supported by a Northern analysis using part of the xylanase gene ApXLN as a probe. Gene transcription was shown to be induced by xylan, xylose, and beta-methylxyloside, and was repressed by the addition of 1% glucose. Glucose repression was partially relieved by addition of cAMP or dibutyryl cAMP.

Relevance: 20.00%
Publisher:
Abstract:

This paper describes the use of the electrostatic layer-by-layer (LbL) technique for the preparation of bioanodes with potential application in ethanol/O₂ biofuel cells. More specifically, the LbL technique was employed for immobilization of dehydrogenase enzymes and polyamidoamine (PAMAM) dendrimers onto a carbon paper support. Both mono-enzymatic (anchoring only alcohol dehydrogenase, ADH) and bienzymatic (anchoring both ADH and aldehyde dehydrogenase, AldDH) systems were tested. The amount of ADH deposited onto the Toray® paper was 95 ng cm⁻² per bilayer. Kinetic studies revealed that the LbL technique enables better control of enzyme disposition on the bioanode than bioanodes prepared by passive adsorption. The power density values achieved for the mono-enzymatic system as a function of the enzyme load ranged from 0.02 to 0.063 mW cm⁻² for the bioanode containing 36 ADH bilayers. The bioanodes containing a gas diffusion layer (GDL) displayed enhanced performance, but their mechanical stability must be improved. The bienzymatic system generated a power density of 0.12 mW cm⁻². In conclusion, the LbL technique is a very attractive approach for enzyme immobilization onto a carbon platform, since it enables strict control of enzyme disposition on the bioanode surface with very low enzyme consumption. © 2010 Elsevier B.V. All rights reserved.

Relevance: 20.00%
Publisher:
Abstract:

We prove that, once an algorithm of perfect simulation for a stationary and ergodic random field F taking values in S^(Z^d), S a bounded subset of R^n, is provided, the speed of convergence in the mean ergodic theorem occurs exponentially fast for F. Applications from (non-equilibrium) statistical mechanics and interacting particle systems are presented.
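
The theorem takes a perfect-simulation algorithm as given. For a finite-state Markov chain, the classical example of such an algorithm is Propp-Wilson coupling from the past, sketched here on a toy 3-state chain (the paper's setting, random fields on S^(Z^d), is far more general):

```python
import numpy as np

# Toy ergodic transition matrix (assumed for illustration)
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
cum = P.cumsum(axis=1)

def step(state, u):
    """Advance one step using a shared uniform u (grand coupling)."""
    return int(np.searchsorted(cum[state], u))

def cftp(rng):
    """Coupling from the past: one exact draw from the stationary law."""
    T = 1
    us = []
    while True:
        # extend randomness further into the past, reusing later seeds
        us = list(rng.uniform(size=T - len(us))) + us
        x = list(range(3))
        for u in us:                      # run from time -T up to 0
            x = [step(s, u) for s in x]
        if len(set(x)) == 1:              # all trajectories coalesced
            return x[0]
        T *= 2

rng = np.random.default_rng(1)
draws = [cftp(rng) for _ in range(2000)]
freq = np.bincount(draws, minlength=3) / len(draws)
print("empirical :", np.round(freq, 3))

# stationary distribution, for comparison
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print("stationary:", np.round(pi, 3))
```

Because coalescence is detected before reporting, each returned state is an exact (not approximate) stationary sample, which is the primitive the convergence-speed result builds on.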

Relevance: 20.00%
Publisher:
Abstract:

We consider a kinetic Ising model which represents a generic agent-based model for various types of socio-economic systems. We study the case of a finite (and not necessarily large) number of agents N as well as the asymptotic case when the number of agents tends to infinity. The main ingredients are individual decision thresholds which are either fixed over time (corresponding to quenched disorder in the Ising model, leading to nonlinear deterministic dynamics which are generically non-ergodic) or which may change randomly over time (corresponding to annealed disorder, leading to ergodic dynamics). We address the question of how increasing the strength of annealed disorder relative to quenched disorder drives the system from non-ergodic behavior to ergodicity. Mathematically rigorous analysis provides an explicit and detailed picture for arbitrary realizations of the quenched initial thresholds, revealing an intriguing "jumpy" transition from non-ergodicity with many absorbing sets to ergodicity. For large N we find a critical strength of annealed randomness, above which the system becomes asymptotically ergodic. Our theoretical results suggest how to drive a system from an undesired socio-economic equilibrium (e.g. a high level of corruption) to a desirable one (a low level of corruption).
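
A loose, hypothetical stand-in for threshold dynamics of this kind (not the paper's exact model or notation): agents adopt +1 when the average opinion exceeds their individual threshold, and thresholds are either frozen (quenched) or redrawn every step (annealed). Counting distinct configurations visited makes the ergodic/non-ergodic contrast visible:

```python
import numpy as np

rng = np.random.default_rng(42)
N, steps = 50, 2000

def simulate(annealed_frac):
    """Count distinct configurations visited by a toy threshold dynamics.
    With probability `annealed_frac` each threshold is redrawn per step
    (annealed disorder); otherwise it stays fixed (quenched disorder)."""
    theta = rng.uniform(-1.5, 1.5, size=N)   # thresholds (assumed range)
    s = rng.choice([-1, 1], size=N)          # initial opinions
    visited = set()
    for _ in range(steps):
        m = s.mean()
        redraw = rng.random(N) < annealed_frac
        theta = np.where(redraw, rng.uniform(-1.5, 1.5, size=N), theta)
        s = np.where(m > theta, 1, -1)       # synchronous threshold update
        visited.add(s.tobytes())
    return len(visited)

quenched_states = simulate(0.0)   # frozen thresholds: dynamics settle fast
annealed_states = simulate(1.0)   # thresholds redrawn: dynamics keep mixing
print("distinct states, quenched:", quenched_states)
print("distinct states, annealed:", annealed_states)
```

With frozen thresholds the update is a deterministic map of the mean opinion, so the trajectory is trapped in a small set of configurations; full annealed redrawing keeps the chain moving through state space, mirroring the quenched/annealed contrast the paper analyzes rigorously.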

Relevance: 20.00%
Publisher:
Abstract:

The core structure of the natural sesquiterpene lactones furanoheliangolides, an 11-oxabicyclo[6.2.1]undecane system, was synthesized through a pathway involving two Diels-Alder reactions. © 2007 Elsevier Ltd. All rights reserved.

Relevance: 20.00%
Publisher:
Abstract:

Electrical impedance tomography is a technique to estimate the impedance distribution within a domain, based on measurements on its boundary. In other words, given the mathematical model of the domain, its geometry and boundary conditions, a nonlinear inverse problem of estimating the electric impedance distribution can be solved. Several impedance estimation algorithms have been proposed to solve this problem. In this paper, we present a three-dimensional algorithm, based on the topology optimization method, as an alternative. A sequence of linear programming problems, allowing for constraints, is solved utilizing this method. In each iteration, the finite element method provides the electric potential field within the model of the domain. An electrode model is also proposed (thus increasing the accuracy of the finite element results). The algorithm is tested using numerically simulated data and also experimental data, and absolute resistivity values are obtained. These results, corresponding to phantoms with two different conductive materials, exhibit relatively well-defined boundaries between them, and show that this is a practical and potentially useful technique to be applied to monitor lung aeration, including the possibility of imaging a pneumothorax.
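
One linearized iteration of the kind described, a linear program allowing for constraints, can be sketched as an L1 data fit with "move limits" on the resistivity update, solved by scipy.optimize.linprog. The sensitivity matrix and residual below are random stand-ins for quantities the real method obtains from the finite element solve:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
m_meas, n_elem = 8, 5
J = rng.normal(size=(m_meas, n_elem))   # stand-in sensitivity matrix
r = rng.normal(size=m_meas)             # stand-in measurement residual

# variables: x = [d_rho (n_elem), t (m_meas)]; minimize sum(t)
c = np.concatenate([np.zeros(n_elem), np.ones(m_meas)])

# |J d_rho - r| <= t  ->  J d_rho - t <= r  and  -J d_rho - t <= -r
A_ub = np.block([[ J, -np.eye(m_meas)],
                 [-J, -np.eye(m_meas)]])
b_ub = np.concatenate([r, -r])

# move limits keep each resistivity update small (a common stabilizer)
bounds = [(-0.1, 0.1)] * n_elem + [(0, None)] * m_meas

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
d_rho = res.x[:n_elem]
print("LP success:", res.success)
print("resistivity update:", np.round(d_rho, 4))
```

In the actual algorithm this step would be repeated, with the FEM re-solved for the updated resistivity between iterations.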

Relevance: 20.00%
Publisher:
Abstract:

Background: Treatment of excessive gingival display usually involves procedures such as Le Fort impaction or maxillary gingivectomy. The authors propose an alternative technique that reduces the function of the upper lip elevator muscles and repositions the upper lip. Methods: Fourteen female patients with excessive gingival exposure were operated on between February of 2008 and March of 2009. They were filmed before and at least 6 months after the procedure while performing their fullest smile, and the maximum gingival exposures were measured and analyzed using ImageJ software. Patients were operated on under local anesthesia. The gingival mucosa was freed from the maxilla using a periosteum elevator. Skin and subcutaneous tissue were dissected bluntly from the underlying musculature of the upper lip. A frenuloplasty was performed to lengthen the upper lip, and both levator labii superioris muscles were dissected and divided. Results: The postoperative course was uneventful in all of the patients. The mean gingival exposure before surgery was 5.22 ± 1.48 mm; 6 months after surgery, it was 1.91 ± 1.50 mm. The mean reduction was 3.31 ± 1.05 mm (p < 0.001), ranging from 1.59 to 4.83 mm. Conclusion: This study shows that the proposed technique was efficient in reducing the amount of gum exposed during smiling in all patients in this series. (Plast. Reconstr. Surg. 126: 1014, 2010.)

Relevance: 20.00%
Publisher:
Abstract:

Background: Stroke mortality rates in Brazil are the highest in the Americas, and deaths from cerebrovascular disease surpass those from coronary heart disease. Aim: To verify stroke mortality rates and morbidity in an area of São Paulo, Brazil, using the World Health Organization Stepwise Approach to Stroke Surveillance. Methods: We used the World Health Organization Stepwise Approach to Stroke Surveillance structure of stroke surveillance. The hospital-based data comprised fatal and nonfatal stroke (Step 1). We gathered stroke-related mortality data in the community using World Health Organization questionnaires (Step 2). The questionnaire determining stroke prevalence was administered door to door in a family-health-programme neighbourhood (Step 3). Results: A total of 682 patients aged 18 years and above, including 472 incident cases, presented with cerebrovascular disease and were enrolled in Step 1 during April-May 2009. Cerebral infarction (84.3%) and first-ever stroke (85.2%) were the most frequent. In Step 2, 256 deaths from stroke were identified during 2006-2007; 44% of deaths were classified as unspecified stroke, one-third as ischaemic stroke, and one-quarter as haemorrhagic subtype. In Step 3, 577 subjects over 35 years old were evaluated at home, and 244 stroke survivors were identified via a questionnaire validated by a board-certified neurologist. The population demographic characteristics were similar in the three steps, except in terms of age and gender. Conclusion: By including data from all settings, World Health Organization stroke surveillance can provide data to help plan future resources that meet the needs of the public-health system.

Relevance: 20.00%
Publisher:
Abstract:

Aim. The aim of this study was to understand the heart transplantation experience based on patients' descriptions. Background. To patients with heart failure, heart transplantation represents a possibility to survive and to improve their quality of life. Studies have shown that better quality of life is related to patients' increasing awareness of, and participation in, the work of the healthcare team in the post-transplantation period, while deficient relationships between patients and healthcare providers result in lower compliance with the postoperative regimen. Method. A phenomenological approach was used to interview 26 heart transplant recipients. Patients were interviewed individually and asked a single question: what does the experience of being heart transplanted mean? Participants' descriptions were analysed using phenomenological reduction, analysis and interpretation. Results. Three categories emerged from data analysis: (i) the time lived by the heart recipient; (ii) donors, family and caregivers; and (iii) reflections on the experience lived. Living after heart transplant means living in a complex situation: recipients are confronted with lifelong immunosuppressive therapy and its many side-effects. Some felt healthy, whereas others reported persistent complications as well as the onset of other pathologies. However, all participants celebrated an improvement in quality of life. Health caregivers and their social and family support had been essential to their struggle. Participants realised that life after heart transplantation was a continuing process demanding support and structured follow-up for the rest of their lives. Conclusion. The findings suggest that each individual has unique experiences of the heart transplantation process. To go on living, participants had to accept changes and adapt: to the new organ, to complications resulting from rejection, and to numerous medications and dietary restrictions.
Relevance to clinical practice. Encouraging heart transplant patients' spontaneous expression of what they are experiencing, and granting them the status of main character in their own story, is important to their care.

Relevance: 20.00%
Publisher:
Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the aim of interest of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method was to employ multiple eigen-time series in each ROI to avoid temporal information loss during identification of Granger causality. Such information loss is inherent in averaging (e.g., to yield a single "representative" time series per ROI). This, in turn, may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping.
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. © 2010 Elsevier Inc. All rights reserved.
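
The core contrast the paper draws, a representative time series per ROI fed into a Granger-style comparison, can be illustrated with a toy lag-1 example on PCA time series (this sketch omits the partial canonical correlation machinery and the bootstrap test of the actual CGA method):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 400

# Synthetic "voxel" time series: ROI A drives ROI B at lag 1 (toy data)
latent = rng.normal(size=T)
roiA = np.outer(latent, [1.0, 0.8, 0.6]) + 0.3 * rng.normal(size=(T, 3))
driven = np.concatenate([[0.0], 0.9 * latent[:-1]])
roiB = np.outer(driven, [0.7, 1.0]) + 0.3 * rng.normal(size=(T, 2))

def pcs(X, k=1):
    """First k principal-component time series of an ROI."""
    Xc = X - X.mean(axis=0)
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k]

def rss(y, X):
    """Residual sum of squares of an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return float(e @ e)

a, b = pcs(roiA), pcs(roiB)

# Granger-style comparison at lag 1: does A's past help predict B?
y = b[1:, 0]
own = np.column_stack([np.ones(T - 1), b[:-1, 0]])
full = np.column_stack([own, a[:-1, 0]])
improvement = 1.0 - rss(y, full) / rss(y, own)

# Reverse direction, for contrast
y2 = a[1:, 0]
own2 = np.column_stack([np.ones(T - 1), a[:-1, 0]])
full2 = np.column_stack([own2, b[:-1, 0]])
rev = 1.0 - rss(y2, full2) / rss(y2, own2)

print(f"A->B variance reduction: {improvement:.3f}")
print(f"B->A variance reduction: {rev:.3f}")
```

The CGA method generalizes this idea by keeping several eigen-time series per ROI and coupling them through partial canonical correlation, precisely to avoid the information loss of collapsing each ROI to one series as done here.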