848 results for Intraday volatility


Relevance: 10.00%

Abstract:

This study provides a versatile, validated method to determine the total vitamin C content, as the sum of the contents of L-ascorbic acid (L-AA) and dehydroascorbic acid (DHAA), in several fruits and vegetables, and its degradability with storage time. Seven horticultural crops from two different origins were analyzed using an ultrahigh-performance liquid chromatography–photodiode array (UHPLC-PDA) system equipped with a new trifunctional high-strength silica (100% silica particle) analytical column (100 mm × 2.1 mm, 1.7 μm particle size), using 0.1% (v/v) formic acid as the mobile phase in isocratic mode. This new stationary phase, specially designed for polar compounds, overcomes the problems normally encountered in HPLC and is suitable for the analysis of large batches of samples without L-AA degradation. In addition, it proves to be an excellent alternative to conventional C18 columns for the determination of L-AA in fruits and vegetables. The method was fully validated in terms of linearity, detection (LOD) and quantification (LOQ) limits, accuracy, and inter/intraday precision. Validation experiments revealed very good recovery rates of 96.6 ± 4.4% for L-AA and 103.1 ± 4.8% for total vitamin C, good linearity with r²-values >0.999 within the established concentration range, and excellent repeatability (0.5%) and reproducibility (1.6%). The LOD of the method was 22 ng/mL, whereas the LOQ was 67 ng/mL. It was possible to demonstrate that L-AA and DHAA concentrations in the different horticultural products varied in opposite directions with storage time, not always affecting the total amount of vitamin C during shelf-life. Locally produced fruits had higher concentrations of vitamin C than imported ones, but vegetables showed the opposite trend.
Moreover, this UHPLC-PDA methodology proves to be an improved, simple, and fast approach for determining the total content of vitamin C in various food commodities, with high sensitivity, selectivity, and resolving power within a 3 min run.
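The LOD and LOQ figures quoted above are conventionally estimated from the calibration curve; a minimal sketch of the common 3.3σ/S and 10σ/S convention (the calibration points below are hypothetical, not the study's data):

```python
# Estimate LOD/LOQ from a linear calibration curve using the common
# convention LOD = 3.3*sd/slope and LOQ = 10*sd/slope, where sd is the
# standard deviation of the regression residuals.

def linear_fit(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def lod_loq(x, y):
    a, b = linear_fit(x, y)
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    sd = (sum(r * r for r in resid) / (len(x) - 2)) ** 0.5
    return 3.3 * sd / b, 10 * sd / b

# Hypothetical calibration: concentration (ng/mL) vs. peak area
conc = [50, 100, 200, 400, 800]
area = [12.1, 24.5, 49.0, 97.8, 196.2]
lod, loq = lod_loq(conc, area)
```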

Relevance: 10.00%

Abstract:

The financial crisis that occurred between 2007 and 2008, known as the subprime crisis, drew attention to the governance of companies in Brazil and worldwide. To monitor financial risk, quantitative risk-management tools were created in the 1990s, after several financial disasters. The market turmoil also led companies to invest in the development and use of information, applied as tools to support process control and decision making. Numerous empirical studies on the informational efficiency of the market have been carried out inside and outside Brazil, investigating whether prices instantly reflect the available information. The creation of different levels of corporate governance on BOVESPA in 2000 required firms to commit more strongly to their shareholders, with greater transparency in their disclosures. The purpose of this study is to analyze how the subprime financial crisis affected, between January 2007 and December 2009, the volatility of stock returns on the BM&FBOVESPA for the most liquid companies at different levels of corporate governance. Based on time series analysis and event studies, econometric tests were performed in EViews, and the results showed that the adoption of good corporate governance practices affects the volatility of companies' returns.
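As an illustration of the kind of volatility measure such event studies rely on, here is a minimal sketch computing the rolling standard deviation of log-returns (the prices are hypothetical; the study itself used EViews):

```python
import math

def log_returns(prices):
    """Continuously compounded returns: r_t = ln(P_t / P_{t-1})."""
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

def rolling_vol(returns, window):
    """Rolling sample standard deviation of returns."""
    out = []
    for i in range(window, len(returns) + 1):
        w = returns[i - window:i]
        m = sum(w) / window
        var = sum((r - m) ** 2 for r in w) / (window - 1)
        out.append(var ** 0.5)
    return out

# Hypothetical daily closing prices around an event window
prices = [10.0, 10.2, 9.9, 10.5, 9.6, 10.8, 9.4, 11.0, 9.2, 11.3]
vols = rolling_vol(log_returns(prices), window=5)
```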

Relevance: 10.00%

Abstract:

The present paper investigates the dynamics of the volatility structure of shrimp prices in the Brazilian fish market. First, the initial aspects of the shrimp price series are described. From this information, statistical tests were performed and univariate models were selected as price predictors. It was then verified whether a long-term equilibrium relationship exists between Brazilian and American imported shrimp and, if so, whether there is a causal link between these assets, given that the two countries have maintained trade relations over the years. This is an exploratory study of an applied nature with a quantitative approach. The data were collected through direct contact with the Companhia de Entrepostos e Armazéns Gerais de São Paulo (CEAGESP) and from the official website of American imports, the National Marine Fisheries Service - National Oceanic and Atmospheric Administration (NMFS-NOAA). The results showed that the large variability in the asset's price is directly related to the gains and losses of market agents. The price series presents strong seasonal and biannual effects. The average price of shrimp over the last 12 years was R$ 11.58, and external factors beyond production and marketing (U.S. antidumping measures, floods, and pathologies) strongly affected prices. Among the models tested for predicting shrimp prices, four were selected, which proved statistically more robust under one-step-ahead forecasting over a 12-period horizon. Only weak evidence of long-term equilibrium between Brazilian and American shrimp was found and, equivalently, no causal link was found between them. We conclude that the price dynamics of the shrimp commodity are strongly influenced by external productive factors and that these phenomena cause seasonal effects in prices.
There is no long-term stability relationship between Brazilian and American shrimp prices, although Brazil imports production inputs from the USA, which indicates some productive dependence. For market agents, the risk of interference from external prices cointegrated with Brazilian ones is practically nonexistent. Through statistical modeling it is possible to minimize the risk and uncertainty embedded in the fish market, so that sales and marketing strategies for Brazilian shrimp can be consolidated and disseminated.
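The univariate price predictors and one-step-ahead forecasts mentioned above can be illustrated with a simple AR(1) model fitted by least squares; the series and the model below are a hypothetical sketch, not the models actually selected in the study:

```python
def fit_ar1(series):
    """Least-squares AR(1): x_t = c + phi * x_{t-1}."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
          sum((a - mx) ** 2 for a in x)
    return my - phi * mx, phi

def forecast(series, steps):
    """Iterated one-step-ahead AR(1) forecasts over `steps` periods."""
    c, phi = fit_ar1(series)
    preds, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return preds

# Hypothetical monthly shrimp prices (R$/kg)
prices = [10.8, 11.2, 11.6, 11.3, 11.9, 12.1, 11.7, 12.3, 12.0, 12.5]
preds = forecast(prices, steps=12)  # horizon-12 forecast
```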

Relevance: 10.00%

Abstract:

Internet applications such as media streaming, collaborative computing, and massively multiplayer games are on the rise. This drives the need for multicast communication, but group communication support based on IP multicast has not been widely adopted, due to a combination of technical and non-technical problems. Therefore, a number of application-layer multicast schemes have been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services, being called peer-to-peer applications, and their participants come and go very dynamically. Thus, server-centric architectures for membership management have well-known problems related to scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes members' volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. Given this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account when assembling the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and it was implemented using a distributed, shared, bi-directional tree. The algorithm also has a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways. First, through a simulator developed in this work, in which the quality of the distribution tree was evaluated by metrics such as outdegree and path length. Second, real-life scenarios were built in the ns-3 network simulator, in which the protocol's network performance was evaluated by metrics such as stress, stretch, time to first packet, and group reconfiguration time.
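Stretch, one of the metrics named above, compares the delay along the overlay tree against the direct unicast delay; a minimal sketch of the computation on a hypothetical four-member overlay (the topology and delays are invented, not LAALM's evaluation data):

```python
# Stretch of an overlay multicast tree: ratio of the overlay path delay
# (source -> member along the tree) to the direct unicast delay.

# Overlay tree as child -> parent; 'S' is the source/root.
parent = {'A': 'S', 'B': 'S', 'C': 'A', 'D': 'A'}

# Hypothetical underlying-network delays (ms) for overlay hops and for
# the direct source -> member paths.
link_delay = {('S', 'A'): 10, ('S', 'B'): 15, ('A', 'C'): 5, ('A', 'D'): 20}
direct_delay = {'A': 10, 'B': 15, 'C': 12, 'D': 25}

def overlay_delay(node):
    """Sum of hop delays from the source to `node` along the tree."""
    d = 0
    while node != 'S':
        p = parent[node]
        d += link_delay[(p, node)]
        node = p
    return d

stretch = {n: overlay_delay(n) / direct_delay[n] for n in parent}
```

A stretch of 1.0 means the overlay routes a member as efficiently as unicast; location-aware membership aims to keep this ratio close to 1.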

Relevance: 10.00%

Abstract:

Anhydrous ethanol is used in the chemical, pharmaceutical, and fuel industries. However, current processes for obtaining it involve high cost, high energy demand, and the use of toxic and polluting solvents. The problem arises from the formation of an azeotropic ethanol + water mixture, which prevents complete separation by conventional methods such as simple distillation. As an alternative to the processes currently used, this study proposes the use of ionic liquids as solvents in extractive distillation. These are organic salts that are liquid at low temperatures (below 373.15 K). They exhibit characteristics such as low volatility (near-zero vapor pressure), thermal stability, and low corrosiveness, which make them interesting for applications such as catalysts and entrainers. In this work, experimental data were obtained for the vapor pressure of pure ethanol and water in the pressure range of 20 to 101 kPa, for the vapor-liquid equilibrium (VLE) of the ethanol + water system at atmospheric pressure, and for the equilibrium of ethanol + water + 2-HDEAA (2-hydroxydiethanolamine acetate) at strategic points of the diagram. The apparatus used in these experiments was a Fischer ebulliometer, together with density measurements to determine the phase compositions. The experimental data were consistent with literature data and showed thermodynamic consistency, so the methodology was properly validated. The results were favorable, with an increase of the ethanol concentration in the vapor phase, although the increase was not pronounced. The predictive COSMO-SAC model (COnductor-like Screening MOdels Segment Activity Coefficient) proposed by Lin & Sandler (2002) was studied for predicting the vapor-liquid equilibrium of ethanol + water + ionic liquid systems at atmospheric pressure.
This is an attractive alternative for predicting phase equilibria, especially for substances of recent interest such as ionic liquids, because it requires neither experimental data nor functional-group parameters (as in the UNIFAC method).
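The VLE behaviour discussed above can be sketched with the modified Raoult's law, y_i P = x_i γ_i P_i^sat. The Antoine constants below are common literature values used for illustration only, and the ideal-solution assumption (γ = 1) is a deliberate simplification, not the COSMO-SAC calculation:

```python
def p_sat_mmhg(A, B, C, T_c):
    """Antoine equation: log10(Psat) = A - B / (C + T), T in deg C."""
    return 10 ** (A - B / (C + T_c))

# Antoine constants (P in mmHg, T in deg C) -- common literature values.
ETHANOL = (8.20417, 1642.89, 230.300)
WATER = (8.07131, 1730.63, 233.426)

def vapor_fraction_ethanol(x_eth, T_c, gamma_eth=1.0, gamma_wat=1.0):
    """Modified Raoult's law: y_i * P = x_i * gamma_i * Psat_i."""
    p_eth = x_eth * gamma_eth * p_sat_mmhg(*ETHANOL, T_c)
    p_wat = (1 - x_eth) * gamma_wat * p_sat_mmhg(*WATER, T_c)
    return p_eth / (p_eth + p_wat)

# Ideal-solution estimate at 90 deg C for a 30 mol% ethanol liquid:
# the vapor is enriched in ethanol, the more volatile component.
y = vapor_fraction_ethanol(0.30, 90.0)
```

An entrainer such as an ionic liquid would enter through the activity coefficients, shifting the vapor further toward ethanol and breaking the azeotrope.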

Relevance: 10.00%

Abstract:

A sensitive, precise, and specific high-performance liquid chromatographic (HPLC) method was developed for the assay of gatifloxacin (GATX) in raw material and tablets. The method validation parameters, which included range, linearity, precision, accuracy, specificity, and recovery, yielded good results. It was also found that the excipients in the commercial tablet preparation did not interfere with the assay. The HPLC separation was carried out by reversed-phase chromatography on a C18 Absorbosphere column (250 x 4.6 mm id, 5 μm particle size) with a mobile phase composed of 5% acetic acid-acetonitrile-methanol (70 + 15 + 15, v/v/v), pumped isocratically at a flow rate of 1.0 mL/min. The effluent was monitored at 287 nm. The calibration graph for GATX was linear from 4.0 to 14.0 μg/mL. The interday and intraday precisions (relative standard deviation) were less than 1.05%.

Relevance: 10.00%

Abstract:

A sensitive, precise, and specific high-performance liquid chromatography (HPLC) method was developed for the assay of lomefloxacin (LFLX) in raw material and tablet preparations. The method validation parameters, which included range, linearity, precision, accuracy, specificity, and recovery, yielded good results. It was also found that the excipients in the commercial tablet preparation did not interfere with the assay. The HPLC separation was performed on a reversed-phase Phenomenex C18 column (150 x 4.6 mm id, 5 μm particle size) with a mobile phase composed of 1% acetic acid-acetonitrile-methanol (70 + 15 + 15, v/v/v), pumped isocratically at a flow rate of 1.0 mL/min. The effluent was monitored at 280 nm. The calibration graph for LFLX was linear from 2.0 to 7.0 μg/mL. The interday and intraday precisions (relative standard deviation) were less than 1.0%. The method was applied to the quality control of commercial LFLX tablets to quantitate the drug.

Relevance: 10.00%

Abstract:

A rapid, accurate, and sensitive high-performance liquid chromatographic (HPLC) method was developed and validated for the determination of ceftazidime in pharmaceuticals. The method validation parameters, which included range, linearity, precision, accuracy, specificity, and recovery, yielded good results. The excipients in the commercial powder for injection did not interfere with the assay. Reversed-phase chromatography was used for the HPLC separation on a Waters C18 (WAT 054275; Milford, MA) column with methanol-water (70 + 30, v/v) as the mobile phase, pumped isocratically at a flow rate of 1.0 mL/min. The effluent was monitored at 245 nm. The calibration graph for ceftazidime was linear from 50.0 to 300.0 μg/mL. The values for interday and intraday precision (relative standard deviation) were <1%. The results obtained by the HPLC method were evaluated statistically by analysis of variance. We conclude that the HPLC method is satisfactory for the determination of ceftazidime in raw material and pharmaceuticals.
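The interday/intraday precision figures reported in these assays are relative standard deviations over replicate measurements; a minimal sketch of the calculation (the replicate values below are hypothetical):

```python
def rsd_percent(values):
    """Relative standard deviation (coefficient of variation), in %."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return 100.0 * sd / mean

# Hypothetical replicate assay results (micrograms/mL) for one day
replicates = [249.1, 250.4, 248.8, 251.0, 249.7]
rsd = rsd_percent(replicates)  # intraday precision for this set
```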

Relevance: 10.00%

Abstract:

It is known that sleep plays an important role in motor learning. Recent studies have shown that sleep between training on a motor task and the retention test promotes learning of the task more than wakefulness alone between training and testing. These findings have also been reported in stroke patients; however, few studies have investigated how this relationship affects functionality in this population. The objective of this study was to evaluate the relationship between functionality and sleep in patients in the chronic stage of stroke. A cross-sectional observational study was conducted. The sample was composed of 30 stroke individuals in the chronic phase, between 6 and 60 months after injury and aged between 55 and 75 years. The volunteers were initially evaluated for clinical data of the disease and personal history, for stroke severity through the National Institutes of Health Stroke Scale, and for mental status through the Mini-Mental State Examination. The sleep assessment tools were the Pittsburgh Sleep Quality Index, the Horne and Östberg questionnaire, the Epworth Sleepiness Scale, the Berlin questionnaire, and actigraphy, whose measures were: actual sleep time, wake after sleep onset, percentage of wake after sleep onset, sleep efficiency, sleep latency, sleep fragmentation index, and mean activity score. Other actigraphy measures were intradaily variability, interdaily stability, the 5-hour period with the minimum level of activity (L5), and the 10-hour period with maximum activity (M10), obtained to evaluate the activity-rest rhythm. The Functional Independence Measure (FIM) and the Berg Balance Scale (BBS) were the instruments used to evaluate the functional status of the participants. The Spearman correlation coefficient and comparison tests (Student's t and Mann-Whitney) were used to analyze the relationship of the sleep assessment tools and the rest-activity rhythm to the functional assessment measures.
SPSS 16.0 was used for the analysis, adopting a significance level of 5%. The main results were a negative correlation between sleepiness and balance and a negative correlation between the level of activity (M10) and sleep fragmentation. No sleep or rhythm measurement was associated with the Functional Independence Measure. These findings suggest that there may be an association between sleepiness and balance in patients in the chronic stage of stroke, and that a higher level of activity may be associated with a better sleep pattern and a more stable, less fragmented rhythm. Future studies should evaluate the cause-effect relationship between these parameters.
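M10 and L5 are derived from the actigraphy record as the most active 10-hour and least active 5-hour windows of the day; a minimal sketch over hypothetical hourly activity counts:

```python
# M10 / L5 from hourly activity counts: the mean activity in the most
# active 10-hour window (M10) and the least active 5-hour window (L5).
# The 24 hourly counts below are hypothetical.

def window_means(counts, width):
    """Mean activity of every contiguous window of `width` hours."""
    return [sum(counts[i:i + width]) / width
            for i in range(len(counts) - width + 1)]

counts = [5, 3, 2, 2, 4, 20, 60, 80, 90, 85, 70, 75,
          80, 65, 70, 60, 55, 50, 40, 30, 20, 10, 8, 6]
m10 = max(window_means(counts, 10))
l5 = min(window_means(counts, 5))
```

A large M10 and a small L5 indicate a robust rest-activity rhythm, which is the pattern the abstract links to a better sleep profile.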

Relevance: 10.00%

Abstract:

This study aims to understand the experience of people suffering from mental disorders. The patients are enrolled in a mental health outpatient clinic in the city of Natal (RN). Mental disorders are growing rapidly in the contemporary world and are a source of intense mental suffering. Besides being strongly marked by a history of isolation and prejudice, patients have been the target of real atrocities committed in the name of preserving a supposed normality. The understanding and treatment of these disorders are influenced by cultural and historical inferences, depending on the period in which they are experienced. Semi-structured interviews were conducted with a group of users, with the emphasis on giving voice to their uniqueness and individuality, highlighting how each one perceives his or her own experience. These were recorded and later transcribed, identifying the core meanings. The results were analyzed from the perspective of Existential-Humanist Phenomenology, which seeks to unveil the phenomenon without imposing fixed truths, highlighting the experience of mental disorder as a way of living, permeated by mental suffering and influenced by social problems, assuming contours very particular to each individual. Some progress has been perceived, even by the users, with respect to the paradigm shift in the way of care, but there is still a consistent emphasis on medical treatment and drug use. The changes point to the need for services to replace the asylum-hospital model and, in addition, to accept the person with a mental disorder as a citizen, a bearer of rights who should be accepted and respected by society. Despite the pain expressed and its close connection with suicide, the participants' reports are full of perspectives and attitudes of confrontation in facing life, pointing to new possibilities of being, recreating themselves.

Relevance: 10.00%

Abstract:

Research in Requirements Engineering has been growing in recent years. Researchers are concerned with a set of open issues such as communication between the several user profiles involved in software engineering, scope definition, and volatility and traceability issues. To cope with these issues, a number of works concentrate on (i) defining processes to collect clients' specifications in order to solve scope issues; (ii) defining models to represent requirements in order to address communication and traceability issues; and (iii) working on mechanisms and processes to be applied to requirements modeling in order to facilitate requirements evolution and maintenance, addressing volatility and traceability issues. We propose an iterative model-driven process to solve these issues, based on a double-layered CIM to communicate requirements-related knowledge to a wider range of stakeholders. We also present a tool to help requirements engineers throughout the RE process. Finally, we present a case study to illustrate the benefits and usage of the process and tool.

Relevance: 10.00%

Abstract:

Cyclodextrins (CDs) are cyclic oligosaccharides composed of D-glucose monomers joined by α-1,4-D-glycosidic linkages. The main types of CDs are α-, β-, and γ-CDs, consisting of cycles of six, seven, and eight glucose monomers, respectively. Their ability to form inclusion complexes is their most important characteristic, allowing wide industrial application. The physical properties of a CD-complexed compound can be altered to improve stability, volatility, solubility, or bioavailability. Cyclomaltodextrin glucanotransferase (CGTase, EC 2.4.1.19) is an enzyme capable of converting starch into CD molecules. In this work, the CGTase produced by Bacillus clausii strain E16 was used to produce CDs from maltodextrin and different starches (commercial soluble starch and corn, cassava, sweet potato, and waxy corn starches) as substrates. It was observed that the substrate source influences the kind of CD obtained and that this CGTase displays a β-CGTase action, presenting the best conversion for soluble starch at 1.0%, of which 80% was converted into CDs. The ratio of total CDs produced was 0:0.89:0.11 for α/β/γ. It was also observed that root and tuber starches were more accessible to CGTase action than seed starch under the studied conditions.

Relevance: 10.00%

Abstract:

There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is a Gaussian distribution. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distributions of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies of these instruments. (C) 2004 Elsevier B.V. All rights reserved.
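The Gaussian benchmark that the Edgeworth expansion corrects is the standard Black-Scholes formula; a minimal sketch of the European call price using only the standard library (the contract parameters are hypothetical):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Hypothetical contract: at-the-money call, 1 year, 10% vol, 5% rate
price = bs_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.10)
```

An Edgeworth-based price adds skewness and kurtosis correction terms on top of this Gaussian value; calibrating them is what flattens the implied volatility smile.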

Relevance: 10.00%

Abstract:

In this paper we study the possible microscopic origin of heavy-tailed probability density distributions for the price variation of financial instruments. We extend the standard log-normal process to include another random component in the so-called stochastic volatility models. We study these models under an assumption, akin to the Born-Oppenheimer approximation, in which the volatility has already relaxed to its equilibrium distribution and acts as a background to the evolution of the price process. In this approximation, we show that all models of stochastic volatility should exhibit a scaling relation in the time lag of zero-drift modified log-returns. We verify that the Dow Jones Industrial Average index indeed follows this scaling. We then focus on two popular stochastic volatility models, the Heston and Hull-White models. In particular, we show that in the Hull-White model the resulting probability distribution of log-returns in this approximation corresponds to the Tsallis (Student's t) distribution. The Tsallis parameters are given in terms of the microscopic stochastic volatility model. Finally, we show that the log-returns for 30 years of Dow Jones index data are well fitted by a Tsallis distribution, obtaining the relevant parameters. (c) 2007 Elsevier B.V. All rights reserved.
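A stochastic volatility process of the kind discussed here can be sketched with a simple Euler discretization of the Heston dynamics; the noises are taken as uncorrelated and the parameters are hypothetical, for illustration only:

```python
import math
import random

def simulate_heston(n, dt, mu, kappa, theta, xi, v0, seed=0):
    """Euler scheme for dS/S = mu dt + sqrt(v) dW1,
    dv = kappa*(theta - v) dt + xi*sqrt(v) dW2 (uncorrelated noises).
    Returns the list of simulated log-returns."""
    rng = random.Random(seed)
    v, log_returns = v0, []
    for _ in range(n):
        v = max(v, 0.0)  # truncate to keep the variance non-negative
        dw1 = rng.gauss(0.0, math.sqrt(dt))
        dw2 = rng.gauss(0.0, math.sqrt(dt))
        log_returns.append((mu - 0.5 * v) * dt + math.sqrt(v) * dw1)
        v += kappa * (theta - v) * dt + xi * math.sqrt(v) * dw2
    return log_returns

# Hypothetical parameters: daily steps over roughly four years
rets = simulate_heston(n=1000, dt=1 / 252, mu=0.05,
                       kappa=2.0, theta=0.04, xi=0.3, v0=0.04)
```

Because the variance itself fluctuates, the simulated returns mix Gaussians of different widths, which is exactly the mechanism producing the heavy tails the paper analyzes.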

Relevance: 10.00%

Abstract:

We investigate the Heston model with stochastic volatility and exponential tails as a model for the typical price fluctuations of the Brazilian São Paulo Stock Exchange Index (IBOVESPA). Raw prices are first corrected for inflation, and a period spanning 15 years characterized by memoryless returns is chosen for the analysis. Model parameters are estimated by observing volatility scaling and correlation properties. We show that the Heston model with at least two time scales for the mean-reverting volatility dynamics satisfactorily describes price fluctuations ranging from time scales larger than 20 min to 160 days. At time scales shorter than 20 min we observe autocorrelated returns and power-law tails incompatible with the Heston model. Despite the major regulatory changes, hyperinflation, and currency crises experienced by the Brazilian market in the period studied, the general success of the description may be regarded as evidence of a general underlying dynamics of price fluctuations at intermediate mesoeconomic time scales that is well approximated by the Heston model. We also note that the connection between the Heston model and Ehrenfest urn models could be exploited to bring new insights into the microeconomic market mechanics. (c) 2005 Elsevier B.V. All rights reserved.