915 results for Conditional Directed Graph


Relevance: 20.00%

Abstract:

We develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility in essay 1. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying, and the covariance of the informational and noise components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
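The informativeness measure defined above, informational variance over total variance with an unrestricted covariance term, can be sketched as a small computation. This is an illustrative formula only, not the authors' estimator:

```python
def informativeness(info_var, noise_var, cov=0.0):
    """Price informativeness: informational variance over total return
    variance. The covariance term is free rather than restricted to zero,
    mirroring the decomposition described in the abstract."""
    total_var = info_var + noise_var + 2.0 * cov
    return info_var / total_var

informativeness(1.0, 1.0)        # equal components, zero covariance -> 0.5
informativeness(2.0, 1.0, 0.5)   # a positive covariance inflates total variance
```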

Relevance: 20.00%

Abstract:

Background: Newfoundland and Labrador has a high incidence of type 1 diabetes, and diabetic ketoacidosis (DKA) is a complication of type 1 diabetes. A clinical practice guideline was developed for the treatment of pediatric DKA to standardize care in all Emergency Departments and improve patient outcomes. Rural emergency nurses are required to maintain their competency and acquire new knowledge, as stated by the Association of Registered Nurses of Newfoundland and Labrador (ARNNL). Purpose: The purpose of this practicum was to develop a self-learning module for rural emergency nurses to increase their knowledge and understanding of the clinical practice guideline to assess, treat, and prevent pediatric DKA. Methods: Two methodologies were used in this practicum: a review of the literature and consultations with key stakeholders. Results: The self-learning module created was composed of three units and focused on the learning needs of rural emergency nurses in the areas of assessment, treatment, and prevention of pediatric DKA. Conclusion: The goal of the practicum was to increase rural emergency nurses' knowledge and implementation of the clinical practice guideline when assessing and treating children and families experiencing DKA to improve patient outcomes. A planned evaluation of the self-learning module will be conducted following dissemination of the module throughout the rural Emergency Departments.

Relevance: 20.00%

Abstract:

This thesis is set in the context of Delay- and Disruption-Tolerant Networks (DTN), a telecommunications network architecture aimed at enabling communication between nodes of so-called "challenged" networks, which must cope with problems such as long propagation delays, high error rates, and periods of lost connectivity. The Bundle layer, a new layer inserted between the transport and application layers of the ISO/OSI architecture, and its associated protocol, the Bundle Protocol (BP), were designed to make communication possible in these networks. A long time may pass between the reception and the forwarding of a bundle because the next link is unavailable; during this period the bundle remains stored in a local database. Several implementations of the DTN architecture exist, such as DTN2, the reference implementation, and ION (Interplanetary Overlay Network), developed by NASA JPL for use in space applications; in these, contacts between nodes are deterministic, unlike terrestrial networks, where contacts are generally opportunistic (not known in advance). For this reason ION includes a routing algorithm, CGR (Contact Graph Routing), designed to operate in environments with deterministic connectivity. An algorithm that operates in non-deterministic environments, OCGR (Opportunistic Contact Graph Routing), which extends CGR, is under investigation. The goal of this thesis is to provide a detailed description of how OCGR works, necessarily starting from CGR, on which it is based; to run preliminary tests requested by NASA JPL; and to analyze the results to assess the algorithm's usability and potential for improvement. The DTN environment and the main routing algorithms for opportunistic environments are also described. The final part presents the DTN simulator "The ONE" and the integration of CGR and OCGR into it.
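The core idea behind CGR, searching a time-annotated contact plan for the route with the earliest arrival, can be sketched as follows. This is a deliberately minimal illustration: the real algorithm in ION also models transmission rates, contact volumes, and queueing delays.

```python
import heapq

def earliest_arrival(contacts, source, dest, t0=0.0):
    """Earliest-arrival search over a contact plan. Each contact is
    (from_node, to_node, t_start, t_end); transmission is treated as
    instantaneous, so arrival time is when the contact is first usable."""
    best = {source: t0}
    frontier = [(t0, source)]
    while frontier:
        t, node = heapq.heappop(frontier)
        if node == dest:
            return t
        for frm, to, start, end in contacts:
            if frm != node or end < t:
                continue                      # wrong node, or contact already over
            arrive = max(t, start)            # wait for the contact to open
            if arrive < best.get(to, float("inf")):
                best[to] = arrive
                heapq.heappush(frontier, (arrive, to))
    return None                               # no route in the contact plan

plan = [("A", "B", 10, 20), ("B", "C", 15, 30), ("A", "C", 40, 50)]
earliest_arrival(plan, "A", "C")   # -> 15: relaying via B beats waiting for A-C
```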

Relevance: 20.00%

Abstract:

Omnibus tests of significance in contingency tables use statistics of the chi-square type. When the null is rejected, residual analyses are conducted to identify cells in which observed frequencies differ significantly from expected frequencies. Residual analyses are thus conditioned on a significant omnibus test. Conditional approaches have been shown to substantially alter type I error rates in cases involving t tests conditional on the results of a test of equality of variances, or tests of regression coefficients conditional on the results of tests of heteroscedasticity. We show that residual analyses conditional on a significant omnibus test are also affected by this problem, yielding type I error rates that can be up to 6 times larger than nominal rates, depending on the size of the table and the form of the marginal distributions. We explored several unconditional approaches in search of a method that maintains the nominal type I error rate and found that a bootstrap correction for multiple testing achieved this goal. The validity of this approach is documented for two-way contingency tables in the contexts of tests of independence, tests of homogeneity, and fitting psychometric functions. Computer code in MATLAB and R to conduct these analyses is provided as Supplementary Material.
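The bootstrap correction described above can be sketched roughly as follows: resample tables under independence, record the maximum absolute residual in each replicate, and use its upper quantile as a single critical value for all cells. This is an illustrative reconstruction using standardized (rather than adjusted) residuals, not the authors' MATLAB/R code:

```python
import numpy as np

rng = np.random.default_rng(0)

def std_residuals(table):
    """Standardized (Pearson) residuals; the paper works with adjusted
    residuals, but the bootstrap logic is identical."""
    n = table.sum()
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    return (table - expected) / np.sqrt(expected)

def bootstrap_critical(table, B=2000, alpha=0.05):
    """Upper quantile of the maximum |residual| under independence: one
    critical value for every cell, controlling the familywise error rate."""
    n = int(table.sum())
    p = np.outer(table.sum(axis=1), table.sum(axis=0)) / n**2
    maxima = np.empty(B)
    for b in range(B):
        resampled = rng.multinomial(n, p.ravel()).reshape(table.shape)
        with np.errstate(divide="ignore", invalid="ignore"):
            maxima[b] = np.nanmax(np.abs(std_residuals(resampled)))
    return np.quantile(maxima, 1.0 - alpha)

observed = np.array([[30, 10], [20, 40]])
critical = bootstrap_critical(observed)
significant = np.abs(std_residuals(observed)) > critical  # corrected flags
```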

Relevance: 20.00%

Abstract:

This doctoral thesis sets out to understand, analyze and, above all, model the statistical behavior of financial time series. In this regard, the models that best capture the special characteristics of these series are conditional heteroscedasticity models in discrete time, when the sampling intervals of the data permit it, and in continuous time when we have daily or intraday data. To this end, this thesis proposes several Bayesian estimators for the parameters of discrete-time GARCH models (Bollerslev (1986)) and continuous-time COGARCH models (Klüppelberg et al. (2004)). Chapter 1 introduces the characteristics of financial series and presents the ARCH, GARCH and COGARCH models, together with their main properties. Mandelbrot (1963) pointed out that financial series are not stationary and that their increments are uncorrelated, although their squares are correlated. He also noted that their volatility is not constant and that volatility clusters appear. He observed the non-normality of financial series, due mainly to their leptokurtic behavior, and also highlighted the seasonal effects these series exhibit, analyzing how they are affected by the time of year or the day of the week. Later, Black (1976) completed the list of special characteristics by including the so-called leverage effects, related to how positive and negative fluctuations in asset prices affect the volatility of the series differently.
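The volatility clustering and fat tails Mandelbrot described are exactly what a GARCH(1,1) process reproduces; a minimal simulation sketch (parameter values are illustrative, not estimates from the thesis):

```python
import numpy as np

def simulate_garch11(n, omega=0.05, alpha=0.1, beta=0.85, seed=1):
    """Simulate GARCH(1,1): sigma2[t] = omega + alpha*r[t-1]**2 + beta*sigma2[t-1],
    with r[t] = sqrt(sigma2[t]) * z[t], z[t] ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)   # unconditional variance
    r[0] = np.sqrt(sigma2[0]) * rng.standard_normal()
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, sigma2

r, s2 = simulate_garch11(5000)
# Returns are nearly uncorrelated, but squared returns are autocorrelated
# (volatility clustering), reproducing the stylized facts listed above.
```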

Relevance: 20.00%

Abstract:

Thesis digitized by the Direction des bibliothèques of the Université de Montréal.

Relevance: 20.00%

Abstract:

Intermittent exporting is something of a puzzle. In theory, exporting represents a major commitment, and is often the starting point for further internationalisation. However, intermittent exporters exit and subsequently re-enter exporting, sometimes frequently. We develop a conceptual model to explain how firm characteristics and market conditions interact to affect the decision to exit and re-enter exporting, and model this process using an extensive dataset of French manufacturing firms from 1997 to 2007. As anticipated, smaller and less productive firms are more likely to exit exporting, and react more strongly to changes in both domestic and foreign markets than larger firms. Exit and re-entry are closely linked. Firms with a low exit probability also have a high likelihood of re-entry, and vice versa. However, the way in which firms react to market conditions at the time of exit matters greatly in determining the likelihood of re-entry: thus re-entry depends crucially on the strategic rationale for exit. Our analysis helps explain the opportunistic and intermittent exporting of (mainly) small firms, the demand conditions under which intermittent exporting is most likely to occur, and the firm attributes most likely to give rise to such behavior.

Relevance: 20.00%

Abstract:

© 2014, Canadian Anesthesiologists' Society. Optimal perioperative fluid management is an important component of Enhanced Recovery After Surgery (ERAS) pathways. Fluid management within ERAS should be viewed as a continuum through the preoperative, intraoperative, and postoperative phases. Each phase is important for improving patient outcomes, and suboptimal care in one phase can undermine best practice within the rest of the ERAS pathway. The goal of preoperative fluid management is for the patient to arrive in the operating room in a hydrated and euvolemic state. To achieve this, prolonged fasting is not recommended, and routine mechanical bowel preparation should be avoided. Patients should be encouraged to ingest a clear carbohydrate drink two to three hours before surgery. The goals of intraoperative fluid management are to maintain central euvolemia and to avoid excess salt and water. To achieve this, patients undergoing surgery within an enhanced recovery protocol should have an individualized fluid management plan. As part of this plan, excess crystalloid should be avoided in all patients. For low-risk patients undergoing low-risk surgery, a “zero-balance” approach might be sufficient. In addition, for most patients undergoing major surgery, individualized goal-directed fluid therapy (GDFT) is recommended. Ultimately, however, the additional benefit of GDFT should be determined based on surgical and patient risk factors. Postoperatively, once fluid intake is established, intravenous fluid administration can be discontinued and restarted only if clinically indicated. In the absence of other concerns, detrimental postoperative fluid overload is not justified and “permissive oliguria” could be tolerated.

Relevance: 20.00%

Abstract:

Copyright © 2014 International Anesthesia Research Society. BACKGROUND: Goal-directed fluid therapy (GDFT) is associated with improved outcomes after surgery. The esophageal Doppler monitor (EDM) is widely used, but has several limitations. The NICOM, a completely noninvasive cardiac output monitor (Cheetah Medical), may be appropriate for guiding GDFT. No prospective studies have compared the NICOM and the EDM. We hypothesized that the NICOM is not significantly different from the EDM for monitoring during GDFT. METHODS: One hundred adult patients undergoing elective colorectal surgery participated in this study. Patients in phase I (n = 50) had intraoperative GDFT guided by the EDM while the NICOM was connected, and patients in phase II (n = 50) had intraoperative GDFT guided by the NICOM while the EDM was connected. Each patient's stroke volume was optimized using 250-mL colloid boluses. Agreement between the monitors was assessed, and patient outcomes (postoperative pain, nausea, and return of bowel function), complications (renal, pulmonary, infectious, and wound complications), and length of hospital stay (LOS) were compared. RESULTS: Using a 10% increase in stroke volume after fluid challenge, agreement between monitors was 60% at 5 minutes, 61% at 10 minutes, and 66% at 15 minutes, with no significant systematic disagreement (McNemar P > 0.05) at any time point. The EDM had significantly more missing data than the NICOM. No clinically significant differences were found in total LOS or other outcomes. The mean LOS was 6.56 ± 4.32 days in phase I and 6.07 ± 2.85 days in phase II, and 95% confidence limits for the difference were -0.96 to +1.95 days (P = 0.5016). CONCLUSIONS: The NICOM performs similarly to the EDM in guiding GDFT, with no clinically significant differences in outcomes, and offers increased ease of use as well as fewer missing data points. The NICOM may be a viable alternative monitor to guide GDFT.
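The McNemar test used above checks whether the two monitors disagree systematically on their paired fluid-responsiveness classifications; an exact version can be sketched as follows (the discordant-pair counts in the example are hypothetical, not the study's data):

```python
from math import comb

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar test on the discordant pairs: b cases where
    only monitor A flags a responder, c cases where only monitor B does.
    Under the null of no systematic disagreement, b ~ Binomial(b + c, 0.5)."""
    n = b + c
    k = min(b, c)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2.0 ** n
    return min(1.0, 2.0 * tail)   # double the smaller tail, capped at 1

mcnemar_exact_p(8, 12)   # ~0.50: disagreement looks symmetric (P > 0.05)
mcnemar_exact_p(0, 15)   # far below 0.05: one monitor systematically differs
```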

Relevance: 20.00%

Abstract:

Thesis digitized by the Direction des bibliothèques of the Université de Montréal.

Relevance: 20.00%

Abstract:

Marine organisms have to cope with increasing CO2 partial pressures and decreasing pH in the oceans. We elucidated the impacts of an 8-week acclimation period to four seawater pCO2 treatments (39, 113, 243 and 405 Pa/385, 1,120, 2,400 and 4,000 µatm) on mantle gene expression patterns in the blue mussel Mytilus edulis from the Baltic Sea. Based on the M. edulis mantle tissue transcriptome, the expression of several genes involved in metabolism, calcification and stress responses was assessed in the outer (marginal and pallial zone) and the inner mantle tissues (central zone) using quantitative real-time PCR. The expression of genes involved in energy and protein metabolism (F-ATPase, hexokinase and elongation factor alpha) was strongly affected by acclimation to moderately elevated CO2 partial pressures. Expression of a chitinase, potentially important for the calcification process, was strongly depressed (maximum ninefold), correlating with a linear decrease in shell growth observed in the experimental animals. Interestingly, shell matrix protein candidate genes were less affected by CO2 in both tissues. A compensatory process toward enhanced shell protection is indicated by a massive increase in the expression of tyrosinase, a gene involved in periostracum formation (maximum 220-fold). Using correlation matrices and a force-directed layout network graph, we were able to uncover possible underlying regulatory networks and the connections between different pathways, thereby providing a molecular basis of observed changes in animal physiology in response to ocean acidification.
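The correlation-matrix step can be illustrated with a toy co-expression network: genes become nodes, and strong pairwise correlations of expression across treatments become edges, which a force-directed layout would then display. The expression profiles below are hypothetical, not the study's data:

```python
import numpy as np

def correlation_edges(expr, genes, threshold=0.8):
    """Edges of a co-expression network: gene pairs whose expression
    profiles across treatments correlate with |r| >= threshold
    (illustrative sketch, not the authors' analysis pipeline)."""
    corr = np.corrcoef(expr)          # genes x genes correlation matrix
    edges = []
    for i in range(len(genes)):
        for j in range(i + 1, len(genes)):
            if abs(corr[i, j]) >= threshold:
                edges.append((genes[i], genes[j], round(corr[i, j], 2)))
    return edges

# Hypothetical relative expression of four genes across four pCO2 treatments
genes = ["tyrosinase", "chitinase", "hexokinase", "F-ATPase"]
expr = np.array([
    [1.0, 2.0, 8.0, 220.0],   # strongly induced at high pCO2
    [1.0, 0.8, 0.3, 0.1],     # depressed with rising pCO2
    [1.0, 0.9, 0.5, 0.2],     # co-regulated with chitinase
    [1.0, 1.1, 0.6, 0.3],
])
edges = correlation_edges(expr, genes)
```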

Relevance: 20.00%

Abstract:

Graph analytics is an important and computationally demanding class of data analytics. It is essential to balance scalability, ease of use, and high performance in large-scale graph analytics. As such, it is necessary to hide the complexity of parallelism, data distribution, and memory locality behind an abstract interface. The aim of this work is to build a NUMA-aware, scalable graph analytics framework that does not demand significant parallel programming experience.
The realization of such a system faces two key problems:
(i) how to develop a scale-free parallel programming framework that scales efficiently across NUMA domains; (ii) how to efficiently apply graph partitioning in order to create separate and largely independent work items that can be distributed among threads.
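Problem (ii) can be illustrated with a toy partitioner that splits a CSR graph into contiguous vertex ranges balanced by edge count, so that each thread (or NUMA domain) receives similar work. This is a hypothetical sketch of the idea, not the framework's actual implementation:

```python
def partition_by_edges(offsets, parts):
    """Split a CSR vertex range into `parts` contiguous chunks with roughly
    equal edge counts. `offsets` is the CSR row-offset array
    (length = n_vertices + 1, offsets[-1] = total edge count)."""
    total = offsets[-1]
    bounds = [0]
    v = 0
    for p in range(1, parts):
        target = total * p // parts       # cumulative edge count to reach
        while v < len(offsets) - 1 and offsets[v + 1] <= target:
            v += 1
        bounds.append(v)
    bounds.append(len(offsets) - 1)
    return [(bounds[i], bounds[i + 1]) for i in range(parts)]

# 6 vertices with degrees 5, 1, 1, 1, 1, 1 (offsets are cumulative degrees)
offsets = [0, 5, 6, 7, 8, 9, 10]
ranges = partition_by_edges(offsets, 2)   # -> [(0, 1), (1, 6)]: 5 edges each
```

Balancing by edges rather than vertices matters because real graphs have skewed degree distributions: an even vertex split can leave one thread with most of the work.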

Relevance: 20.00%

Abstract:

Traditional heuristic approaches to the Examination Timetabling Problem normally utilize a stochastic method during optimization for the selection of the next examination to be considered for timetabling within the neighbourhood search process. This paper presents a technique whereby the stochastic method has been augmented with information from a weighted list gathered during the initial adaptive construction phase, with the purpose of intelligently directing examination selection. In addition, a reinforcement learning technique has been adapted to identify the most effective portions of the weighted list in terms of facilitating the greatest potential for overall solution improvement. The technique is tested against the 2007 International Timetabling Competition datasets with solutions generated within a time frame specified by the competition organizers. The results generated are better than those of the competition winner on seven of the twelve datasets, while being competitive on the remaining five. This paper also shows experimentally how using reinforcement learning has improved upon our previous technique.
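The directed selection described above resembles an epsilon-greedy policy: mostly exploit the construction-phase weights, occasionally fall back to the original uniform stochastic choice. A hypothetical sketch of that idea (the paper's actual mechanism may differ):

```python
import random

def select_exam(weighted_list, rng, epsilon=0.2):
    """Epsilon-greedy selection: with probability epsilon fall back to the
    original uniform stochastic choice (exploration); otherwise pick the
    exam with the highest construction-phase weight (exploitation)."""
    if rng.random() < epsilon:
        return rng.choice(weighted_list)[0]
    return max(weighted_list, key=lambda item: item[1])[0]

# (exam id, weight gathered during the adaptive construction phase)
weights = [("E1", 0.9), ("E2", 0.4), ("E3", 0.7)]
rng = random.Random(42)
picks = [select_exam(weights, rng) for _ in range(1000)]
# The hardest-to-place exam "E1" dominates, but every exam is still reachable.
```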