989 results for Timing analysis


Relevance:

30.00%

Publisher:

Abstract:

The purpose of this work was to collect dependability data on the flue gas line of two Finnish pulp mills, from their commissioning up to the present day. Dependability data consists of reliability data and maintenance data. The collected data makes it possible to describe the plant's dependability accurately with the following indicators: the number of unplanned failures and their repair times, equipment downtime, the probability of failures, and corrective-maintenance costs relative to the total corrective-maintenance costs of the flue gas line. The method used to collect the dependability data is presented. The method used to identify the critical devices of the flue gas line is a combination of a questionnaire survey and a modified failure mode, effects, and criticality analysis. The criteria for selecting devices for the final criticality analysis were decided on the basis of the dependability data and the questionnaire survey. The purpose of identifying the critical devices is to find those devices in the flue gas line whose unexpected failure has the most severe consequences for the reliability, production, safety, emissions, and costs of the flue gas line. With this information, limited maintenance resources can be allocated correctly. The criticality assessment found that the three most critical devices in the flue gas line are common to both pulp mills: flue gas fans, drag conveyors, and chain conveyors. The dependability data shows that equipment reliability is mill-specific, but broadly the same main trends can be seen in the figures showing the probability of unplanned failures. The costs, expressed as the ratio of a device's unplanned maintenance costs to the total costs of the flue gas line, closely follow the reliability curve calculated as the ratio of the device's downtime to its operating hours. Collecting dependability data, combined with identifying the critical devices, enables preventive maintenance to be correctly targeted and timed over the equipment's lifetime so that reliability and cost-efficiency requirements are met.
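The indicators named above reduce to simple ratios over a failure log. The following is a minimal, hypothetical Python sketch (all field names and figures are invented for illustration, not taken from the thesis) of how the downtime ratio and corrective-maintenance cost share could be computed per device:

```python
# Minimal sketch (hypothetical data and field names): computing the
# dependability indicators named in the abstract from a failure log.
from dataclasses import dataclass

@dataclass
class FailureEvent:
    device: str
    repair_hours: float   # downtime caused by the unplanned failure
    repair_cost: float    # corrective-maintenance cost of the event

def device_indicators(events, device, operating_hours):
    """Unplanned-failure count, downtime, downtime/operating-hours ratio,
    and the device's share of total corrective-maintenance cost."""
    own = [e for e in events if e.device == device]
    downtime = sum(e.repair_hours for e in own)
    total_cost = sum(e.repair_cost for e in events) or 1.0
    return {
        "failures": len(own),
        "downtime_h": downtime,
        "downtime_ratio": downtime / operating_hours,
        "cost_share": sum(e.repair_cost for e in own) / total_cost,
    }

log = [FailureEvent("flue gas fan", 12.0, 8_000.0),
       FailureEvent("drag conveyor", 30.0, 15_000.0),
       FailureEvent("flue gas fan", 6.0, 4_000.0)]
print(device_indicators(log, "flue gas fan", operating_hours=8_760.0))
```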

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: A previous individual patient data meta-analysis by the Meta-Analysis of Chemotherapy in Nasopharynx Carcinoma (MAC-NPC) collaborative group showed that adding chemotherapy to radiotherapy improves overall survival in nasopharyngeal carcinoma. This benefit was restricted to patients receiving concomitant chemotherapy and radiotherapy. The aim of this study was to update the meta-analysis, to include recent trials, and to analyse the benefit of concomitant plus adjuvant chemotherapy separately. METHODS: We searched PubMed, Web of Science, the Cochrane Controlled Trials meta-register, ClinicalTrials.gov, and meeting proceedings to identify published or unpublished randomised trials assessing radiotherapy with or without chemotherapy in patients with non-metastatic nasopharyngeal carcinoma, and obtained updated data for previously analysed studies. The primary endpoint of interest was overall survival. All trial results were combined and analysed using a fixed-effects model. The statistical analysis plan was pre-specified in a protocol, and all data were analysed on an intention-to-treat basis. FINDINGS: We analysed data from 19 trials and 4806 patients. Median follow-up was 7·7 years (IQR 6·2-11·9). We found that the addition of chemotherapy to radiotherapy significantly improved overall survival (hazard ratio [HR] 0·79, 95% CI 0·73-0·86, p<0·0001; absolute benefit at 5 years 6·3%, 95% CI 3·5-9·1). The interaction between the treatment effect (benefit of chemotherapy) on overall survival and the timing of chemotherapy was significant (p=0·01), in favour of concomitant plus adjuvant chemotherapy (HR 0·65, 0·56-0·76) and concomitant without adjuvant chemotherapy (0·80, 0·70-0·93), but not adjuvant chemotherapy alone (0·87, 0·68-1·12) or induction chemotherapy alone (0·96, 0·80-1·16). The benefit of the addition of chemotherapy was consistent for all endpoints analysed (all p<0·0001): progression-free survival (HR 0·75, 95% CI 0·69-0·81), locoregional control (0·73, 0·64-0·83), distant control (0·67, 0·59-0·75), and cancer mortality (0·76, 0·69-0·84). INTERPRETATION: Our results confirm that the addition of concomitant chemotherapy to radiotherapy significantly improves survival in patients with locoregionally advanced nasopharyngeal carcinoma. To our knowledge, this is the first analysis to examine the effect of concomitant chemotherapy with and without adjuvant chemotherapy as distinct groups. Further studies on the specific benefits of adjuvant chemotherapy after concomitant chemoradiotherapy are needed. FUNDING: French Ministry of Health (Programme d'actions intégrées de recherche VADS), Ligue Nationale Contre le Cancer, and Sanofi-Aventis.
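The abstract states that trial results were combined with a fixed-effects model. As a hedged illustration of that general technique (not the MAC-NPC code or data), the sketch below pools hazard ratios by inverse-variance weighting on the log scale; the trial values are invented:

```python
# Minimal sketch of fixed-effects (inverse-variance) pooling of hazard
# ratios, the kind of model the abstract describes. The trial values
# below are illustrative, not the MAC-NPC data.
import math

def pool_fixed_effects(hrs, cis):
    """Pool hazard ratios given their 95% CIs (inverse-variance on log HR)."""
    wsum, est = 0.0, 0.0
    for hr, (lo, hi) in zip(hrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log HR
        w = 1.0 / se**2
        wsum += w
        est += w * math.log(hr)
    log_hr = est / wsum
    se_pooled = math.sqrt(1.0 / wsum)
    return (math.exp(log_hr),
            math.exp(log_hr - 1.96 * se_pooled),
            math.exp(log_hr + 1.96 * se_pooled))

hr, lo, hi = pool_fixed_effects([0.72, 0.85, 0.78],
                                [(0.58, 0.90), (0.70, 1.03), (0.61, 1.00)])
print(f"pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```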

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: The best time for administering anticoagulation therapy in acute cardioembolic stroke remains unclear. This prospective cohort study of patients with acute stroke and atrial fibrillation evaluated (1) the risk of recurrent ischemic events and severe bleeding; (2) the risk factors for recurrence and bleeding; and (3) the risks of recurrence and bleeding associated with anticoagulant therapy and its starting time after the acute stroke. METHODS: The primary outcome of this multicenter study was the composite of stroke, transient ischemic attack, symptomatic systemic embolism, symptomatic cerebral bleeding, and major extracranial bleeding within 90 days from the acute stroke. RESULTS: Of the 1029 patients enrolled, 123 had 128 events (12.6%): 77 (7.6%) ischemic stroke, transient ischemic attack, or systemic embolism; 37 (3.6%) symptomatic cerebral bleeding; and 14 (1.4%) major extracranial bleeding. At 90 days, 50% of the patients were either deceased or disabled (modified Rankin score ≥3), and 10.9% were deceased. High CHA2DS2-VASc score, high National Institutes of Health Stroke Scale score, large ischemic lesion, and type of anticoagulant were predictive factors for the primary study outcome. In adjusted Cox regression analysis, initiating anticoagulants 4 to 14 days from stroke onset was associated with a significant reduction in the primary study outcome compared with initiating treatment before 4 or after 14 days: hazard ratio 0.53 (95% confidence interval 0.30-0.93). About 7% of the patients treated with oral anticoagulants alone had an outcome event, compared with 16.8% and 12.3% of the patients treated with low molecular weight heparins alone or followed by oral anticoagulants, respectively (P=0.003). CONCLUSIONS: Acute stroke in patients with atrial fibrillation is associated with high rates of ischemic recurrence and major bleeding at 90 days. This study found that high CHA2DS2-VASc score, high National Institutes of Health Stroke Scale score, large ischemic lesions, and the type of anticoagulant administered were each independently associated with a greater risk of recurrence and bleeding. The data also indicate that the best time for initiating anticoagulation treatment for secondary stroke prevention is 4 to 14 days from stroke onset, and that patients treated with oral anticoagulants alone had better outcomes than patients treated with low molecular weight heparins alone or before oral anticoagulants.
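As an illustration of the adjusted Cox regression described above, the sketch below fits a proportional-hazards model to synthetic 90-day follow-up data with a 4-to-14-day treatment-window indicator. It uses the lifelines package; all variable names and data are hypothetical, not the study's:

```python
# Minimal sketch (synthetic data) of the kind of Cox model the abstract
# describes: hazard of the composite outcome as a function of whether
# anticoagulation started 4-14 days after stroke onset. Requires lifelines.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "start_4_14d": rng.integers(0, 2, n),    # 1 = started day 4-14
    "nihss": rng.integers(2, 25, n),         # stroke severity
    "chads_vasc": rng.integers(1, 9, n),
})
# Synthetic event times: the 4-14-day window lowers the hazard.
hazard = 0.02 * np.exp(-0.6 * df.start_4_14d + 0.05 * df.nihss)
df["time"] = np.minimum(rng.exponential(1.0 / hazard), 90.0)  # 90-day follow-up
df["event"] = (df["time"] < 90.0).astype(int)                 # 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # hazard ratios with 95% CIs, as reported in the study
```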

Relevance:

30.00%

Publisher:

Abstract:

The Cherenkov light flashes produced by extensive air showers are very short in time. A high-bandwidth, fast-digitizing readout can therefore minimize the influence of background from the night-sky light and improve the performance of Cherenkov telescopes. The time structure of the Cherenkov image can further be used in single-dish Cherenkov telescopes as an additional parameter to reduce the background from unwanted hadronic showers. A description of an analysis method that makes use of the timing information, and of the subsequent improvement in the performance of the MAGIC telescope (especially after the upgrade with an ultra-fast 2 GSamples/s digitization system in February 2007), is presented. The use of timing information in the analysis of the new MAGIC data reduces the background by a factor of two, which in turn improves the flux sensitivity to point-like sources by a factor of about 1.4, as tested on observations of the Crab Nebula.
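A minimal sketch of how per-pixel timing can reject hadronic background, in the spirit of the method described above. This is an invented illustration, not the MAGIC analysis code, and all thresholds are arbitrary:

```python
# Illustrative sketch (not the MAGIC analysis): gamma-ray images are
# compact in time, so a cut on the time spread of the signal pixels
# rejects events whose pixels fire incoherently (hadrons, night-sky noise).
import numpy as np

def passes_time_spread_cut(arrival_times_ns, charges_phe,
                           charge_threshold=6.0, max_rms_ns=1.5):
    """Keep an event if the arrival times of its signal pixels are
    tightly clustered. Threshold values are arbitrary illustrations."""
    signal = charges_phe > charge_threshold
    if signal.sum() < 3:
        return False
    return np.std(arrival_times_ns[signal]) < max_rms_ns

rng = np.random.default_rng(1)
t_gamma = rng.normal(50.0, 0.5, 20)    # compact flash
t_hadron = rng.normal(50.0, 5.0, 20)   # spread out in time
q = rng.uniform(5.0, 30.0, 20)
print(passes_time_spread_cut(t_gamma, q), passes_time_spread_cut(t_hadron, q))
```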

Relevance:

30.00%

Publisher:

Abstract:

The Fed model is a widely used market valuation model. It is often used in market analysis of the S&P 500 index as a shorthand measure of the attractiveness of equity and as a timing device for allocating funds between equity and bonds. The Fed model assumes a fixed relationship between the bond yield and the earnings yield, and this relationship is often taken as given in market valuation. In this paper we test the Fed model from a historical perspective on the European markets; the markets of the United States are also included for comparison. The purpose of the tests is to determine whether the Fed model and its underlying assumptions hold on different markets. The tests are made on time-series data ranging from 1973 to the end of 2008. The statistical methods used are regression analysis, cointegration analysis, and Granger causality tests. The empirical results do not give strong support for the Fed model. The relationships assumed by the Fed model are not statistically valid in most of the markets examined, and therefore the model is generally not valid for valuation purposes. The results vary between markets, which gives reason to question the general use of the Fed model in different market conditions and in different markets.
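The cointegration and Granger-causality tests mentioned above can be reproduced in outline with statsmodels; the sketch below runs both on synthetic yield series (not the paper's 1973-2008 data):

```python
# Minimal sketch (synthetic series, not the paper's dataset) of the Fed
# model tests the abstract mentions: is the earnings yield cointegrated
# with the bond yield, and does one Granger-cause the other?
import numpy as np
from statsmodels.tsa.stattools import coint, grangercausalitytests

rng = np.random.default_rng(2)
n = 432                                                   # monthly, 1973-2008
bond_yield = 6.0 + np.cumsum(rng.normal(0, 0.1, n))       # random walk
earnings_yield = bond_yield + rng.normal(0, 0.5, n)       # "Fed model world"

# Engle-Granger cointegration test: a small p-value supports a stable
# long-run relationship between the two yields.
t_stat, p_value, _ = coint(earnings_yield, bond_yield)
print(f"cointegration p-value: {p_value:.3f}")

# Granger causality: does the bond yield help predict the earnings yield?
res = grangercausalitytests(np.column_stack([earnings_yield, bond_yield]),
                            maxlag=4)
print("lag-1 F-test p-value:", res[1][0]["ssr_ftest"][1])
```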

Relevance:

30.00%

Publisher:

Abstract:

This study focuses on how Finnish companies execute their new product launch processes. The main objective was to find out how entry timing moderates the relationship between launch tactics (namely product innovativeness, price, and emotional advertising) and new product performance (namely sales volume and customer profitability). The empirical analysis was based on data collected at Lappeenranta University of Technology. The sample consisted of Finnish companies representing different industries and innovation activities. Altogether 272 usable responses were received, representing a response rate of 37.67%. The measures were first assessed using exploratory factor analysis (EFA) in PASW Statistics 18 and then verified with confirmatory factor analysis (CFA) in LISREL 8.80. To test the hypotheses on the moderating effects of entry timing, hierarchical regression analysis was used in PASW Statistics 18. The results revealed that the effect of product innovativeness on new product sales volume depends on entry timing, which implies that companies should carefully consider the best time to enter the market when launching highly innovative new products. The results also show a positive relationship between emotional advertising and new product sales volume. In addition, partial support was found for a positive relationship between pricing and new product customer profitability.
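Moderation in a hierarchical regression is typically tested by adding an interaction term in a second step. A minimal sketch with simulated data follows (the abstract's analysis was run in PASW Statistics 18, not Python; all variable names are invented):

```python
# Minimal sketch (simulated data) of a moderation test: entry timing
# moderates the effect of product innovativeness on sales volume,
# assessed by adding an interaction term in a second regression step.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 272
df = pd.DataFrame({
    "innovativeness": rng.normal(0, 1, n),
    "entry_timing": rng.normal(0, 1, n),   # e.g. earliness of market entry
})
df["sales_volume"] = (0.2 * df.innovativeness + 0.1 * df.entry_timing
                      + 0.3 * df.innovativeness * df.entry_timing
                      + rng.normal(0, 1, n))

step1 = smf.ols("sales_volume ~ innovativeness + entry_timing", df).fit()
step2 = smf.ols("sales_volume ~ innovativeness * entry_timing", df).fit()
# A significant interaction coefficient and an R-squared gain in step 2
# indicate a moderating effect of entry timing.
print(step2.params["innovativeness:entry_timing"],
      step2.rsquared - step1.rsquared)
```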

Relevance:

30.00%

Publisher:

Abstract:

During endodontic therapy (pulpectomy, root canal debridement, and root canal filling), microbiological control is a major concern. Bacteria present in dentine tubules, apical foramina, and the apical delta are causally related to failure of the procedure. Studies have shown that after single-session endodontic treatment bacteria remain within the dental structures. The aim of the present study was to evaluate endodontic treatment performed in two sessions, using temporary endodontic dressing materials for different periods in four groups of experimental dogs. A total of 80 roots of second and third upper premolar teeth and second, third, and fourth lower premolar teeth were divided into four groups. The pulp chamber was opened with burs and the pulp exposed for 60 days to induce pulpal inflammation and necrosis. Groups II, III, and IV were treated with calcium hydroxide plus camphorated paramonochlorophenol (PMCC) for 7, 15, and 30 days, respectively. In all groups, the root canals were filled with zinc oxide-eugenol and gutta-percha cones. Clinical and radiographic measurements were performed every 2 weeks. After 60 days, a small block section containing the teeth, surrounding periapical tissues, and the periodontium was removed for histological and microbiological study. Histological analysis revealed an intense inflammatory response in all groups. Microbiological analysis showed a microbial reduction inversely proportional to the period of time that the intracanal temporary medicament was left in place.

Relevance:

30.00%

Publisher:

Abstract:

The initial timing of face-specific effects in event-related potentials (ERPs) is a point of contention in face processing research. Although effects during the time of the N170 are robust in the literature, inconsistent effects during the time of the P100 challenge the interpretation of the N170 as the initial face-specific ERP effect. The early P100 effects are often attributed to low-level differences between face stimuli and a host of other image categories. Research using sophisticated controls for low-level stimulus characteristics (Rousselet, Husk, Bennett, & Sekuler, 2008) reports robust face effects starting at around 130 ms following stimulus onset. The present study examines the independent components (ICs) of the P100 and N170 complex in the context of a minimally controlled low-level stimulus set and a clear P100 effect for faces versus houses at the scalp. Results indicate that four ICs account for the ERPs to faces and houses in the first 200 ms following stimulus onset. The IC that accounts for the majority of the scalp N170 (icN1a) begins dissociating stimulus conditions at approximately 130 ms, closely replicating the scalp results of Rousselet et al. (2008). The scalp effects at the time of the P100 are accounted for by two constituent ICs (icP1a and icP1b). The IC that projects the greatest voltage at the scalp during the P100 (icP1a) shows a face-minus-house effect over the period of the P100 that is less robust than the N170 effect of icN1a, when measured as the average of single-subject differential activation robustness. The second constituent process of the P100 (icP1b), although projecting a smaller voltage to the scalp than icP1a, shows a more robust effect for the face-minus-house contrast starting prior to 100 ms following stimulus onset. Further, the effect expressed by icP1b takes the form of a larger negative projection to medial occipital sites for houses over faces, partially canceling the larger projection of icP1a and thereby enhancing the face positivity at this time. These findings have three main implications for ERP research on face processing. First, the ICs that constitute the face-minus-house P100 effect are independent from the ICs that constitute the N170 effect, suggesting that the P100 effect and the N170 effect are anatomically independent. Second, the timing of the N170 effect can be recovered from scalp ERPs that have spatio-temporally overlapping effects possibly associated with low-level stimulus characteristics. This unmixing of the EEG signals may reduce the need for highly constrained stimulus sets, a characteristic that is not always desirable for a topic that is highly coupled to ecological validity. Third, by unmixing the constituent processes of the EEG signals, new analysis strategies become available. In particular, exploring the relationship between cortical processes over the period of the P100 and N170 ERP complex (and beyond) may provide previously inaccessible answers to questions such as: is the face effect a special relationship between low-level and high-level processes along the visual stream?
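The unmixing idea described above can be illustrated with a generic ICA on synthetic signals: two temporally independent components are mixed through a simple forward model and then recovered. This uses scikit-learn's FastICA and invented waveforms, not the study's EEG pipeline:

```python
# Illustrative sketch (synthetic signals, not the study's EEG) of the
# unmixing idea in the abstract: ICA recovers temporally independent
# constituent processes whose projections overlap at the scalp.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
t = np.linspace(0, 0.2, 200)                       # 0-200 ms epoch
s1 = np.exp(-((t - 0.10) / 0.01) ** 2)             # P100-like component
s2 = -np.exp(-((t - 0.17) / 0.015) ** 2)           # N170-like component
sources = np.c_[s1, s2] + 0.02 * rng.normal(size=(200, 2))

mixing = np.array([[1.0, 0.6],                     # forward model: both
                   [0.5, 1.0]])                    # sources reach both channels
scalp = sources @ mixing.T                         # mixed "scalp" channels

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(scalp)               # unmixed component activations
print(recovered.shape)  # (200, 2): one time course per independent component
```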

Relevance:

30.00%

Publisher:

Abstract:

Consumption of low-fat milk (LFM) after resistance training has been shown to have positive influences on body composition and training adaptations; however, little research has examined the effects of LFM consumption following endurance training. The purpose of this study was to examine the effects of additional servings of LFM following endurance exercise on body composition, bone health, and training adaptations. Forty healthy males were recruited and randomized into four groups: DEI (750 mL LFM immediately post-exercise), DEA (750 mL LFM 4 h prior to or 6 h post-exercise), CEI (750 mL carbohydrate beverage immediately post-exercise), and CEA (750 mL carbohydrate beverage 4 h prior to or 6 h post-exercise). Participants took part in a 12-week endurance training intervention (1 h/day, 3 d/wk, ~60% max HR). Twenty-two participants completed the study. Analysis showed significant increases in lean mass, spinal bone mineral content, and relative VO2peak, and a decrease in TRAP 5β, across all groups (p < 0.05).

Relevance:

30.00%

Publisher:

Abstract:

Affiliation: Dany Gagnon & Sylvie Nadeau: École de réadaptation, Faculté de médecine, Université de Montréal & Centre de recherche interdisciplinaire en réadaptation, Institut de réadaptation de Montréal

Relevance:

30.00%

Publisher:

Abstract:

Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in products and services creates a need for methods to develop them efficiently. But efficient design of these systems is limited by several factors, among them the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time to market. Transaction-level modelling (TLM) is considered a promising paradigm for managing design complexity and for providing means to explore and validate design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose using a combination of two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between different transactions on the other. This synergy lets us combine, in a single environment, efficient simulation methods and formal analytical methods. We propose a new timing-verification algorithm based on a procedure for linearizing min/max constraints, together with an optimization technique that improves the algorithm's efficiency. We complete the mathematical description of all the constraint types presented in the literature. We developed methods for exploring and refining the communication system that allowed us to apply the timing-verification algorithms at different TLM levels. Since several definitions of TLM exist, within this research we defined a specification and simulation methodology for hardware/software systems based on the TLM paradigm, in which several modelling concepts can be considered separately. Based on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming, and several others provided by the .Net environment, the proposed methodology presents an approach that makes it possible to reuse intermediate models in order to cope with the time-to-market constraint. It provides a general approach to system modelling that separates design aspects such as the models of computation used to describe the system at multiple levels of abstraction. Consequently, in the system model we can clearly identify the system's functionality without the details tied to particular development platforms, which improves the portability of the application model.
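The min/max linearization procedure itself is not given in the abstract, so the sketch below shows only the standard linear special case such constraints reduce to: difference timing constraints of the form t_j - t_i <= c, checked for consistency by Bellman-Ford relaxation over a constraint graph. This is a textbook technique, not necessarily the thesis's algorithm:

```python
# Hedged sketch: feasibility of difference timing constraints
# t_j - t_i <= c via Bellman-Ford over a constraint graph (an implicit
# source node reaches every event with weight 0).
def feasible_schedule(n_events, constraints):
    """constraints: list of (i, j, c) meaning t_j - t_i <= c.
    Returns event times satisfying all constraints, or None if they
    are contradictory (negative cycle in the constraint graph)."""
    dist = [0.0] * n_events
    for _ in range(n_events):
        changed = False
        for i, j, c in constraints:    # relax edge i -> j with weight c
            if dist[i] + c < dist[j]:
                dist[j] = dist[i] + c
                changed = True
        if not changed:
            return dist
    return None                        # still relaxing: inconsistent constraints

# Example: t1 - t0 <= 5, t2 - t1 <= 3, t0 - t2 <= -9 is infeasible
# (the chain forces t2 - t0 <= 8 while the last constraint needs >= 9).
print(feasible_schedule(3, [(0, 1, 5), (1, 2, 3), (2, 0, -9)]))
```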

Relevance:

30.00%

Publisher:

Abstract:

Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios against biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and used a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies, including the use of sequential time-restricted harvest blocks (created for woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis but reduced the generality of interpretations. We found that forest management options that create linear strips of old forest deviate the most from simulated natural patterns and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but it might be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.

Relevance:

30.00%

Publisher:

Abstract:

Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of bloom events in lakes and rivers. A new deterministic mathematical model was developed that simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of bloom formation, vertical migration, and lateral transport of colonies within river environments, taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients, and temperature. A technique called generalised sensitivity analysis was applied to the model to identify the critical parameter uncertainties and to investigate the interactions between the chosen parameters. The results of the analysis suggested that 8 of the 12 parameters were significant in reproducing the observed cyanobacterial behaviour in a simulation, and that there was a high degree of correlation between the half-saturation rate constants used in the model.
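Generalised (regionalised) sensitivity analysis in the Hornberger-Spear style splits Monte Carlo runs into behavioural and non-behavioural sets and compares each parameter's distributions between the two. A hedged sketch on a stand-in toy model (not the paper's cyanobacteria model; all parameter names and ranges are invented):

```python
# Sketch of generalised sensitivity analysis: sample parameters, run the
# model, classify runs as behavioural / non-behavioural, then use a
# two-sample KS test to see which parameter distributions differ.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(5)

def toy_bloom_model(mu_max, k_half, temp_coeff):
    """Stand-in for the cyanobacteria model: a peak-biomass proxy."""
    return mu_max / (k_half + 0.5) * np.exp(0.1 * temp_coeff)

n = 2000
samples = {
    "mu_max": rng.uniform(0.1, 2.0, n),
    "k_half": rng.uniform(0.01, 1.0, n),
    "temp_coeff": rng.uniform(-1.0, 1.0, n),
}
output = toy_bloom_model(samples["mu_max"], samples["k_half"],
                         samples["temp_coeff"])
behavioural = output > 1.5          # e.g. "a bloom occurred"

for name, values in samples.items():
    stat, p = ks_2samp(values[behavioural], values[~behavioural])
    print(f"{name}: KS={stat:.2f}, p={p:.3g}")   # small p: sensitive parameter
```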

Relevance:

30.00%

Publisher:

Abstract:

Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within the data, alleviating the need for arbitrary band-pass filter cut-off selection. It is extended here to address the choice of reference electrode and the removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with the use of band-pass filtering following the same pre-processing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to assessing multiple EEG trials with the method is introduced, and the assessment of the statistical significance of phase-locking episodes is extended to make it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer-range phase-locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking that occurs just prior to movement, coinciding with a reduction in power (ERD), may result from selection of the neural assembly relevant to the particular movement.
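The core EMDPL measurement, the instantaneous phase of an intrinsic mode followed by a phase-locking value, can be sketched as below. This is a generic illustration using the PyEMD package and synthetic signals; it omits the paper's spline Laplacian, ICA, and significance-testing steps:

```python
# Sketch of the phase-locking measurement underlying EMDPL: extract an
# intrinsic oscillation via empirical mode decomposition, take its
# Hilbert phase, and compute the phase-locking value (PLV) between two
# channels. Generic illustration, not the paper's full EMDPL pipeline.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 256
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(6)
ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)
ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.normal(size=t.size)

def dominant_imf_phase(x):
    """Hilbert phase of the intrinsic mode with the most power."""
    imfs = EMD().emd(x)
    imf = imfs[np.argmax((imfs ** 2).sum(axis=1))]
    return np.angle(hilbert(imf))

# PLV: length of the mean unit vector of the phase difference (0..1).
dphi = dominant_imf_phase(ch1) - dominant_imf_phase(ch2)
plv = np.abs(np.mean(np.exp(1j * dphi)))
print(f"PLV = {plv:.2f}")
```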

Relevance:

30.00%

Publisher:

Abstract:

Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling cyanobacterial behaviour in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of bloom events in lakes, reservoirs, and rivers. A new deterministic mathematical model was developed that simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of bloom formation, vertical migration, and lateral transport of colonies within river environments, taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients, and temperature. A parameter sensitivity analysis using a one-at-a-time approach was carried out, with two objectives: to identify the key parameters controlling the growth and movement patterns of cyanobacteria, and to provide a means for model validation. The results of the analysis suggested that the maximum growth rate and the day-length period were the most significant parameters in determining population growth and colony depth, respectively.
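A one-at-a-time analysis perturbs each parameter around a baseline while holding the others fixed. A minimal sketch on an invented stand-in model (parameter names chosen to echo the abstract, not taken from the paper):

```python
# Sketch of the one-at-a-time (OAT) sensitivity approach the abstract
# describes: perturb each parameter individually around a baseline and
# record the relative change in the model output.
def toy_bloom_model(params):
    """Stand-in output: a peak-biomass proxy (not the paper's model)."""
    return (params["mu_max"] / (params["k_half"] + 0.5)
            * (1.0 + 0.1 * params["day_length"]))

baseline = {"mu_max": 1.0, "k_half": 0.3, "day_length": 14.0}
y0 = toy_bloom_model(baseline)

for name in baseline:
    perturbed = dict(baseline)
    perturbed[name] = baseline[name] * 1.10     # +10%, one at a time
    sensitivity = (toy_bloom_model(perturbed) - y0) / y0
    print(f"{name}: {sensitivity:+.1%} output change for +10% input")
```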