970 results for Timing analysis


Relevance:

30.00%

Publisher:

Abstract:

Geographical and temporal variations in the start dates of grass pollen seasons are described for selected sites of the European Pollen Information Service. Daily average grass pollen counts are derived from network sites in Finland, the Netherlands, Denmark, the United Kingdom, Austria, Italy and Spain, giving a broad longitudinal transect over Western Europe. The study is part of a larger project that also examines annual and regional variations in the severity, timing of the peak and duration of the grass pollen seasons. For several sites, data are available for over twenty years, enabling long-term trends to be discerned. The analyses show notable contrasts in the progression of the seasons annually, with differing lag times occurring between southern and northern sites in various years depending on the weather conditions. The patterns identified provide some insight into geographical differences and temporal trends in the incidence of pollinosis. The paper discusses the main difficulties involved in this type of analysis and notes possibilities for using data from the European Pollen Information Service to construct pan-European predictive models for pollen seasons.
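
The abstract does not state how a season start date is defined. One common aerobiological convention, shown here purely as a hedged illustration, takes the start as the day on which the cumulative daily count first reaches a fixed percentage of the annual total; the threshold and function below are assumptions, not taken from the paper.

```python
# Hypothetical sketch: define the pollen season start as the date when the
# cumulative daily count first reaches `threshold` of the annual total.
# This is one common convention, not necessarily the study's definition.
from datetime import date, timedelta

def season_start(daily_counts, year_start, threshold=0.025):
    """Return the date when cumulative pollen reaches `threshold` of the annual sum."""
    total = sum(daily_counts)
    cumulative = 0.0
    for day_offset, count in enumerate(daily_counts):
        cumulative += count
        if cumulative >= threshold * total:
            return year_start + timedelta(days=day_offset)
    return None  # no season detected (e.g., an all-zero record)

# Example: a toy series with counts concentrated in early summer.
counts = [0] * 150 + [5, 12, 30, 55, 80, 60, 20] + [0] * 208
print(season_start(counts, date(2000, 1, 1)))
```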

Relevance:

30.00%

Publisher:

Abstract:

Indices of post-awakening cortisol secretion (PACS) include the rise in cortisol (cortisol awakening response: CAR) and overall cortisol concentrations (e.g. area under the curve with reference to ground: AUCg) in the first 30-45 min. Both are commonly investigated in relation to psychosocial variables. Although sampling within the domestic setting is ecologically valid, participant non-adherence to the required timing protocol results in erroneous measurement of PACS, and this may explain discrepancies in the literature linking these measures to trait well-being (TWB). We have previously shown delays of little over 5 min (between awakening and the start of sampling) to result in erroneous CAR estimates. In this study, we report for the first time on the negative impact of sample timing inaccuracy (verified by electronic monitoring) on the efficacy to detect significant relationships between PACS and TWB when measured in the domestic setting. Healthy females (N = 49, 20.5 ± 2.8 years) selected for differences in TWB collected saliva samples (S1-4) on 4 days at 0, 15, 30 and 45 min post-awakening to determine PACS. Adherence to the sampling protocol was objectively monitored using a combination of electronic estimates of awakening (actigraphy) and sampling times (track caps). Relationships between PACS and TWB were found to depend on sample timing accuracy. Lower TWB was associated with higher post-awakening cortisol AUCg in proportion to the mean sample timing accuracy (p < .005). There was no association between TWB and the CAR, even taking into account sample timing accuracy. These results highlight the importance of careful electronic monitoring of participant adherence for measurement of PACS in the domestic setting. Mean sample timing inaccuracy, mainly associated with delays of >5 min between awakening and collection of sample 1 (median = 8 min delay), negatively impacts the sensitivity of analysis to detect associations between PACS and TWB.
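
The two indices named above follow from standard definitions: AUCg is the trapezoidal area under the cortisol curve with reference to ground, and the CAR is the rise above the awakening sample. A minimal sketch, assuming the study's 0, 15, 30 and 45 min sampling grid and illustrative concentrations:

```python
# Minimal sketch of the two PACS indices, assuming samples at 0, 15, 30 and
# 45 min post-awakening; the concentrations are illustrative, not study data.
import numpy as np

times = np.array([0.0, 15.0, 30.0, 45.0])       # minutes post-awakening
cortisol = np.array([12.0, 18.5, 21.0, 17.5])   # nmol/L (example values)

# AUCg: trapezoidal area under the curve with reference to ground (zero).
auc_g = float(np.sum((cortisol[1:] + cortisol[:-1]) / 2 * np.diff(times)))

# CAR: rise in cortisol above the awakening sample (peak minus baseline).
car = float(cortisol[1:].max() - cortisol[0])

print(f"AUCg = {auc_g:.1f} nmol/L*min, CAR = {car:.1f} nmol/L")
```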

Relevance:

30.00%

Publisher:

Abstract:

The use of multicores is becoming widespread in the field of embedded systems, many of which have real-time requirements. Hence, ensuring that real-time applications meet their timing constraints is a prerequisite before deploying them on these systems. This necessitates the consideration of the impact of the contention due to shared low-level hardware resources, like the front-side bus (FSB), on the Worst-Case Execution Time (WCET) of the tasks. Towards this aim, this paper proposes a method to determine an upper bound on the number of bus requests that tasks executing on a core can generate in a given time interval. We show that our method yields tighter upper bounds in comparison with the state-of-the-art. We then apply our method to compute the extra contention delay incurred by tasks when they are co-scheduled on different cores and access the shared main memory using a shared bus, access to which is granted using a round-robin arbitration (RR) protocol.
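
The abstract does not spell out the delay computation itself. The sketch below shows one standard, purely illustrative over-approximation for round-robin arbitration: each request of the task under analysis waits for at most one request per competing core, and no competing core can interfere more often than its own request bound allows. All names and numbers are assumptions, not the paper's method.

```python
# Hypothetical sketch of a contention-delay bound under round-robin (RR)
# bus arbitration. Assumptions (not from the paper): each bus access takes
# a fixed `latency`; other_core_requests[i] upper-bounds the requests core i
# can issue in the analysed interval.
def rr_contention_delay(own_requests, other_core_requests, latency):
    # Under RR, each of the task's requests waits for at most one request
    # from each competing core...
    per_request_cap = len(other_core_requests) * own_requests
    # ...and each competing core interferes at most min(its bound, own_requests) times.
    total_interference = sum(min(r, own_requests) for r in other_core_requests)
    return min(per_request_cap, total_interference) * latency

# Example: the task issues 100 requests; three competing cores issue at most
# 40, 250 and 10 requests in the same interval; each access takes 20 cycles.
print(rr_contention_delay(100, [40, 250, 10], 20))  # (40+100+10)*20 = 3000 cycles
```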

Relevance:

30.00%

Publisher:

Abstract:

PRINCIPLES: Advance directives are seen as an important tool for documenting the wishes of patients who are no longer competent to make decisions regarding their medical care. By their nature, advance directives can be a difficult subject to broach, for both the medical care provider and the patient. This paper focuses on general practitioners' perspectives on when this discussion should take place, and on the advantages and disadvantages of the different moments. METHODS: In 2013, 23 semi-structured face-to-face interviews were performed with Swiss general practitioners. Interviews were analysed using qualitative content analysis. RESULTS: The 23 general practitioners in our sample identified different moments they felt were appropriate: either (a) when the patient is still healthy, (b) when illness becomes predominant, or (c) when a patient has been transferred to a long-term care facility. Furthermore, general practitioners reported uncertainty and discomfort about initiating the discussion. CONCLUSION: The distinct approaches, perspectives and rationales show that there is no well-defined or "right" moment. However, participants often associated advance directives with death. This link caused discomfort and uncertainty, which led to hesitation and delay on the part of general practitioners. We therefore recommend further training on how to professionally initiate a conversation about advance directives. Furthermore, based on our results and experience, we recommend an early approach with healthy patients, paired with regular later updates, as the most effective way to inform patients about their end-of-life care options.

Relevance:

30.00%

Publisher:

Abstract:

QUESTIONS UNDER STUDY: The diagnostic significance of clinical symptoms/signs of influenza has mainly been assessed in the context of controlled studies with stringent inclusion criteria. There was a need to extend the evaluation of these predictors not only to the context of general practice but also according to the duration of symptoms and the dynamics of the epidemic. PRINCIPLES: A prospective study conducted in the Medical Outpatient Clinic in the winter season 1999-2000. Patients with influenza-like syndrome were included whenever the primary care physician envisaged the diagnosis of influenza. The physician administered a questionnaire, and a throat swab was taken and cultured to document the diagnosis of influenza. RESULTS: 201 patients were included in the study; 52% were culture-positive for influenza. By univariate analysis, temperature >37.8 °C (OR 4.2; 95% CI 2.3-7.7), duration of symptoms <48 hours (OR 3.2; 1.8-5.7), cough (OR 3.2; 1.0-10.4) and myalgia (OR 2.8; 1.0-7.5) were associated with a diagnosis of influenza. In a multivariable logistic analysis, the best model predicting influenza combined a duration of symptoms <48 hours, medical attendance at the beginning of the epidemic (weeks 49-50), fever >37.8 °C and cough, with a sensitivity of 79%, specificity of 69%, positive predictive value of 67%, negative predictive value of 73% and an area under the ROC curve of 0.74. CONCLUSIONS: Besides relevant symptoms and signs, the physician should also consider the duration of symptoms and the epidemiological context (start, peak or end of the epidemic), since both parameters considerably modify the value of the clinical predictors when assessing the probability of a patient having influenza.
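
The reported operating characteristics follow from the standard 2x2-table definitions. A hedged illustration with toy counts (not the study data), chosen so that sensitivity and specificity land near the reported 79% and 69%:

```python
# Illustrative sketch: sensitivity, specificity, PPV and NPV from a 2x2
# table of a clinical predictor versus culture-confirmed influenza.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # predictor positive among true cases
        "specificity": tn / (tn + fp),  # predictor negative among non-cases
        "ppv": tp / (tp + fp),          # true cases among predictor positives
        "npv": tn / (tn + fn),          # non-cases among predictor negatives
    }

# Toy counts for 201 patients, ~52% culture-positive, giving
# sensitivity ~0.79 and specificity ~0.69 (not the study's actual table).
print(diagnostic_metrics(tp=83, fp=30, fn=22, tn=66))
```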

Relevance:

30.00%

Publisher:

Abstract:

The initial timing of face-specific effects in event-related potentials (ERPs) is a point of contention in face processing research. Although effects during the time of the N170 are robust in the literature, inconsistent effects during the time of the P100 challenge the interpretation of the N170 as the initial face-specific ERP effect. Early P100 effects are often attributed to low-level differences between face stimuli and a host of other image categories. Research using sophisticated controls for low-level stimulus characteristics (Rousselet, Husk, Bennett, & Sekuler, 2008) reports robust face effects starting at around 130 ms following stimulus onset. The present study examines the independent components (ICs) of the P100 and N170 complex in the context of a minimally controlled low-level stimulus set and a clear P100 effect for faces versus houses at the scalp. Results indicate that four ICs account for the ERPs to faces and houses in the first 200 ms following stimulus onset. The IC that accounts for the majority of the scalp N170 (icN1a) begins dissociating stimulus conditions at approximately 130 ms, closely replicating the scalp results of Rousselet et al. (2008). The scalp effects at the time of the P100 are accounted for by two constituent ICs (icP1a and icP1b). The IC that projects the greatest voltage at the scalp during the P100 (icP1a) shows a face-minus-house effect over the period of the P100 that is less robust than the N170 effect of icN1a, when measured as the average of single-subject differential activation robustness. The second constituent process of the P100 (icP1b), although projecting a smaller voltage to the scalp than icP1a, shows a more robust effect for the face-minus-house contrast starting prior to 100 ms following stimulus onset. Further, the effect expressed by icP1b takes the form of a larger negative projection to medial occipital sites for houses over faces, partially canceling the larger projection of icP1a and thereby enhancing the face positivity at this time. These findings have three main implications for ERP research on face processing. First, the ICs that constitute the face-minus-house P100 effect are independent from the ICs that constitute the N170 effect, suggesting that the P100 and N170 effects are anatomically independent. Second, the timing of the N170 effect can be recovered from scalp ERPs that have spatio-temporally overlapping effects possibly associated with low-level stimulus characteristics. This unmixing of the EEG signals may reduce the need for highly constrained stimulus sets, a constraint that is not always desirable for a topic so closely coupled to ecological validity. Third, unmixing the constituent processes of the EEG signals makes new analysis strategies available. In particular, exploring the relationship between cortical processes over the period of the P100 and N170 ERP complex (and beyond) may provide previously inaccessible answers to questions such as: is the face effect a special relationship between low-level and high-level processes along the visual stream?
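
A hedged sketch of the decomposition step only: scalp data are unmixed into ICs whose activations and scalp projections can then be examined separately. The synthetic sources, channel count and use of scikit-learn's FastICA are illustrative assumptions; the study's actual pipeline (montage, component selection, statistics) is not reproduced here.

```python
# Hypothetical sketch: unmix multichannel "ERP" data into independent
# components. Synthetic sources stand in for real EEG.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_times = 300
t = np.linspace(0, 0.6, n_times)  # 600 ms epoch

# Four synthetic "cortical" sources (e.g. P100-like and N170-like bumps).
sources_true = np.column_stack([
    np.exp(-((t - 0.10) / 0.02) ** 2),       # P100-like positivity
    -np.exp(-((t - 0.17) / 0.03) ** 2),      # N170-like negativity
    np.sin(2 * np.pi * 10 * t),              # ongoing alpha-band activity
    rng.standard_normal(n_times) * 0.1,      # noise source
])
mixing_true = rng.standard_normal((32, 4))   # scalp projection maps
erp = sources_true @ mixing_true.T           # (n_times, 32) scalp data

ica = FastICA(n_components=4, random_state=0, max_iter=1000)
ic_activations = ica.fit_transform(erp)      # (n_times, 4) IC time courses
print(ic_activations.shape, ica.mixing_.shape)  # (300, 4) (32, 4)
```

Each recovered IC pairs a time course with a fixed scalp map (a column of `ica.mixing_`), which is what allows separate components to partially cancel at the scalp, as described for icP1a and icP1b.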

Relevance:

30.00%

Publisher:

Abstract:

Consumption of low-fat milk (LFM) after resistance training has been shown to have positive influences on body composition and training adaptations; however, little research has examined the effects of LFM consumption following endurance training. The purpose of the study was to examine the effects of additional servings of LFM following endurance exercise on body composition, bone health, and training adaptations. 40 healthy males were recruited and randomized into 4 groups: DEI (750 mL LFM immediately post-exercise), DEA (750 mL LFM 4 hrs prior to or 6 hrs post-exercise), CEI (750 mL carbohydrate beverage immediately post-exercise), and CEA (750 mL carbohydrate beverage 4 hrs prior to or 6 hrs post-exercise). Participants took part in a 12-week endurance training intervention (1 h/day, 3 d/wk, ~60% max HR). 22 participants completed the study. Analysis showed significant increases in lean mass, spinal bone mineral content and relative VO2peak, and a decrease in TRAP 5β, across all groups (p < 0.05).

Relevance:

30.00%

Publisher:

Abstract:

Affiliation: Dany Gagnon & Sylvie Nadeau: École de réadaptation, Faculté de médecine, Université de Montréal & Centre de recherche interdisciplinaire en réadaptation, Institut de réadaptation de Montréal

Relevance:

30.00%

Publisher:

Abstract:

Hardware/software systems are becoming indispensable in every aspect of daily life. The growing presence of these systems in products and services creates a need for methods to develop them efficiently. Efficient design of such systems is, however, limited by several factors, among them: the growing complexity of applications, increasing integration density, the heterogeneous nature of products and services, and shrinking time-to-market. Transaction-level modelling (TLM) is regarded as a promising paradigm for managing design complexity and for exploring and validating design alternatives at high levels of abstraction. This research proposes a methodology for expressing time in TLM based on an analysis of timing constraints. We propose to combine two development paradigms to accelerate design: TLM on the one hand, and a methodology for expressing time between transactions on the other. This synergy lets us combine efficient simulation methods and formal analytical methods in a single environment. We propose a new timing verification algorithm based on a linearisation procedure for min/max constraints, together with an optimisation technique that improves the algorithm's efficiency. We complete the mathematical description of all the constraint types presented in the literature. We develop exploration and refinement methods for communication systems that allow the timing verification algorithms to be used at different TLM levels. Since several definitions of TLM exist, within this research we define a specification and simulation methodology for hardware/software systems based on the TLM paradigm, in which several modelling concepts can be considered separately. Built on modern software engineering technologies such as XML, XSLT, XSD, object-oriented programming and others provided by the .Net environment, the proposed methodology makes it possible to reuse intermediate models in order to cope with the time-to-market constraint. It provides a general approach to system modelling that separates design aspects such as the models of computation used to describe the system at multiple abstraction levels. As a result, the system model clearly identifies the system's functionality without platform-specific details, which improves the portability of the application model.
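
Purely as a hedged illustration of the min/max constraint idea (the thesis's actual linearisation algorithm is not given in this abstract), event occurrence times under min/max timing constraints can be computed by fixed-point iteration over an event graph. All structure and values below are invented for the example:

```python
# Hypothetical sketch: each event's occurrence time is a max (all
# predecessor transactions must complete) or a min (the first predecessor
# suffices) over predecessor times plus delays; iterate to a fixed point.
def solve_event_times(events, iterations=100):
    """events: {name: ("min"|"max"|"start", [(pred, delay), ...])}"""
    times = {name: 0.0 for name in events}
    for _ in range(iterations):
        updated = dict(times)
        for name, (kind, preds) in events.items():
            if kind == "start":
                continue
            candidates = [times[p] + d for p, d in preds]
            updated[name] = min(candidates) if kind == "min" else max(candidates)
        if updated == times:
            return times  # fixed point reached
        times = updated
    raise RuntimeError("no fixed point: inconsistent constraint cycle?")

# Toy transaction graph: "resp" fires once both requests have completed;
# "first" fires as soon as either one has.
events = {
    "req_a": ("start", []),
    "req_b": ("start", []),
    "resp":  ("max", [("req_a", 5), ("req_b", 8)]),
    "first": ("min", [("req_a", 5), ("req_b", 8)]),
}
print(solve_event_times(events))  # resp at t=8, first at t=5
```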

Relevance:

30.00%

Publisher:

Abstract:

Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios on biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and used a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies. This included use of sequential time-restricted harvest blocks (created for Woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis, but reduced the generality of interpretations. We found that forest management options that create linear strips of old forest deviate the most from simulated natural patterns, and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but this scenario may be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.

Relevance:

30.00%

Publisher:

Abstract:

Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of bloom events in lakes and rivers. A new deterministic mathematical model was developed, which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of bloom formation, vertical migration and lateral transport of colonies within river environments, taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients and temperature. A technique called generalised sensitivity analysis was applied to the model to identify the critical parameter uncertainties and to investigate the interactions between the chosen parameters. The result of the analysis suggested that 8 out of 12 parameters were significant in obtaining the observed cyanobacterial behaviour in a simulation. A high degree of correlation was found between the half-saturation rate constants used in the model.
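
Generalised sensitivity analysis (in the regionalised, Hornberger-Spear style) partitions Monte Carlo parameter samples into runs that do or do not reproduce the observed behaviour, then compares the two parameter distributions; a large Kolmogorov-Smirnov distance flags a sensitive parameter. A sketch with a toy stand-in model (everything below is an assumption, not the paper's model):

```python
# Hedged sketch of generalised sensitivity analysis: sample parameters,
# classify each run as "behavioural" or not, then KS-test each parameter's
# behavioural vs non-behavioural distributions.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)
n_runs, n_params = 2000, 3
samples = rng.uniform(0, 1, size=(n_runs, n_params))

def behavioural(p):
    # Placeholder criterion: a toy "model" in which only parameters 0 and 2
    # influence whether simulated biomass stays within observed bounds.
    biomass = 2.0 * p[0] + 0.05 * p[1] + 1.5 * p[2]
    return 1.2 < biomass < 2.4

ok = np.array([behavioural(p) for p in samples])
for j in range(n_params):
    stat, pval = ks_2samp(samples[ok, j], samples[~ok, j])
    print(f"param {j}: KS = {stat:.2f}, p = {pval:.1e}")  # params 0 and 2 stand out
```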

Relevance:

30.00%

Publisher:

Abstract:

Transient episodes of synchronisation of neuronal activity in particular frequency ranges are thought to underlie cognition. Empirical mode decomposition phase locking (EMDPL) analysis is a method for determining the frequency and timing of phase synchrony that is adaptive to intrinsic oscillations within data, alleviating the need for arbitrary bandpass filter cut-off selection. It is extended here to address the choice of reference electrode and the removal of spurious synchrony resulting from volume conduction. Spline Laplacian transformation and independent component analysis (ICA) are performed as pre-processing steps, and preservation of phase synchrony between synthetic signals, combined using a simple forward model, is demonstrated. The method is contrasted with the use of bandpass filtering following the same pre-processing steps, and filter cut-offs are shown to influence synchrony detection markedly. Furthermore, an approach to the assessment of multiple EEG trials using the method is introduced, and the assessment of the statistical significance of phase-locking episodes is extended to render it adaptive to local phase synchrony levels. EMDPL is validated in the analysis of real EEG data recorded during finger tapping. The time course of event-related (de)synchronisation (ERD/ERS) is shown to differ from that of longer-range phase-locking episodes, implying different roles for these different types of synchronisation. It is suggested that the increase in phase locking which occurs just prior to movement, coinciding with a reduction in power (or ERD), may result from selection of the neural assembly relevant to the particular movement.
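
A minimal sketch of the core EMDPL computation (decompose, extract instantaneous phase, measure phase locking), assuming the PyEMD package for empirical mode decomposition and SciPy's Hilbert transform; the Laplacian/ICA pre-processing and the significance testing described above are omitted, and IMF selection is simplified to the first mode.

```python
# Hedged sketch: EMD-based phase locking between two synthetic channels.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD  # assumed dependency (pip package "EMD-signal")

fs = 250.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(2)

# Two synthetic "channels" sharing a 10 Hz oscillation plus noise.
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
y = np.sin(2 * np.pi * 10 * t + 0.4) + 0.5 * rng.standard_normal(t.size)

# Decompose each channel into intrinsic mode functions (IMFs); take the
# first IMF here purely for simplicity (real analyses select by frequency).
imf_x = EMD().emd(x)[0]
imf_y = EMD().emd(y)[0]

# Instantaneous phases via the analytic signal; the phase-locking value
# (PLV) is the length of the mean phase-difference vector (1 = perfect).
phase_diff = np.angle(hilbert(imf_x)) - np.angle(hilbert(imf_y))
plv = np.abs(np.mean(np.exp(1j * phase_diff)))
print(f"PLV = {plv:.2f}")
```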

Relevance:

30.00%

Publisher:

Abstract:

Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling cyanobacterial behaviour in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of the bloom events in lakes, reservoirs and rivers. A new deterministic–mathematical model was developed, which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of the bloom formation, vertical migration and lateral transport of colonies within river environments by taking into account the major factors that affect the cyanobacterial bloom formation in rivers including light, nutrients and temperature. A parameter sensitivity analysis using a one-at-a-time approach was carried out. There were two objectives of the sensitivity analysis presented in this paper: to identify the key parameters controlling the growth and movement patterns of cyanobacteria and to provide a means for model validation. The result of the analysis suggested that maximum growth rate and day length period were the most significant parameters in determining the population growth and colony depth, respectively.
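
The one-at-a-time (OAT) approach named above perturbs each parameter in turn around a baseline and records the normalised change in a model output. A hedged sketch with a toy stand-in for the paper's cyanobacteria model:

```python
# Hedged sketch of one-at-a-time (OAT) sensitivity analysis; the "model"
# below is a toy stand-in, not the paper's growth/movement model.
import numpy as np

def model_output(params):
    # Toy output: biomass from a growth rate limited by light and nutrients.
    mu_max, k_light, k_nutrient = params
    return mu_max / (1.0 + k_light) / (1.0 + k_nutrient)

baseline = np.array([1.2, 0.4, 0.25])  # mu_max and half-saturation constants
y0 = model_output(baseline)

for i, name in enumerate(["mu_max", "k_light", "k_nutrient"]):
    for delta in (-0.1, +0.1):          # perturb one parameter by +-10%
        p = baseline.copy()
        p[i] *= 1 + delta
        sensitivity = (model_output(p) - y0) / (y0 * delta)  # normalised
        print(f"{name} {delta:+.0%}: S = {sensitivity:+.2f}")
```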

Relevance:

30.00%

Publisher:

Abstract:

We present a comparative analysis of projected impacts of climate change on river runoff from two types of distributed hydrological model: a global hydrological model (GHM) and catchment-scale hydrological models (CHMs). Analyses are conducted for six catchments that are global in coverage and feature strong contrasts in spatial scale as well as climatic and development conditions. These include the Liard (Canada), Mekong (SE Asia), Okavango (SW Africa), Rio Grande (Brazil), Xiangxi (China) and Harper's Brook (UK). A single GHM (Mac-PDM.09) is applied to all catchments, whilst different CHMs are applied for each catchment. The CHMs typically simulate water resources impacts based on a more explicit representation of catchment water resources than that available from the GHM, and the CHMs include river routing. Simulations of average annual runoff, mean monthly runoff and high (Q5) and low (Q95) monthly runoff under baseline (1961-1990) and climate change scenarios are presented. We compare the simulated runoff response of each hydrological model to (1) prescribed increases in global mean temperature from the HadCM3 climate model and (2) a prescribed increase in global mean temperature of 2 °C for seven GCMs, to explore the response to climate model and structural uncertainty. We find that differences in projected changes of mean annual runoff between the two types of hydrological model can be substantial for a given GCM, and they are generally larger for indicators of high and low flow. However, they are relatively small in comparison to the range of projections across the seven GCMs. Hence, for the six catchments and seven GCMs we considered, climate model structural uncertainty is greater than the uncertainty associated with the type of hydrological model applied. Moreover, shifts in the seasonal cycle of runoff with climate change are presented similarly by both hydrological models, although for some catchments the monthly timing of high and low flows differs. This implies that for studies that seek to quantify and assess the role of climate model uncertainty on catchment-scale runoff, it may be equally feasible to apply a GHM as a CHM, especially when climate modelling uncertainty across the range of available GCMs is as large as it currently is. Whilst the GHM is able to represent the broad climate change signal that is represented by the CHMs, we find, however, that for some catchments there are differences between GHMs and CHMs in mean annual runoff due to differences in potential evaporation estimation methods, in the representation of the seasonality of runoff, and in the magnitude of changes in extreme monthly runoff, all of which have implications for future water management issues.
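
The Q5 and Q95 indicators used above are flow-exceedance percentiles: the monthly runoff exceeded 5% of the time (a high-flow indicator) and 95% of the time (a low-flow indicator). A small illustration with synthetic data:

```python
# Q5/Q95 from a monthly runoff series: the value exceeded 5% of the time is
# the 95th percentile, and vice versa. Synthetic data, purely illustrative.
import numpy as np

rng = np.random.default_rng(3)
monthly_runoff = rng.gamma(shape=2.0, scale=30.0, size=360)  # 30 years, mm/month

q5 = np.percentile(monthly_runoff, 95)   # exceeded 5% of the time: high flow
q95 = np.percentile(monthly_runoff, 5)   # exceeded 95% of the time: low flow
print(f"Q5 = {q5:.1f} mm, Q95 = {q95:.1f} mm")
```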

Relevance:

30.00%

Publisher:

Abstract:

Carsberg (2002) suggested that the periodic valuation accuracy studies undertaken by, amongst others, IPD/Drivers Jonas (2003) should be undertaken every year and be sponsored by the RICS, which acts as the self-regulating body for valuations in the UK. This paper does not address the wider issues concerning the nature of the properties that are sold and whether the sale prices are influenced by prior valuations, but considers solely the technical issues concerning the timing of the valuation and sales data. This study uses valuations and sales data from the Investment Property Databank UK Monthly Index to attempt to identify the date at which sale data are divulged to valuers. This information will inform accuracy studies that use a cut-off date, based on the closeness of valuations to the sales completion date, as a yardstick for excluding data from the analysis. It will also, assuming valuers are informed quickly of any agreed sales, help to determine the actual sale-agreed date rather than the completion date, which includes a period of due diligence between when the sale is agreed and its completion. Valuations should be updated to this date, rather than the formal completion date, if a reliable measure of valuation accuracy is to be determined. An accuracy study is then undertaken using a variety of updating periods, and the differences between the results are examined. The paper concludes that the sale only becomes known to valuers in the month prior to the sale taking place, which implies either that sales due diligence procedures are shortening or that valuers are not told quickly of agreed sale prices. Studies that adopt a four-month cut-off date for valuations compared to sales completion dates are over-cautious; this could be reduced to two months without compromising the data.