832 results for resting interval
Abstract:
Motor timing tasks have been employed in studies of neurodevelopmental disorders such as developmental dyslexia and ADHD, where they provide an index of temporal processing ability. Investigations of these disorders have used different stimulus parameters within the motor timing tasks, which are likely to affect performance measures. Here we assessed the effect of auditory and visual pacing stimuli on synchronised motor timing performance and its relationship with cognitive and behavioural predictors that are commonly used in the diagnosis of these highly prevalent developmental disorders. Twenty-one children (mean age 9.6 years) completed a finger tapping task in two stimulus conditions, together with additional psychometric measures. As anticipated, synchronisation to the beat (ISI 329 ms) was less accurate in the visually paced condition. Decomposition of timing variance indicated that this effect resulted from differences in the way that visually and auditorily paced tasks are processed by central timekeeping and associated peripheral implementation systems. The ability to utilise an efficient processing strategy on the visual task correlated with both reading and sustained attention skills. Dissociations between these patterns of relationship across task modality suggest that not all timing tasks are equivalent.
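The abstract's "decomposition of timing variance" into central timekeeping and peripheral implementation components is classically done with the Wing-Kristofferson model, under which the lag-1 autocovariance of the inter-tap intervals estimates the negative of the motor (peripheral) variance. The sketch below assumes that model; the variable names and the tap data are illustrative, not taken from the study.

```python
# Hedged sketch of the Wing-Kristofferson decomposition of inter-tap
# interval variance into central (clock) and peripheral (motor) components.
# `intervals` holds inter-tap intervals in ms; data are made up.

def wing_kristofferson(intervals):
    n = len(intervals)
    mean = sum(intervals) / n
    dev = [x - mean for x in intervals]
    total_var = sum(d * d for d in dev) / (n - 1)
    # Lag-1 autocovariance of the interval series.
    acov1 = sum(dev[i] * dev[i + 1] for i in range(n - 1)) / (n - 1)
    # The model predicts acov(1) = -motor variance (clip at 0 if positive).
    motor_var = max(-acov1, 0.0)
    clock_var = total_var - 2 * motor_var
    return clock_var, motor_var

taps = [331, 330, 328, 327, 332, 332, 325, 330, 329, 329]
clock, motor = wing_kristofferson(taps)
print(clock, motor)
```

With real data the clock estimate can come out negative when the lag-1 autocorrelation is stronger than the model allows, which is itself diagnostic of a violated model assumption.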
Abstract:
Background - Several antipsychotic agents are known to prolong the QT interval in a dose-dependent manner. A corrected QT interval (QTc) exceeding a threshold value of 450 ms may be associated with an increased risk of life-threatening arrhythmias. Antipsychotic agents are often given in combination with other psychotropic drugs, such as antidepressants, that may also contribute to QT prolongation. This observational study compares the effects on the QT interval of antipsychotic monotherapy and of psychoactive polytherapy that included an additional antidepressant or lithium treatment. Method - We examined two groups of hospitalized women with schizophrenia, bipolar disorder and schizoaffective disorder in a naturalistic setting. Group 1 comprised nineteen hospitalized women treated with antipsychotic monotherapy (haloperidol, olanzapine, risperidone or clozapine) and Group 2 comprised nineteen hospitalized women treated with an antipsychotic (haloperidol, olanzapine, risperidone or quetiapine) plus an additional antidepressant (citalopram, escitalopram, sertraline, paroxetine, fluvoxamine, mirtazapine, venlafaxine or clomipramine) or lithium. An electrocardiogram (ECG) was recorded in both groups before the beginning of treatment and again after four days of therapy at full dosage, when blood was also drawn to determine serum levels of the antipsychotic. Statistical analysis included repeated-measures ANOVA, Fisher's exact test and the independent t-test. Results - Mean QTc intervals increased significantly in Group 2 (24 ± 21 ms) but not in Group 1 (-1 ± 30 ms) (repeated-measures ANOVA, p < 0.01). Furthermore, significantly more patients exceeded the borderline QTc threshold of 450 ms in Group 2 (seven patients, 38%) than in Group 1 (one patient, 7%) (Fisher's exact test, p < 0.05).
Conclusions - No significant prolongation of the QT interval was found following monotherapy with an antipsychotic agent, while the combination of these drugs with antidepressants caused a significant QT prolongation. Careful monitoring of the QT interval is suggested in patients taking a combined treatment of antipsychotic and antidepressant agents.
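The abstract uses a rate-corrected QT (QTc) with a 450 ms borderline threshold but does not say which correction formula was applied. A minimal sketch, assuming Bazett's formula (the most common population-model correction, QTc = QT / √RR with RR in seconds) purely for illustration:

```python
# Hedged sketch: Bazett's correction and the 450 ms borderline check from
# the abstract. The study does not state which correction it used; Bazett's
# is assumed here only as the most common choice. QT in ms, RR in seconds.
import math

def qtc_bazett(qt_ms, rr_s):
    return qt_ms / math.sqrt(rr_s)

def is_borderline(qtc_ms, threshold_ms=450.0):
    return qtc_ms > threshold_ms

qtc = qtc_bazett(qt_ms=400.0, rr_s=0.8)  # RR 0.8 s = heart rate 75 bpm
print(round(qtc))  # 400 / sqrt(0.8) ≈ 447
```

Note that Bazett's formula over-corrects at fast heart rates, which is one reason several alternative formulas exist.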
Abstract:
The increasing intensity of global competition has led organizations to utilize various types of performance measurement tools for improving the quality of their products and services. Data envelopment analysis (DEA) is a methodology for evaluating and measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. All the data in conventional DEA with input and/or output ratios are assumed to be crisp numbers. However, the observed values of data in real-world problems are sometimes expressed as interval ratios. In this paper, we propose two new models: general and multiplicative non-parametric ratio models for DEA problems with interval data. The contributions of this paper are fourfold: (1) we consider input and output data expressed as interval ratios in DEA; (2) we address the gap in the DEA literature for problems not suitable or difficult to model with crisp values; (3) we propose two new DEA models for evaluating the relative efficiencies of DMUs with interval ratios; and (4) we present a case study involving 20 banks with three interval ratios to demonstrate the applicability and efficacy of the proposed models where the traditional indicators are mostly financial ratios. © 2011 Elsevier Inc.
Abstract:
Although crisp data are fundamentally indispensable for determining the profit Malmquist productivity index (MPI), the observed values in real-world problems are often imprecise or vague. Such imprecise or vague data can be suitably characterized with fuzzy and interval methods. In this paper, we reformulate the conventional profit MPI problem as an imprecise data envelopment analysis (DEA) problem, and propose two novel methods for measuring the overall profit MPI when the inputs, outputs, and price vectors are fuzzy or vary within intervals. We develop a fuzzy version of the conventional MPI model by using a ranking method, and solve the model with a commercial off-the-shelf DEA software package. In addition, we define an interval for the overall profit MPI of each decision-making unit (DMU) and divide the DMUs into six groups according to the intervals obtained for their overall profit efficiency and MPIs. We also present two numerical examples to demonstrate the applicability of the two proposed models and exhibit the efficacy of the procedures and algorithms. © 2011 Elsevier Ltd.
Abstract:
Conventional DEA models assume deterministic, precise and non-negative data for input and output observations. However, real applications may be characterized by observations that are given in the form of intervals and include negative numbers. For instance, the consumption of electricity in decentralized energy resources may be either negative or positive, depending on the heat consumption. Likewise, the heat losses in distribution networks may be within a certain range, depending on, e.g., external temperature and real-time outtake. Complementing earlier work that addressed the two problems of interval data and negative data separately, we propose a comprehensive evaluation process for measuring the relative efficiencies of a set of DMUs in DEA. In our general formulation, the intervals may contain upper or lower bounds with different signs. The proposed method determines upper and lower bounds for the technical efficiency through the limits of the intervals after decomposition. Based on the interval scores, DMUs are then classified into three classes, namely, the strictly efficient, weakly efficient and inefficient. An intuitive ranking approach is presented for the respective classes. The approach is demonstrated through an application to the evaluation of bank branches. © 2013.
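The idea of deriving upper and lower efficiency bounds from the interval limits can be illustrated in the single-input/single-output special case, where CCR efficiency reduces to a productivity ratio against the best performer. This is only an illustrative sketch under that simplification, not the paper's general model (which handles multiple dimensions and negative data via linear programming); the bank data are made up.

```python
# Illustrative sketch (not the paper's model): interval efficiency bounds
# for DMUs with one input and one output. Upper bound: evaluate the DMU at
# its most favourable data (low input, high output) against rivals at their
# least favourable; lower bound: the reverse.

def efficiency_bounds(dmus, k):
    # dmus: list of ((x_lo, x_hi), (y_lo, y_hi)) input/output intervals
    (xk_lo, xk_hi), (yk_lo, yk_hi) = dmus[k]
    best_k = yk_hi / xk_lo    # DMU k's most favourable productivity
    worst_k = yk_lo / xk_hi   # DMU k's least favourable productivity
    rivals_weak = [y_lo / x_hi
                   for j, ((x_lo, x_hi), (y_lo, y_hi)) in enumerate(dmus) if j != k]
    rivals_strong = [y_hi / x_lo
                     for j, ((x_lo, x_hi), (y_lo, y_hi)) in enumerate(dmus) if j != k]
    upper = best_k / max([best_k] + rivals_weak)
    lower = worst_k / max([worst_k] + rivals_strong)
    return lower, upper

banks = [((10, 12), (20, 24)),   # hypothetical bank branch 0
         ((8, 9), (18, 20)),     # branch 1
         ((15, 18), (25, 30))]   # branch 2
lo, up = efficiency_bounds(banks, 0)
print(lo, up)
```

A DMU whose lower bound is 1 would be strictly efficient, one whose upper bound reaches 1 only under favourable data is weakly efficient, and the rest are inefficient, mirroring the three-class scheme in the abstract.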
Abstract:
We propose an arithmetic of function intervals as a basis for convenient rigorous numerical computation. Function intervals can be used as mathematical objects in their own right or as enclosures of functions over the reals. We present two areas of application of function interval arithmetic and associated software that implements the arithmetic: (1) Validated ordinary differential equation solving using the AERN library and within the Acumen hybrid system modeling tool. (2) Numerical theorem proving using the PolyPaver prover. © 2014 Springer-Verlag.
Abstract:
Electrocardiography (ECG) has recently been proposed as a biometric trait for identification purposes. Intra-individual variations of the ECG might affect identification performance. These variations are mainly due to Heart Rate Variability (HRV). In particular, HRV causes changes in the QT intervals along the ECG waveforms. This work is aimed at analysing the influence of seven QT interval correction methods (based on population models) on the performance of ECG-fiducial-based identification systems. In addition, we also considered the influence of training set size, classifier, classifier ensemble, and the number of consecutive heartbeats in a majority voting scheme. The ECG signals used in this study were collected from thirty-nine subjects within the Physionet open access database. Public domain software was used for fiducial point detection. Results suggested that QT correction is indeed required to improve the performance. However, there is no clear choice among the seven explored approaches for QT correction (identification rate between 0.97 and 0.99). MultiLayer Perceptron and Support Vector Machine seemed to have better generalization capabilities, in terms of classification performance, than Decision Tree-based classifiers. No strong influence of training-set size or of the number of consecutive heartbeats in the majority voting scheme was observed.
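The abstract does not name its seven population-model correction methods. For orientation, four widely published formulas are sketched below; these are common examples only, and are not claimed to be the ones the study evaluated. QT is in ms, RR in seconds.

```python
# Common published QT correction formulas (population models), shown only
# as examples; the seven methods in the study are not named in the abstract.
# QT in ms, RR in seconds; heart rate in bpm is 60 / RR.
def bazett(qt, rr):      return qt / rr ** 0.5
def fridericia(qt, rr):  return qt / rr ** (1 / 3)
def framingham(qt, rr):  return qt + 154 * (1 - rr)
def hodges(qt, rr):      return qt + 1.75 * (60 / rr - 60)

qt, rr = 380.0, 0.75  # 80 bpm
corrected = {f.__name__: round(f(qt, rr), 1)
             for f in (bazett, fridericia, framingham, hodges)}
print(corrected)
```

At rates above 60 bpm all four formulas adjust the QT upward, but by different amounts, which is precisely why the choice of correction can shift the fiducial features used for identification.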
Abstract:
In [1] we proposed an approach to forming a consensus of experts' statements in pattern recognition. In this paper, we present a method of aggregating sets of individual statements into a collective one for the case of forecasting a quantitative variable.
Abstract:
The paper was presented at the 12th International Conference on Applications of Computer Algebra, Varna, Bulgaria, June 2006.
Abstract:
An embedding X ⊂ G of a topological space X into a topological group G is called functorial if every homeomorphism of X extends to a continuous group homomorphism of G. It is shown that the interval [0, 1] admits no functorial embedding into a finite-dimensional or metrizable topological group.
Abstract:
The non-preemptive two-machine flow-shop scheduling problem with uncertain processing times of n jobs is studied. In an uncertain version of a scheduling problem, there may not exist a unique schedule that remains optimal for all possible realizations of the job processing times. We find necessary and sufficient conditions (Theorem 1) for the existence of a dominant permutation that is optimal for all possible realizations of the job processing times. Our computational studies show the percentage of problems solvable under these conditions for randomly generated instances with n ≤ 100. We also show how to use additional information about the processing times of the completed jobs during optimal realization of a schedule (Theorems 2-4). Computational studies for randomly generated instances with n ≤ 50 show the percentage of two-machine flow-shop scheduling problems solvable under the sufficient conditions given in Theorems 2-4.
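The deterministic core of the two-machine flow-shop problem is solved exactly by Johnson's rule, which is the natural baseline behind the dominant-permutation question above. The sketch below shows only that classical deterministic rule; it does not implement the paper's interval-uncertainty analysis, and the job data are made up.

```python
# Johnson's rule for the deterministic two-machine flow shop: jobs with
# a <= b go first in nondecreasing order of a; the rest go last in
# nonincreasing order of b. This minimizes the makespan.
def johnson_order(jobs):
    # jobs: list of (a, b) processing times on machines 1 and 2
    first = sorted((j for j in range(len(jobs)) if jobs[j][0] <= jobs[j][1]),
                   key=lambda j: jobs[j][0])
    last = sorted((j for j in range(len(jobs)) if jobs[j][0] > jobs[j][1]),
                  key=lambda j: jobs[j][1], reverse=True)
    return first + last

def makespan(jobs, order):
    t1 = t2 = 0  # completion times on machine 1 and machine 2
    for j in order:
        a, b = jobs[j]
        t1 += a                  # machine 1 processes jobs back to back
        t2 = max(t2, t1) + b     # machine 2 waits for machine 1 if needed
    return t2

jobs = [(3, 2), (5, 1), (1, 4), (6, 7), (2, 3)]
seq = johnson_order(jobs)
print(seq, makespan(jobs, seq))
```

Under interval uncertainty, a dominant permutation in the paper's sense is one that Johnson's rule would produce for every realization of the processing times within their intervals.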
Abstract:
In this paper RDPPlan, a model for planning with quantitative resources specified as numerical intervals, is presented. Nearly all existing models of planning with resources require exact values to be specified for updating resources modified by action execution. In other words, these models cannot deal with more realistic situations in which the resource quantities are not completely known but are bounded by intervals. The RDPPlan model allows domains closer to the real world to be handled, where preconditions and effects over quantitative resources can be specified by intervals of values; in addition, mixed logical/quantitative and pure numerical goals can be posed. RDPPlan is based on non-directional search over a planning graph, like DPPlan, from which it derives, and it uses propagation rules that have been appropriately extended to the management of resource intervals. The propagation rules extended with resources must satisfy invariant properties over the planning graph, which have been proven by the authors and guarantee the correctness of the approach. An implementation of the RDPPlan model is described, with search strategies specifically developed for interval resources.
Abstract:
Despite the increasing body of evidence supporting the hypothesis of schizophrenia as a disconnection syndrome, studies of resting-state EEG Source Functional Connectivity (EEG-SFC) in people affected by schizophrenia are sparse. The aim of the present study was to investigate resting-state EEG-SFC in 77 stable, medicated patients with schizophrenia (SCZ) compared to 78 healthy volunteers (HV). In order to study the effect of illness duration, SCZ were divided into those with a short duration of disease (SDD; n = 25) and those with a long duration of disease (LDD; n = 52). Resting-state EEG recordings in the eyes-closed condition were analyzed and lagged phase synchronization (LPS) indices were calculated for each ROI pair in the source-space EEG data. In the delta and theta bands, SCZ had greater EEG-SFC than HV; higher theta band connectivity in frontal regions was observed in LDD compared with SDD. In the alpha band, SCZ showed lower frontal EEG-SFC compared with HV, whereas no differences were found between LDD and SDD. In the beta1 band, SCZ had greater EEG-SFC compared with HV, and in the beta2 band, LDD presented lower frontal and parieto-temporal EEG-SFC compared with HV. In the gamma band, SDD had greater connectivity values compared with LDD and HV. This study suggests that resting state brain network connectivity is abnormally organized in schizophrenia, with different patterns for the different EEG frequency components, and that EEG can be a powerful tool to further elucidate the complexity of such disordered connectivity.
Abstract:
Basic concepts for an interval arithmetic standard are discussed in the paper. Interval arithmetic deals with closed and connected sets of real numbers. Unlike floating-point arithmetic it is free of exceptions. A complete set of formulas to approximate real interval arithmetic on the computer is displayed in section 3 of the paper. The essential comparison relations and lattice operations are discussed in section 6. Evaluation of functions for interval arguments is studied in section 7. The desirability of variable length interval arithmetic is also discussed in the paper. The requirement to adapt the digital computer to the needs of interval arithmetic is as old as interval arithmetic. An obvious, simple possible solution is shown in section 8.
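The elementary operations the paper's formulas approximate can be sketched directly on closed intervals [lo, hi]. This is a minimal illustration of the bound computations only; a faithful implementation in the sense of the paper would additionally round each lower bound down and each upper bound up (outward rounding), which is omitted here for brevity.

```python
# Minimal sketch of basic interval arithmetic on closed intervals (lo, hi).
# Outward (directed) rounding, required for rigorous enclosures, is omitted.
def iadd(a, b): return (a[0] + b[0], a[1] + b[1])
def isub(a, b): return (a[0] - b[1], a[1] - b[0])
def imul(a, b):
    # The product's bounds are the min and max of the four endpoint products.
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))
def idiv(a, b):
    # Division is undefined when the divisor interval contains zero.
    if b[0] <= 0 <= b[1]:
        raise ZeroDivisionError("divisor interval contains zero")
    return imul(a, (1 / b[1], 1 / b[0]))

x, y = (1.0, 2.0), (-3.0, 4.0)
print(iadd(x, y))   # (-2.0, 6.0)
print(imul(x, y))   # (-6.0, 8.0)
```

Note how, unlike floating-point arithmetic, every result is a set guaranteed to contain the true real-number result, which is what makes the arithmetic exception-free for the defined operations.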