989 results for Timing analysis
Abstract:
Context. The early-type binary MY Cam belongs to the young open cluster Alicante 1, embedded in Cam OB3. Aims. MY Cam consists of two early-O type main-sequence stars and shows a photometric modulation suggesting an orbital period slightly above one day. We intend to confirm this orbital period and derive orbital and stellar parameters. Methods. Timing analysis of an extensive light curve (4607 points) indicates a period of 1.1754514 ± 0.0000015 d. High-resolution spectra and the cross-correlation technique implemented in the todcor program were used to derive radial velocities and obtain the corresponding radial velocity curves for MY Cam. Modelling with the stellar atmosphere code fastwind was used to obtain stellar parameters and create templates for cross-correlation. Stellar and orbital parameters were derived using the Wilson-Devinney code, such that a complete solution to the binary system could be described. Results. The determined masses of the primary and secondary stars in MY Cam are 37.7 ± 1.6 and 31.6 ± 1.4 M⊙, respectively. The corresponding temperatures, derived from the model atmosphere fit, are 42 000 and 39 000 K, with the more massive component being hotter. Both stars are overfilling their Roche lobes, sharing a common envelope. Conclusions. MY Cam contains the most massive dwarf O-type stars found so far in an eclipsing binary. Both components are still on the main sequence, and probably not far from the zero-age main sequence. The system is a likely merger progenitor, owing to its very short period.
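The period determination above is a classic light-curve timing problem. As a hedged illustration only (the abstract does not state the exact algorithm used), here is a minimal phase-dispersion-minimization sketch in Python; all function and variable names are our own:

```python
import numpy as np

def phase_dispersion(t, flux, period, n_bins=20):
    """Weighted within-bin variance of the light curve folded on a
    trial period; a coherent fold (eclipses aligned) gives a small value."""
    phase = (t / period) % 1.0
    bins = np.digitize(phase, np.linspace(0.0, 1.0, n_bins + 1)) - 1
    total, count = 0.0, 0
    for b in range(n_bins):
        seg = flux[bins == b]
        if seg.size > 1:
            total += seg.var() * seg.size
            count += seg.size
    return total / count

def best_period(t, flux, trial_periods):
    """Grid search: the minimum-dispersion trial period is the estimate."""
    scores = [phase_dispersion(t, flux, p) for p in trial_periods]
    return trial_periods[int(np.argmin(scores))]

# e.g. refining around the reported ~1.1754 d value:
# periods = np.linspace(1.17, 1.18, 100001)
# p_hat = best_period(times_d, mags, periods)
```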
Abstract:
In 2013 April a new magnetar, SGR 1745−2900, was discovered as it entered an outburst, at only 2.4 arcsec angular distance from the supermassive black hole at the centre of the Milky Way, Sagittarius A*. SGR 1745−2900 has a surface dipolar magnetic field of ∼2 × 10^14 G, and it is the neutron star closest to a black hole ever observed. The new source was detected both in the radio and X-ray bands, with a peak X-ray luminosity L_X ∼ 5 × 10^35 erg s^−1. Here we report on the long-term Chandra (25 observations) and XMM–Newton (eight observations) X-ray monitoring campaign of SGR 1745−2900 from the onset of the outburst in 2013 April until 2014 September. This unprecedented data set allows us to refine the timing properties of the source, as well as to study the outburst spectral evolution as a function of time and rotational phase. Our timing analysis confirms the increase in the spin period derivative by a factor of ∼2 around 2013 June, and reveals that a further increase occurred between 2013 October 30 and 2014 February 21. We find that the period derivative changed from 6.6 × 10^−12 to 3.3 × 10^−11 s s^−1 in 1.5 yr. On the other hand, this magnetar shows a slow flux decay compared to other magnetars and a rather inefficient surface cooling. In particular, starquake-induced crustal cooling models alone have difficulty in explaining the high luminosity of the source for the first ∼200 d of its outburst, and additional heating of the star surface from currents flowing in a twisted magnetic bundle is probably playing an important role in the outburst evolution.
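As a quick consistency illustration (our own arithmetic on the numbers quoted above, not a result stated by the authors), the average second derivative of the period implied by that change is roughly:

```python
SECONDS_PER_YEAR = 3.156e7   # approximate length of a year in seconds

pdot_2013 = 6.6e-12          # s s^-1, around outburst onset
pdot_2014 = 3.3e-11          # s s^-1, ~1.5 yr later
elapsed = 1.5 * SECONDS_PER_YEAR

pddot_avg = (pdot_2014 - pdot_2013) / elapsed
print(f"average second period derivative ~ {pddot_avg:.1e} s s^-2")
# -> ~5.6e-19 s s^-2
```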
Abstract:
The unprecedented sensitivity and large field of view of SKA will be of paramount importance for pulsar science, and for many related research fields. In particular, besides the obvious discovery of many more pulsars (even those with very low luminosity) and the extremely accurate timing analysis of the current pulsar population, SKA will make it possible to use pulsars to measure, or put strong constraints on, gravitational waves, Galactic magnetism, planet masses, general relativity and nuclear physics.
Abstract:
We provide an abstract command language for real-time programs and outline how a partial correctness semantics can be used to compute execution times. The notions of a timed command, refinement of a timed command, the command traversal condition, and the worst-case and best-case execution time of a command are formally introduced and investigated with the help of an underlying weakest liberal precondition semantics. The central result is a theory for the computation of worst-case and best-case execution times from the underlying semantics based on supremum and infimum calculations. The framework is applied to the analysis of a message transmitter program and its implementation.
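As a hedged illustration of the supremum/infimum idea (the notation is ours, not necessarily the paper's): if τ and τ' denote the clock values before and after executing a timed command c, and g_c is its traversal condition, then the two quantities can be written as

```latex
% tau, tau' : clock values before and after executing the timed command c
% g_c      : the traversal condition of c
\[
\mathrm{wcet}(c) \;=\; \sup\,\{\, \tau' - \tau \;:\; g_c \,\},
\qquad
\mathrm{bcet}(c) \;=\; \inf\,\{\, \tau' - \tau \;:\; g_c \,\}.
\]
```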
Abstract:
We propose a method for the timing analysis of concurrent real-time programs with hard deadlines. We divide the analysis into a machine-independent and a machine-dependent task. The latter takes into account the execution times of the program on a particular machine. Therefore, our goal is to make the machine-dependent phase of the analysis as simple as possible. We succeed in the sense that the machine-dependent phase remains the same as in the analysis of sequential programs. We shift the complexity introduced by concurrency completely to the machine-independent phase.
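As a toy sketch of that division of labour (our own invented model and numbers, not the paper's formalism): suppose the machine-independent phase reduces a concurrent program to worst-case operation counts along its critical path; the machine-dependent phase is then just a weighted sum, as simple as for a sequential program:

```python
# Hypothetical output of the machine-independent phase: worst-case
# operation counts along the critical path (invented numbers).
op_counts = {"load": 12, "store": 7, "add": 30, "branch": 9}

# Machine-dependent data: per-operation times (cycles) on one target.
machine_a = {"load": 3, "store": 2, "add": 1, "branch": 2}

def wcet(counts, times):
    """Machine-dependent phase: a plain weighted sum -- the same simple
    computation one would perform for a sequential program."""
    return sum(n * times[op] for op, n in counts.items())

print(wcet(op_counts, machine_a))  # -> 98 cycles
```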
Abstract:
ACM Computing Classification System (1998): G.2.2, F.2.2.
Abstract:
As the development of a viable quantum computer nears, existing widely used public-key cryptosystems, such as RSA, will no longer be secure. Thus, significant effort is being invested into post-quantum cryptography (PQC). Lattice-based cryptography (LBC) is one such promising area of PQC, offering versatile, efficient, and high-performance security services. However, the vulnerabilities of these implementations against side-channel attacks (SCA) remain significantly understudied. Most, if not all, lattice-based cryptosystems require noise samples generated from a discrete Gaussian distribution, and a successful timing-analysis attack can break the whole cryptosystem, making the discrete Gaussian sampler the module most vulnerable to SCA. This research proposes countermeasures against timing information leakage with FPGA-based designs of CDT-based discrete Gaussian samplers with constant response time, targeting encryption and signature scheme parameters. The proposed designs are compared against the state of the art and are shown to significantly outperform existing implementations. For encryption, the proposed sampler is 9x faster than the only other existing time-independent CDT sampler design. For signatures, the first time-independent CDT sampler in hardware is proposed.
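The time-independence idea behind such samplers can be sketched in software (a minimal illustration of the control flow only; Python gives no genuine constant-time guarantee, and the paper's designs are FPGA hardware, so this is an assumption-laden analogy, not their implementation):

```python
import secrets

def cdt_sample_full_scan(cdt, rand_bits=32):
    """Sample from a discrete Gaussian via its cumulative distribution
    table (CDT), scanning the ENTIRE table on every draw so the work
    done does not depend on the value sampled -- the countermeasure
    idea against timing leakage."""
    r = secrets.randbits(rand_bits)
    index = 0
    for threshold in cdt:             # always len(cdt) iterations
        index += int(r >= threshold)  # accumulate; no early exit
    return index

# `cdt` would hold cumulative probabilities scaled to 2**rand_bits,
# a hypothetical precomputed table for the scheme's parameters.
```

A naive CDT sampler would instead stop scanning as soon as the threshold is crossed, making its running time depend on the sampled value; the full scan removes exactly that leak.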
Abstract:
The introduction of time-series graphs into British economics in the 19th century depended on the "timing" of history. This involved reconceptualizing history into events which were both comparable and measurable and standardized by time unit. Yet classical economists in Britain in the early 19th century viewed history as a set of heterogeneous and complex events, and statistical tables as giving unrelated facts. Both these attitudes had to be broken down before time-series graphs could be brought into use for revealing regularities in economic events by the century's end.
Abstract:
OBJECTIVES: To develop a method for objective assessment of fine motor timing variability in Parkinson's disease (PD) patients, using digital spiral data gathered by a touch screen device. BACKGROUND: A retrospective analysis was conducted on data from 105 subjects, including 65 patients with advanced PD (group A), 15 intermediate patients experiencing motor fluctuations (group I), 15 early-stage patients (group S), and 10 healthy elderly subjects (HE). The subjects were asked to perform repeated upper-limb motor tasks by tracing a pre-drawn Archimedes spiral as shown on the screen of the device. The spiral tracing test was performed with an ergonomic pen stylus, using the dominant hand. The test was repeated three times per test occasion and the subjects were instructed to complete it within 10 seconds. Digital spiral data, including stylus position (x-y coordinates) and timestamps (milliseconds), were collected and used in subsequent analysis. The total numbers of observations with the test battery were as follows: Swedish group (n=10079), Italian I group (n=822), Italian S group (n=811), and HE (n=299). METHODS: The raw spiral data were processed with three data processing methods. To quantify motor timing variability during spiral drawing tasks, the Approximate Entropy (APEN) method was applied to the digitized spiral data. APEN is designed to capture the amount of irregularity or complexity in time series. APEN requires determination of two parameters, namely the window size and the similarity measure. In our work, after experimentation, the window size was set to 4 and the similarity measure to 0.2 (20% of the standard deviation of the time series). The final score obtained by APEN was normalized by total drawing completion time and used in subsequent analysis; this score is henceforth denoted APEN. In addition, two more methods were applied to the digital spiral data and their scores were used in subsequent analysis. The first method was based on Digital Wavelet Transform and Principal Component Analysis and generated a score representing spiral drawing impairment; this score is henceforth denoted WAV. The second method was based on the standard deviation of frequency-filtered drawing velocity; this score is henceforth denoted SDDV. Linear mixed-effects (LME) models were used to evaluate mean differences of the spiral scores of the three methods across the four subject groups. Test-retest reliability of the three scores was assessed by taking the mean of the three possible correlations (Spearman's rank coefficients) between the three test trials. Internal consistency of the methods was assessed by calculating correlations between their scores. RESULTS: When comparing mean spiral scores between the four subject groups, the APEN scores of the three patient groups deviated from those of the HE subjects (P=0.626 for group S with a 9.9% mean value difference, P=0.089 for group I with 30.2%, and P=0.0019 for group A with 44.1%). However, there were no significant differences in the mean scores of the other two methods, except for WAV between the HE and A groups (P<0.001). WAV and SDDV were highly and significantly correlated with each other, with a coefficient of 0.69. However, APEN was correlated with neither WAV nor SDDV, with coefficients of 0.11 and 0.12, respectively. Test-retest reliability coefficients of the three scores were as follows: APEN (0.9), WAV (0.83) and SDDV (0.55).
CONCLUSIONS: The results show that the digital spiral analysis-based objective APEN measure is able to significantly differentiate healthy subjects from patients at the advanced stage. In contrast to the other two methods (WAV and SDDV), which are designed to quantify dyskinesias (over-medication), this method can be useful for characterizing Off symptoms in PD. APEN was not correlated with either of the other two methods, indicating that it measures a different construct of upper-limb motor function in PD patients than WAV and SDDV. APEN also had better test-retest reliability, indicating that it is more stable and consistent over time than WAV and SDDV.
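The APEN computation described in the methods above is standard Approximate Entropy. A minimal Python sketch with the parameters quoted in the abstract (window m=4, tolerance r = 0.2 × SD); the additional normalization by total drawing completion time mentioned there would be applied to the returned value:

```python
import numpy as np

def approximate_entropy(x, m=4, r_factor=0.2):
    """Approximate Entropy of a 1-D series, using the window size and
    similarity measure quoted in the abstract (m=4, r=0.2*SD)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(w):
        n = len(x) - w + 1
        windows = np.array([x[i:i + w] for i in range(n)])
        # Chebyshev distance between every pair of windows
        dist = np.abs(windows[:, None, :] - windows[None, :, :]).max(axis=2)
        c = (dist <= r).mean(axis=1)   # fraction of windows within r
        return np.log(c).mean()

    return phi(m) - phi(m + 1)
```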
Abstract:
BACKGROUND: The CD4 cell count at which combination antiretroviral therapy should be started is a central, unresolved issue in the care of HIV-1-infected patients. In the absence of randomised trials, we examined this question in prospective cohort studies. METHODS: We analysed data from 18 cohort studies of patients with HIV. Antiretroviral-naive patients from 15 of these studies were eligible for inclusion if they had started combination antiretroviral therapy (while AIDS-free, with a CD4 cell count less than 550 cells per microL, and with no history of injecting drug use) on or after Jan 1, 1998. We used data from patients followed up in seven of the cohorts in the era before the introduction of combination therapy (1989-95) to estimate distributions of lead times (from the first CD4 cell count measurement in an upper range to the upper threshold of a lower range) and unseen AIDS and death events (occurring before the upper threshold of a lower CD4 cell count range is reached) in the absence of treatment. These estimations were used to impute completed datasets in which lead times and unseen AIDS and death events were added to data for treated patients in deferred therapy groups. We compared the effect of deferred initiation of combination therapy with immediate initiation on rates of AIDS and death, and on death alone, in adjacent CD4 cell count ranges of width 100 cells per microL. FINDINGS: Data were obtained for 21 247 patients who were followed up during the era before the introduction of combination therapy and 24 444 patients who were followed up from the start of treatment. Deferring combination therapy until a CD4 cell count of 251-350 cells per microL was associated with higher rates of AIDS and death than starting therapy in the range 351-450 cells per microL (hazard ratio [HR] 1.28, 95% CI 1.04-1.57). The adverse effect of deferring treatment increased with decreasing CD4 cell count threshold. Deferred initiation of combination therapy was also associated with higher mortality rates, although effects on mortality were less marked than effects on AIDS and death (HR 1.13, 0.80-1.60, for deferred initiation of treatment at CD4 cell count 251-350 cells per microL compared with initiation at 351-450 cells per microL). INTERPRETATION: Our results suggest that 350 cells per microL should be the minimum threshold for initiation of antiretroviral therapy, and should help to guide physicians and patients in deciding when to start treatment.
Treatment of open hand injuries: does timing of surgery matter? A single-centre prospective analysis
Abstract:
One of the earliest accounts of duration perception by Karl von Vierordt implied a common process underlying the timing of intervals in the sub-second and the second range. To date, there are two major explanatory approaches for the timing of brief intervals: the Common Timing Hypothesis and the Distinct Timing Hypothesis. While the common timing hypothesis also proceeds from a unitary timing process, the distinct timing hypothesis suggests two dissociable, independent mechanisms for the timing of intervals in the sub-second and the second range, respectively. In the present paper, we introduce confirmatory factor analysis (CFA) to elucidate the internal structure of interval timing in the sub-second and the second range. Our results indicate that the assumption of two mechanisms underlying the processing of intervals in the second and the sub-second range might be more appropriate than the assumption of a unitary timing mechanism. In contrast to the basic assumption of the distinct timing hypothesis, however, these two timing mechanisms are closely associated with each other and share 77% of common variance. This finding suggests either a strong functional relationship between the two timing mechanisms or a hierarchically organized internal structure. Findings are discussed in the light of existing psychophysical and neurophysiological data.
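A small worked step behind that figure (our arithmetic, using the usual convention that shared variance is the squared correlation): two latent timing factors sharing 77% of their variance have a latent correlation of

```latex
\[
r \;=\; \sqrt{0.77} \;\approx\; 0.88 .
\]
```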
Abstract:
The most influential theoretical account in time psychophysics assumes the existence of a unitary internal clock based on neural counting. The distinct timing hypothesis, on the other hand, suggests an automatic timing mechanism for processing of durations in the sub-second range and a cognitively controlled timing mechanism for processing of durations in the range of seconds. Although several psychophysical approaches can be applied to identify the internal structure of interval timing in the second and sub-second range, the existing data provide a puzzling picture of rather inconsistent results. In the present chapter, we introduce confirmatory factor analysis (CFA) to further elucidate the internal structure of interval timing performance in the sub-second and second range. More specifically, we investigated whether CFA would rather support the notion of a unitary timing mechanism or of distinct timing mechanisms underlying interval timing in the sub-second and second range, respectively. The assumption of two distinct timing mechanisms which are completely independent of each other was not supported by our data. The model assuming a unitary timing mechanism underlying interval timing in both the sub-second and second range fitted the empirical data much better. Finally, we also tested a third model assuming two distinct, but functionally related, mechanisms. The correlation between the two latent variables representing the hypothesized timing mechanisms was rather high, and comparison of fit indices indicated that the assumption of two associated timing mechanisms described the observed data better than only one latent variable. Models are discussed in the light of the existing psychophysical and neurophysiological data.
Abstract:
The replication initiation protein Cdc6p forms a tight complex with Cdc28p, specifically with forms of the kinase that are competent to promote replication initiation. We now show that potential sites of Cdc28 phosphorylation in Cdc6p are required for the regulated destruction of Cdc6p that has been shown to occur during the Saccharomyces cerevisiae cell cycle. Analysis of Cdc6p phosphorylation site mutants and of the requirement for Cdc28p in an in vitro ubiquitination system suggests that targeting of Cdc6p for degradation is more complex than previously proposed. First, phosphorylation of N-terminal sites targets Cdc6p for polyubiquitination probably, as expected, through promoting interaction with Cdc4p, an F box protein involved in substrate recognition by the Skp1-Cdc53-F-box protein (SCF) ubiquitin ligase. However, in addition, mutation of a single, C-terminal site stabilizes Cdc6p in G2 phase cells without affecting substrate recognition by SCF in vitro, demonstrating a second and novel requirement for specific phosphorylation in degradation of Cdc6p. SCF-Cdc4p– and N-terminal phosphorylation site–dependent ubiquitination appears to be mediated preferentially by Clbp/Cdc28p complexes rather than by Clnp/Cdc28p complexes, suggesting a way in which phosphorylation of Cdc6p might control the timing of its degradation at the end of G1 phase of the cell cycle. The stable cdc6 mutants show no apparent replication defects in wild-type strains. However, stabilization through mutation of three N-terminal phosphorylation sites or of the single C-terminal phosphorylation site leads to dominant lethality when combined with certain mutations in the anaphase-promoting complex.