75 results for Time domain simulation tools


Abstract:

Topic

To compare the accuracy of optical coherence tomography (OCT) with alternative tests for monitoring neovascular age-related macular degeneration (nAMD) and detecting disease activity among eyes previously treated for this condition.

Clinical Relevance

Traditionally, fundus fluorescein angiography (FFA) has been considered the reference standard for detecting nAMD activity, but FFA is costly and invasive. Replacement of FFA by OCT can be justified if there is substantial agreement between the tests.

Methods

Systematic review and meta-analysis. The index test was OCT. The comparator tests were visual acuity, clinical evaluation (slit lamp), Amsler chart, color fundus photographs, infrared reflectance, red-free images and blue reflectance, fundus autofluorescence imaging, indocyanine green angiography (ICGA), preferential hyperacuity perimetry, and microperimetry. We searched the following databases: MEDLINE, MEDLINE In-Process, EMBASE, Biosis, Science Citation Index, the Cochrane Library, Database of Abstracts of Reviews of Effects, MEDION, and the Health Technology Assessment database. The last literature search was conducted in March 2013. We used the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) to assess risk of bias.

Results

We included 8 studies involving more than 400 participants. Seven reported the performance of OCT (3 time-domain [TD] OCT, 3 spectral-domain [SD] OCT, 1 both types) and 1 reported the performance of ICGA in the detection of nAMD activity. We did not find studies directly comparing tests in the same population. The pooled sensitivity and specificity of TD OCT and SD OCT for detecting active nAMD were 85% (95% confidence interval [CI], 72%–93%) and 48% (95% CI, 30%–67%), respectively. One study reported ICGA with sensitivity of 75.9% and specificity of 88.0% for the detection of active nAMD. Half of the studies were considered to have a high risk of bias.
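As a concrete illustration of how such figures arise from individual 2×2 diagnostic tables, the Python sketch below computes sensitivity, specificity, and Wilson 95% confidence intervals for a single hypothetical study. The counts are invented for illustration; the actual pooling used hierarchical meta-analysis models, which are not reproduced here.

```python
# Illustration only: sensitivity/specificity and Wilson 95% CIs from a
# single hypothetical 2x2 table (counts invented, not from the review).
from math import sqrt

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

tp, fn, fp, tn = 85, 15, 52, 48          # hypothetical counts vs. FFA reference
sens, spec = tp / (tp + fn), tn / (tn + fp)
s_lo, s_hi = wilson_ci(tp, tp + fn)
c_lo, c_hi = wilson_ci(tn, tn + fp)
print(f"sensitivity {sens:.0%} (95% CI {s_lo:.0%}-{s_hi:.0%})")
print(f"specificity {spec:.0%} (95% CI {c_lo:.0%}-{c_hi:.0%})")
```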

Conclusions

There is substantial disagreement between OCT and FFA findings in detecting active disease in patients with nAMD who are being monitored. Both methods may be needed to monitor patients with nAMD comprehensively.

Abstract:

BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) due to abnormal blood vessels developing that leak fluid and blood at the macula.

OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.

DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).

REVIEW METHODS: Types of studies: direct/indirect studies reporting diagnostic outcomes.

INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).

COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool, version 2 (QUADAS-2). Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%), with nine strategies for diagnosis and/or monitoring, and a cost-utility analysis was conducted. An NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted (3.5%). Deterministic and probabilistic sensitivity analyses were performed.

RESULTS: In pooled estimates of diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] were 88% (46% to 98%) and 78% (64% to 88%), respectively. For monitoring, the pooled sensitivity and specificity (95% CI) were 85% (72% to 93%) and 48% (30% to 67%), respectively. The FFA for diagnosis and nurse-technician-led monitoring strategy had the lowest cost (£39,769; QALYs 10.473) and dominated all others except FFA for diagnosis and ophthalmologist-led monitoring (£44,649; QALYs 10.575; incremental cost-effectiveness ratio £47,768). The least costly strategy had a 46.4% probability of being cost-effective at a £30,000 willingness-to-pay threshold.
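The ICER arithmetic can be verified directly from the figures quoted above; the sketch below does so, together with a generic 3.5% discounting helper of the kind such models apply. The helper and its yearly-stream assumption are illustrative, not the study's Markov model.

```python
# Checking the ICER arithmetic from the figures quoted above.
cost_nurse, qaly_nurse = 39_769, 10.473   # FFA diagnosis + nurse-technician monitoring
cost_ophth, qaly_ophth = 44_649, 10.575   # FFA diagnosis + ophthalmologist monitoring

icer = (cost_ophth - cost_nurse) / (qaly_ophth - qaly_nurse)
print(f"ICER: £{icer:,.0f} per QALY")     # ~£47,800; matches the reported
                                          # £47,768 up to rounding of the inputs

# Generic 3.5% discounting of a yearly cost/QALY stream (illustrative
# helper, not the study's model).
def discounted(yearly_values, rate=0.035):
    return sum(v / (1 + rate) ** t for t, v in enumerate(yearly_values))
```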

LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.

CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.

STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.

FUNDING: The National Institute for Health Research Health Technology Assessment programme.

Abstract:

Purpose: To determine the optimal role of OCT in diagnosing and monitoring nAMD (detecting disease activity and the need for further anti-VEGF treatment).
Methods: Systematic review. Major electronic databases and websites were searched. Studies were included if they reported the diagnostic performance of time domain or spectral domain OCT (or selected other tests) against a reference standard of ophthalmologist-interpreted fluorescein angiography in people with newly suspected or previously diagnosed nAMD. Risk of bias was assessed by two independent investigators using QUADAS-2. Summary receiver operating characteristic (SROC) curves were produced for each test given sufficient data.
Results: 3700 titles/abstracts were screened, and 120 (3.2%) were selected for full-text assessment. A total of 22 studies were included (17 on diagnosis, 7 on monitoring, and 3 on both). From 15 studies reporting OCT data, sensitivity and specificity ranged from 59% to 100% and 27% to 100%, respectively.
Conclusions: The reported diagnostic performance of OCT showed large variability. The methodological quality of most studies was sub-optimal.

Abstract:

Efficient identification and follow-up of astronomical transients is hindered by the need for humans to manually select promising candidates from data streams that contain many false positives. These artefacts arise in the difference images that are produced by most major ground-based time-domain surveys with large-format CCD cameras. This dependence on humans to reject bogus detections is unsustainable for next-generation all-sky surveys, and significant effort is now being invested to solve the problem computationally. In this paper, we explore a simple machine learning approach to real-bogus classification by constructing a training set from the image data of ~32,000 real astrophysical transients and bogus detections from the Pan-STARRS1 Medium Deep Survey. We derive our feature representation from the pixel intensity values of a 20 × 20 pixel stamp around the centre of the candidates. This differs from previous work in that it works directly on the pixels rather than relying on catalogued domain knowledge for feature design or selection. Three machine learning algorithms are trained (artificial neural networks, support vector machines and random forests) and their performances are tested on a held-out subset of 25 per cent of the training data. We find the best results from the random forest classifier and demonstrate that, by accepting a false positive rate of 1 per cent, the classifier initially suggests a missed detection rate of around 10 per cent. However, we also find that a combination of bright star variability, nuclear transients and uncertainty in human labelling means that our best estimate of the missed detection rate is approximately 6 per cent.
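A minimal sketch of the described pipeline is given below, assuming scikit-learn: flattened pixel stamps as features, a random forest classifier, and the missed detection rate read off at a fixed 1 per cent false positive rate. The data here are synthetic stand-ins for the Pan-STARRS1 stamps, and all hyperparameters are assumptions.

```python
# Sketch of the real-bogus approach: flattened 20x20 pixel stamps as
# features, a random forest, and the missed detection rate at FPR = 1%.
# Synthetic data stands in for the survey stamps; all values are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 20 * 20))          # stand-in for 20x20 pixel stamps
y = rng.integers(0, 2, size=n)             # 1 = real transient, 0 = bogus
X[y == 1, :50] += 0.5                      # give "real" stamps a weak signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

fpr, tpr, _ = roc_curve(y_te, clf.predict_proba(X_te)[:, 1])
mdr = 1 - np.interp(0.01, fpr, tpr)        # missed detection rate at FPR = 1%
print(f"missed detection rate at 1% FPR: {mdr:.1%}")
```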

Abstract:

Time domain astronomy has come of age with astronomers now able to monitor the sky at high cadence both across the electromagnetic spectrum and using neutrinos and gravitational waves. The advent of new observing facilities permits new science, but the ever-increasing throughput of facilities demands efficient communication of coincident detections and better subsequent coordination among the scientific community so as to turn detections into scientific discoveries. To discuss the revolution occurring in our ability to monitor the Universe and the challenges it brings, on 2012 April 25-26 a group of scientists from observational and theoretical teams studying transients met with representatives of the major international transient observing facilities at the Kavli Royal Society International Centre, UK. This immediately followed the Royal Society Discussion meeting "New windows on transients across the Universe" held in London. Here we present a summary of the Kavli meeting, at which the participants discussed the science goals common to the transient astronomy community and analysed how to better meet the challenges ahead as ever more powerful observational facilities come on stream.

Abstract:

Aqueous liquid mixtures, in particular those involving amphiphilic species, play an important role in many physical, chemical and biological processes. Of particular interest are alcohol/water mixtures; however, the structural dynamics of such systems are still not fully understood. Herein, a combination of terahertz time-domain spectroscopy (THz-TDS) and NMR relaxation time analysis has been applied to investigate 2-propanol/water mixtures across the entire composition range, while neutron diffraction studies have been carried out at two specific concentrations. Excellent agreement is seen between the techniques, with a maximum in both the relative absorption coefficient and the activation energy to molecular motion occurring at ∼90 mol% H2O. Furthermore, this is the same value at which well-established excess thermodynamic functions exhibit a maximum/minimum. Additionally, both neutron diffraction and THz-TDS have been used to provide estimates of the size of the hydration shell around 2-propanol in solution. Both methods determine that between 4 and 5 H2O molecules per 2-propanol are found in the 2-propanol/water clusters at 90 mol% H2O. Based on the acquired data, a description of the structure of 2-propanol/water across the composition range is presented.

Abstract:

The influence of polarization on the plasmon modes excited in tip-enhanced near-field optical microscopy has been investigated using the finite-difference time-domain (FDTD) method. Analysis of the calculated results has placed particular emphasis on the ability to align local field enhancements with the orientation of molecules in order to optimize Raman signals, with particular relevance to recent experimental work on carbon nanotubes.

Abstract:

We present the Coordinated Synoptic Investigation of NGC 2264, a continuous 30 day multi-wavelength photometric monitoring campaign on more than 1000 young cluster members using 16 telescopes. The unprecedented combination of multi-wavelength, high-precision, high-cadence, and long-duration data opens a new window into the time domain behavior of young stellar objects. Here we provide an overview of the observations, focusing on results from Spitzer and CoRoT. The highlight of this work is detailed analysis of 162 classical T Tauri stars for which we can probe optical and mid-infrared flux variations to 1% amplitudes and sub-hour timescales. We present a morphological variability census and then use metrics of periodicity, stochasticity, and symmetry to statistically separate the light curves into seven distinct classes, which we suggest represent different physical processes and geometric effects. We provide distributions of the characteristic timescales and amplitudes and assess the fractional representation within each class. The largest category (>20%) comprises optical "dippers" with discrete fading events lasting ~1-5 days. The degree of correlation between the optical and infrared light curves is positive but weak; notably, the independently assigned optical and infrared morphology classes tend to be different for the same object. Assessment of flux variation behavior with respect to (circum)stellar properties reveals correlations of variability parameters with Hα emission and with effective temperature. Overall, our results point to multiple origins of young star variability, including circumstellar obscuration events, hot spots on the star and/or disk, accretion bursts, and rapid structural changes in the inner disk. Based on data from the Spitzer and CoRoT missions. The CoRoT space mission was developed and is operated by the French space agency CNES, with participation of ESA's RSSD and Science Programmes, Austria, Belgium, Brazil, Germany, and Spain.
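To make the idea of such variability metrics concrete, the sketch below implements a crude flux-asymmetry score and a crude periodicity score for an evenly sampled light curve. These are generic illustrations in the spirit of the metrics described, not the campaign's exact definitions.

```python
# Generic light-curve metrics in the spirit of those described above
# (illustrative definitions, not the campaign's exact metrics).
import numpy as np

def asymmetry_score(flux):
    """Compare the mean of the extreme 10% of flux points against the
    median, in units of the scatter; discrete fading events ("dippers")
    drag the extreme mean below the median, giving a positive score."""
    f = np.sort(np.asarray(flux))
    k = max(1, f.size // 10)
    extremes = np.r_[f[:k], f[-k:]]
    return (np.median(f) - extremes.mean()) / f.std()

def periodicity_score(flux):
    """Fraction of variance carried by the strongest Fourier mode of an
    evenly sampled light curve; near 1 for strictly periodic signals."""
    power = np.abs(np.fft.rfft(flux - np.mean(flux))) ** 2
    return power[1:].max() / power[1:].sum()
```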

Abstract:

Densely deployed WiFi networks will play a crucial role in providing the capacity for the next generation mobile internet. However, due to increasing interference from overlapping channels and the resulting degradation in throughput efficiency, dense deployment alone does not guarantee higher throughput. An emerging challenge is how to efficiently utilize scarce spectrum resources by matching physical layer resources to traffic demand. In this respect, access control allocation strategies play a pivotal role but remain too coarse-grained. As a solution, this research proposes a flexible framework for fine-grained channel width adaptation and multi-channel access in WiFi networks. The approach, named SFCA (Sub-carrier Fine-grained Channel Access), adopts DOFDM (Discontinuous Orthogonal Frequency Division Multiplexing) at the PHY layer. It allocates frequency resources at a sub-carrier granularity, which facilitates channel width adaptation for multi-channel access and thus brings more flexibility and higher frequency efficiency. The MAC layer uses a frequency-time domain backoff scheme, which combines the popular time-domain BEB scheme with a frequency-domain backoff to decrease access collisions, resulting in a higher access probability for contending nodes. SFCA is compared with FICA (an established access scheme) and shown to significantly outperform it. Finally, we present results for next generation 802.11ac WiFi networks.
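The frequency-time backoff idea can be pictured with the toy model below: a standard time-domain binary exponential backoff (BEB) draw combined with a random choice among contention subcarriers, so two nodes collide only if both draws coincide. This is a schematic of the concept only; SFCA's actual protocol logic and the subcarrier count are not specified in the abstract, so the values here are assumptions.

```python
# Toy model of a frequency-time backoff: time-domain BEB plus a random
# contention-subcarrier draw (schematic only; subcarrier count assumed).
import random

CW_MIN, CW_MAX = 16, 1024
N_CONTENTION_SUBCARRIERS = 8              # assumed for illustration

def backoff_draw(collisions):
    """Return (slot, subcarrier) drawn for this transmission attempt."""
    cw = min(CW_MIN * 2 ** collisions, CW_MAX)   # BEB window doubling
    return random.randrange(cw), random.randrange(N_CONTENTION_SUBCARRIERS)

# Two contenders collide only if both the slot AND the subcarrier match,
# cutting the collision probability by roughly the number of subcarriers.
a, b = backoff_draw(0), backoff_draw(0)
print("collision" if a == b else "no collision", a, b)
```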

Abstract:

This paper discusses compact-stencil finite difference time domain (FDTD) schemes for approximating the 2D wave equation in the context of digital audio. Stability, accuracy, and efficiency are investigated and new ways of viewing and interpreting the results are discussed. It is shown that if a tight accuracy constraint is applied, implicit schemes outperform explicit schemes. The paper also discusses the relevance to digital waveguide mesh modelling, and highlights the optimally efficient explicit scheme.
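For orientation, the baseline that compact-stencil schemes refine is the simple five-point explicit FDTD scheme for the 2D wave equation, sketched below; its stability requires a Courant number λ = cΔt/Δx ≤ 1/√2, and the sketch runs at that limit. Grid size, wave speed, and excitation are illustrative choices, not the paper's test cases.

```python
# Baseline five-point explicit FDTD scheme for the 2D wave equation,
# run at its stability limit lam = c*dt/dx = 1/sqrt(2).
import numpy as np

c, dx = 343.0, 0.01                   # wave speed (m/s), grid spacing (m)
lam = 1 / np.sqrt(2)                  # Courant number at the stability limit
dt = lam * dx / c

N = 100
u_prev = np.zeros((N, N))             # u at time step n-1
u = np.zeros((N, N))                  # u at time step n
u[N // 2, N // 2] = 1.0               # impulsive excitation at the centre

for _ in range(200):
    lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
           - 4 * u[1:-1, 1:-1])       # five-point discrete Laplacian
    u_next = u.copy()                 # boundaries stay at zero (Dirichlet)
    u_next[1:-1, 1:-1] = 2 * u[1:-1, 1:-1] - u_prev[1:-1, 1:-1] + lam**2 * lap
    u_prev, u = u, u_next
```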

Abstract:

Time-domain modelling of single-reed woodwind instruments usually involves a lumped model of the excitation mechanism. The parameters of this lumped model have to be estimated for use in numerical simulations. Several attempts have been made to estimate these parameters, including observations of the mechanics of isolated reeds, measurements under artificial or real playing conditions and estimations based on numerical simulations. In this study an optimisation routine is presented that can estimate reed-model parameters given the pressure and flow signals in the mouthpiece. The method is validated by testing it on a series of numerically synthesised data. In order to incorporate the actions of the player in the parameter estimation process, the optimisation routine has to be applied to signals obtained under real playing conditions. The estimated parameters can then be used to resynthesise the pressure and flow signals in the mouthpiece. In the case of measured data, as opposed to numerically synthesised data, special care needs to be taken while modelling the bore of the instrument. In fact, a careful study of various experimental datasets revealed that, for resynthesis to work, the bore termination impedance should be known very precisely from theory. An example is given where the above requirement is satisfied, and the resynthesised signals closely match the original signals generated by the player.
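The optimisation idea can be sketched as a least-squares fit of lumped reed parameters to mouthpiece data. The snippet below uses a simplified quasi-static Bernoulli reed-flow model as a stand-in for the paper's reed model; the parameter names, values, and noise level are all assumptions for illustration, and the "measured" data are synthetic.

```python
# Sketch: least-squares fit of lumped reed parameters to mouthpiece
# pressure/flow data. The quasi-static Bernoulli model and all values
# below are simplified assumptions; the "measured" data are synthetic.
import numpy as np
from scipy.optimize import least_squares

RHO = 1.2                                  # air density (kg/m^3)

def reed_flow(dp, w, h0, k):
    """Quasi-static single-reed flow: the channel height closes linearly
    with the pressure difference dp; the flow follows Bernoulli's law."""
    h = np.clip(h0 - dp / k, 0.0, None)
    return w * h * np.sqrt(2 * np.abs(dp) / RHO) * np.sign(dp)

true_params = (0.012, 4e-4, 8e6)           # w (m), h0 (m), k (Pa/m): assumed
dp = np.linspace(0.0, 3000.0, 200)         # pressure difference samples (Pa)
u_meas = reed_flow(dp, *true_params)
u_meas += np.random.default_rng(1).normal(0.0, 1e-6, dp.size)  # noise

fit = least_squares(lambda p: reed_flow(dp, *p) - u_meas, x0=(0.01, 3e-4, 5e6))
print("estimated (w, h0, k):", fit.x)      # should recover ~true_params
```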

Abstract:

In this paper, we present a hybrid mixed cost-function adaptive initialization algorithm for the time-domain equalizer (TEQ) in a discrete multitone (DMT)-based asymmetric digital subscriber loop. Using our approach, a higher convergence rate than that of the commonly used least mean square (LMS) algorithm is obtained, whilst attaining bit rates close to the optimum maximum shortening SNR and the upper bound SNR. Moreover, our proposed method outperforms the minimum mean-squared error design for a range of TEQ filter lengths.
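For context, a generic LMS adaptation loop of the kind used as the benchmark is sketched below: the TEQ taps are driven so that the equalised received signal matches a delayed, shortened target response. The channel, target response, delay, and step size are assumptions; the paper's hybrid mixed cost-function algorithm itself is not reproduced.

```python
# Generic LMS loop driving TEQ taps w so the equalised received signal
# matches a delayed, shortened target response (benchmark sketch only).
import numpy as np

rng = np.random.default_rng(0)
s = rng.normal(size=20_000)                      # transmitted samples
h = rng.normal(size=32) * 0.9 ** np.arange(32)   # long channel response (assumed)
x = np.convolve(s, h)[:s.size]                   # received signal

b = np.array([1.0, 0.5, 0.25])                   # short target impulse response
delay = 8                                        # synchronisation delay (assumed)
d = np.convolve(s, b)[:s.size]
d = np.concatenate([np.zeros(delay), d[:-delay]])  # desired (shortened) signal

L, mu = 16, 1e-4                                 # TEQ length and LMS step size
w = np.zeros(L)
for n in range(L, s.size):
    xn = x[n - L:n][::-1]                        # regression vector, newest first
    e = d[n] - w @ xn                            # error vs. the target sample
    w += mu * e * xn                             # LMS tap update
```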

Abstract:

The development of the latest generation of wide-body carbon-fibre composite passenger aircraft has heralded a new era in the utilisation of these materials. The premise of superior specific strength and stiffness, corrosion and fatigue resistance, is tempered by high development costs, slow production rates and lengthy and expensive certification programmes. Substantial effort is currently being directed towards the development of new modelling and simulation tools, at all levels of the development cycle, to mitigate these shortcomings. One of the primary challenges is to reduce the extent of physical testing in the certification process by adopting a ‘certification by simulation’ approach. In essence, this aspirational objective requires the ability to reliably predict the evolution and progression of damage in composites. The aerospace industry has been at the forefront of developing advanced composites modelling tools. As the automotive industry transitions towards the increased use of composites in mass-produced vehicles, similar challenges in the modelling of composites will need to be addressed, particularly in the reliable prediction of crashworthiness. While thermoset composites have dominated the aerospace industry, thermoplastic composites are likely to emerge as the preferred solution for meeting the high-volume production demands of passenger road vehicles. This keynote presentation will outline recent progress and current challenges in the development of finite-element-based predictive modelling tools for capturing impact damage, residual strength and energy absorption capacity of thermoset and thermoplastic composites for crashworthiness assessments.

Abstract:

Cryptographic algorithms are designed to be computationally secure; however, it has been shown that when they are implemented in hardware, these devices leak side channel information that can be used to mount an attack that recovers the secret encryption key. In this paper an overlapping-window power spectral density (PSD) side channel attack, targeting an FPGA device running the Advanced Encryption Standard, is proposed. This improves upon previous research into PSD attacks by reducing the amount of pre-processing (effort) required. It is shown that the proposed overlapping-window method requires less processing effort than a sliding-window approach, whilst overcoming the issue of sampling boundaries. The method is shown to be effective for both aligned and misaligned data sets and is therefore recommended as an improvement over existing time-domain correlation attacks.
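The two stages can be pictured as in the sketch below, with Welch's method standing in for the overlapping-window PSD step and a standard Pearson-correlation attack against a Hamming-weight leakage model. The placeholder S-box, window length, and leakage model are illustrative assumptions, not the paper's parameters.

```python
# Schematic of the attack: overlapping-window PSDs per trace (Welch's
# method, 50% overlap by default) correlated against a Hamming-weight
# leakage model. S-box, window length and model are illustrative only.
import numpy as np
from scipy.signal import welch

HW = np.array([bin(v).count("1") for v in range(256)])
SBOX = np.arange(256)           # placeholder; a real attack uses the AES S-box

def psd_features(traces, nperseg=256):
    """PSD of each trace row via overlapping windows."""
    _, pxx = welch(traces, nperseg=nperseg, axis=1)
    return pxx

def correlation_attack(traces, plaintexts):
    """Return the key byte whose predicted leakage best correlates with
    some frequency bin of the per-trace PSDs."""
    feats = psd_features(traces)
    feats = feats - feats.mean(axis=0)
    scores = np.zeros(256)
    for key in range(256):
        model = HW[SBOX[plaintexts ^ key]].astype(float)
        model -= model.mean()
        corr = (model @ feats) / (np.linalg.norm(model)
                                  * np.linalg.norm(feats, axis=0) + 1e-12)
        scores[key] = np.abs(corr).max()    # best-matching frequency bin
    return int(scores.argmax())
```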

Abstract:

Side channel attacks permit the recovery of the secret key held within a cryptographic device. This paper presents a new EM attack in the frequency domain, using a power spectral density analysis that permits the use of variable spectral window widths for each trace of the data set, and demonstrates how this attack can therefore overcome both inter- and intra-round random-insertion countermeasures. We also propose a novel re-alignment method exploiting the minimal power markers exhibited by electromagnetic emanations. The technique can be used for the extraction and re-alignment of round data in the time domain.
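One way to realise the re-alignment idea is sketched below: locate a minimal-power marker in each smoothed trace and shift every trace so the markers coincide with a reference. The smoothing length and the plain argmin search are illustrative assumptions rather than the paper's exact procedure.

```python
# Sketch of marker-based re-alignment: find a minimal-power marker in each
# smoothed trace and circularly shift traces so the markers line up with
# the first trace (smoothing length and argmin search are assumptions).
import numpy as np

def align_on_min_power(traces, smooth=32):
    kernel = np.ones(smooth) / smooth
    markers = [int(np.argmin(np.convolve(np.abs(t), kernel, mode="valid")))
               for t in traces]
    ref = markers[0]
    return np.stack([np.roll(t, ref - m) for t, m in zip(traces, markers)])
```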