993 results for Burr Conspiracy, 1805-1807.
Abstract:
The purpose is to study the diagnostic performance of optical coherence tomography (OCT) and alternative diagnostic tests for neovascular age-related macular degeneration (nAMD). Methods employed are as follows: systematic review and meta-analysis; index test: OCT, including time-domain (TD-OCT) and the most recently developed spectral-domain (SD-OCT); comparator tests: visual acuity, clinical evaluation (slit lamp), Amsler chart, colour fundus photographs, infra-red reflectance, red-free images/blue reflectance, fundus autofluorescence imaging (FAF), indocyanine green angiography (ICGA), preferential hyperacuity perimetry (PHP), and microperimetry; reference standard: fundus fluorescein angiography. Databases searched included MEDLINE, MEDLINE In-Process, EMBASE, Biosis, SCI, the Cochrane Library, DARE, MEDION, and the HTA database. Last literature searches: March 2013. Risk of bias was assessed using QUADAS-2. Meta-analysis models were fitted using hierarchical summary receiver operating characteristic (HSROC) curves. Twenty-two studies (2 abstracts and 20 articles) enrolling 2124 participants were identified, reporting TD-OCT (12 studies), SD-OCT (1 study), ICGA (8 studies), PHP (3 studies), and Amsler grid, colour fundus photography and FAF (1 study each). Most studies were considered to have a high risk of bias in the patient selection (55%, 11/20) and flow and timing (40%, 8/20) domains. In a meta-analysis of TD-OCT studies, sensitivity and specificity (95% CI) were 88% (46–98%) and 78% (64–88%), respectively. There was insufficient information to undertake meta-analysis for other tests. TD-OCT is a sensitive test for detecting nAMD, although specificity was only moderate. Data on SD-OCT are sparse. Diagnosis of nAMD should not rely solely on OCT.
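The review above pooled study-level accuracy with a hierarchical summary ROC (HSROC) model. As a rough illustration of what pooling sensitivity (or specificity) across studies involves, the sketch below implements a much simpler univariate random-effects pooling on the logit scale; this is a simplification of the HSROC approach, not the review's method, and the study counts are hypothetical placeholders rather than the review's data.

```python
import numpy as np

def pooled_logit(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale.

    A simplified stand-in for the HSROC model: each study's proportion (e.g.
    sensitivity) is logit-transformed, weighted by inverse variance, and a
    between-study variance (tau^2) is estimated by the method of moments.
    """
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = (events + 0.5) / (totals + 1.0)                    # continuity correction
    y = np.log(p / (1 - p))                                # logit proportion per study
    v = 1 / (events + 0.5) + 1 / (totals - events + 0.5)   # approximate variance of the logit
    w = 1 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q, df = np.sum(w * (y - y_fixed) ** 2), len(y) - 1     # Cochran's Q
    tau2 = max(0.0, float((q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w))))
    w_re = 1 / (v + tau2)                                  # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    expit = lambda x: 1 / (1 + np.exp(-x))
    return expit(y_re), (expit(y_re - 1.96 * se), expit(y_re + 1.96 * se))

# Hypothetical example: true positives / diseased participants in four TD-OCT studies
sens, ci = pooled_logit(events=[40, 55, 28, 90], totals=[45, 60, 35, 100])
print(f"pooled sensitivity {sens:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```

The full HSROC/bivariate model additionally captures the correlation between sensitivity and specificity and threshold effects across studies, which this univariate simplification ignores.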
Abstract:
Topic: To compare the accuracy of optical coherence tomography (OCT) with alternative tests for monitoring neovascular age-related macular degeneration (nAMD) and detecting disease activity among eyes previously treated for this condition.
Clinical Relevance: Traditionally, fundus fluorescein angiography (FFA) has been considered the reference standard to detect nAMD activity, but FFA is costly and invasive. Replacement of FFA by OCT can be justified if there is substantial agreement between the tests.
Methods: Systematic review and meta-analysis. The index test was OCT. The comparator tests were visual acuity, clinical evaluation (slit lamp), Amsler chart, color fundus photographs, infrared reflectance, red-free images and blue reflectance, fundus autofluorescence imaging, indocyanine green angiography (ICGA), preferential hyperacuity perimetry, and microperimetry. We searched the following databases: MEDLINE, MEDLINE In-Process, EMBASE, Biosis, Science Citation Index, the Cochrane Library, Database of Abstracts of Reviews of Effects, MEDION, and the Health Technology Assessment database. The last literature search was conducted in March 2013. We used the Quality Assessment of Diagnostic Accuracy Studies 2 (QUADAS-2) to assess risk of bias.
Results: We included 8 studies involving more than 400 participants. Seven reported the performance of OCT (3 time-domain [TD] OCT, 3 spectral-domain [SD] OCT, 1 both types) and 1 reported the performance of ICGA in the detection of nAMD activity. We did not find studies directly comparing tests in the same population. The pooled sensitivity and specificity of TD OCT and SD OCT for detecting active nAMD were 85% (95% confidence interval [CI], 72%–93%) and 48% (95% CI, 30%–67%), respectively. One study reported ICGA with sensitivity of 75.9% and specificity of 88.0% for the detection of active nAMD. Half of the studies were considered to have a high risk of bias.
Conclusions: There is substantial disagreement between OCT and FFA findings in detecting active disease in patients with nAMD who are being monitored. Both methods may be needed to monitor patients with nAMD comprehensively.
Abstract:
BACKGROUND: Age-related macular degeneration is the most common cause of sight impairment in the UK. In neovascular age-related macular degeneration (nAMD), vision worsens rapidly (over weeks) as abnormal blood vessels develop and leak fluid and blood at the macula.
OBJECTIVES: To determine the optimal role of optical coherence tomography (OCT) in diagnosing people newly presenting with suspected nAMD and monitoring those previously diagnosed with the disease.
DATA SOURCES: Databases searched: MEDLINE (1946 to March 2013), MEDLINE In-Process & Other Non-Indexed Citations (March 2013), EMBASE (1988 to March 2013), Biosciences Information Service (1995 to March 2013), Science Citation Index (1995 to March 2013), The Cochrane Library (Issue 2 2013), Database of Abstracts of Reviews of Effects (inception to March 2013), Medion (inception to March 2013), Health Technology Assessment database (inception to March 2013).
REVIEW METHODS: Types of studies: direct/indirect studies reporting diagnostic outcomes.
INDEX TEST: time domain optical coherence tomography (TD-OCT) or spectral domain optical coherence tomography (SD-OCT).
COMPARATORS: clinical evaluation, visual acuity, Amsler grid, colour fundus photographs, infrared reflectance, red-free images/blue reflectance, fundus autofluorescence imaging, indocyanine green angiography, preferential hyperacuity perimetry, microperimetry. Reference standard: fundus fluorescein angiography (FFA). Risk of bias was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool, version 2 (QUADAS-2). Meta-analysis models were fitted using hierarchical summary receiver operating characteristic curves. A Markov model was developed (65-year-old cohort, nAMD prevalence 70%), with nine strategies for diagnosis and/or monitoring, and a cost-utility analysis was conducted. An NHS and Personal Social Services perspective was adopted. Costs (2011/12 prices) and quality-adjusted life-years (QALYs) were discounted (3.5%). Deterministic and probabilistic sensitivity analyses were performed.
RESULTS: In pooled estimates of diagnostic studies (all TD-OCT), sensitivity and specificity [95% confidence interval (CI)] were 88% (46% to 98%) and 78% (64% to 88%), respectively. For monitoring, the pooled sensitivity and specificity (95% CI) were 85% (72% to 93%) and 48% (30% to 67%), respectively. The FFA for diagnosis and nurse-technician-led monitoring strategy had the lowest cost (£39,769; QALYs 10.473) and dominated all others except FFA for diagnosis and ophthalmologist-led monitoring (£44,649; QALYs 10.575; incremental cost-effectiveness ratio £47,768). The least costly strategy had a 46.4% probability of being cost-effective at a £30,000 willingness-to-pay threshold.
LIMITATIONS: Very few studies provided sufficient information for inclusion in meta-analyses. Only a few studies reported other tests; for some tests no studies were identified. The modelling was hampered by a lack of data on the diagnostic accuracy of strategies involving several tests.
CONCLUSIONS: Based on a small body of evidence of variable quality, OCT had high sensitivity and moderate specificity for diagnosis, and relatively high sensitivity but low specificity for monitoring. Strategies involving OCT alone for diagnosis and/or monitoring were unlikely to be cost-effective. Further research is required on (i) the performance of SD-OCT compared with FFA, especially for monitoring but also for diagnosis; (ii) the performance of strategies involving combinations/sequences of tests, for diagnosis and monitoring; (iii) the likelihood of active and inactive nAMD becoming inactive or active respectively; and (iv) assessment of treatment-associated utility weights (e.g. decrements), through a preference-based study.
STUDY REGISTRATION: This study is registered as PROSPERO CRD42012001930.
FUNDING: The National Institute for Health Research Health Technology Assessment programme.
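The cost-effectiveness figures in the RESULTS section above follow standard health-economics arithmetic: costs and QALYs are discounted at 3.5% per year, and the incremental cost-effectiveness ratio (ICER) is the extra cost of a strategy divided by the extra QALYs it yields. The sketch below reproduces that arithmetic; the yearly cost stream is a hypothetical placeholder, and plugging in the rounded totals reported above gives an ICER close to, but not exactly matching, the published £47,768 because of rounding.

```python
def discounted_total(yearly_values, rate=0.035):
    """Discount a stream of yearly costs or QALYs to present value (year 0 undiscounted)."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(yearly_values))

def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy B versus strategy A."""
    return (cost_b - cost_a) / (qaly_b - qaly_a)

# Illustrative discounting of a hypothetical 3-year stream of monitoring costs
print(round(discounted_total([1000, 1000, 1000]), 2))   # ~2899.69 rather than 3000

# Using the rounded totals reported in the abstract:
# FFA diagnosis + nurse-technician-led monitoring vs FFA diagnosis + ophthalmologist-led monitoring
print(round(icer(39769, 10.473, 44649, 10.575)))        # ~47843, close to the reported £47,768
```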
Abstract:
Purpose: To determine the optimal role of OCT in diagnosing and monitoring nAMD (detecting disease activity and the need for further anti-VEGF treatment).
Methods: Systematic review. Major electronic databases and websites were searched. Studies were included if they reported the diagnostic performance of time domain or spectral domain OCT (or selected other tests) against a reference standard of ophthalmologist-interpreted fluorescein angiography in people with newly suspected or previously diagnosed nAMD. Risk of bias was assessed by two independent investigators using QUADAS-2. Summary receiver operating characteristic (SROC) curves were produced for each test given sufficient data.
Results: 3700 titles/abstracts were screened, and 120 (3.2%) were selected for full-text assessment. A total of 22 studies were included (17 on diagnosis, 7 on monitoring, and 3 on both). From 15 studies reporting OCT data, sensitivity and specificity ranged from 59% to 100% and 27% to 100%, respectively.
Conclusions: The reported diagnostic performance of OCT showed large variability. The methodological quality of most studies was sub-optimal.
Abstract:
BACKGROUND: Glaucoma is a leading cause of avoidable blindness worldwide. Open angle glaucoma is the most common type of glaucoma. No randomised controlled trials have been conducted evaluating the effectiveness of glaucoma screening for reducing sight loss, and it is unclear which intervention would be most appropriate to evaluate in a glaucoma screening trial. The purpose of this study was to develop the clinical components of an intervention for evaluation in a glaucoma (open angle) screening trial that would be feasible and acceptable in a UK eye-care service.
METHODS: A mixed-methods study, based on the Medical Research Council (MRC) framework for complex interventions, integrating qualitative (semi-structured interviews with 46 UK eye-care providers, policy makers and health service commissioners) and quantitative (economic modelling) methods. Interview data were synthesised and used to revise the screening interventions compared within an existing economic model.
RESULTS: The qualitative data indicated broad-based support for a glaucoma screening trial to take place in primary care, using ophthalmic-trained technical assistants supported by optometry input. The precise location should be tailored to local circumstances. There was variability in opinion around the choice of screening test and target population. Integrating the interview findings with cost-effectiveness criteria reduced 189 potential components to a two-test intervention including either optic nerve photography or screening-mode perimetry (a measure of visual field sensitivity), with or without tonometry (a measure of intraocular pressure). It would be more cost-effective, and thus acceptable in a policy context, to target screening for open angle glaucoma at those at highest risk, but on both practicality and equity grounds the optimal strategy was to screen a general population cohort beginning at age 40.
CONCLUSIONS: Interventions for screening for open angle glaucoma that would be feasible from a service delivery perspective were identified. Integration within an economic modelling framework explicitly highlighted the trade-off between cost-effectiveness, feasibility and equity. This study exemplifies the MRC recommendation to integrate qualitative and quantitative methods in developing complex interventions. The next step in the development pathway should encompass the views of service users.
Abstract:
This essay is premised on the following: a conspiracy to fix or otherwise manipulate the outcome of a sporting event for profit. That conspiracy is in turn predicated on the conspirators’ capacity to: (a) ensure that the fix takes place as pre-determined; (b) manipulate the betting markets that surround the sporting event in question; and (c) collect their winnings undetected by either the betting industry’s security systems or the attention of any national regulatory body or law enforcement agency.
Unlike many essays on this topic, this contribution does not focus on the “fix” – part (a) of the above equation. It does not seek to explain how or why a participant or sports official might facilitate a betting scam, either through on-field behaviour that manipulates the outcome of a game or by presenting others with privileged inside information in advance of a game. Neither does this contribution seek to give any real insight into the second part of the above equation: how such conspirators manipulate a sports betting market by playing or laying the handicap, in-play or other offered betting odds. In fact, this contribution is not really about the mechanics of sports betting or match fixing at all; rather, it is about the sometimes under-explained reason why match fixing has reportedly become increasingly attractive of late to international crime syndicates. That reason relates to the fact that, given the traditional liquidity of gambling markets, sports betting can be, and has long been, an attractively accessible conduit for criminal syndicates to launder the proceeds of crime. Accordingly, the term “winnings”, noted in part (c) of the above equation, takes on an altogether more nefarious meaning.
This essay’s attempt to review the possible links between match fixing in sport, gambling-related “winnings” and money laundering is presented in four parts.
First, some context will be given to what is meant by money laundering, how it is currently policed internationally and, most importantly, how the growth of online gambling presents a unique set of vulnerabilities and opportunities to launder the proceeds of crime. The globalisation of organised crime, sports betting and transnational financial services now means that money laundering opportunities have moved well beyond a flutter on the horses at your local racetrack or at the roulette table of your nearest casino. The growth of online gambling platforms means that, at a click, the proceeds of crime in one jurisdiction can be placed on a betting market in another jurisdiction, with the winnings drawn down and laundered in a third. This internationalisation of gambling-related money laundering threatens the integrity of sport globally.
Second, referring back to the infamous hearings of the US Senate Special Committee to Investigate Organised Crime in Interstate Commerce of the early 1950s (“the Kefauver Committee”), this essay will illustrate the long-standing interest of organised crime gangs – in this instance, various Mafia families in the United States – in money laundering via sports gambling-related means.
Third, and using the seminal 2009 report “Money Laundering through the Football Sector” by the Financial Action Task Force (FATF, an inter-governmental body established in 1989 to promote effective implementation of legal, regulatory and operational measures for combating money laundering, terrorist financing and other related threats to the integrity of the international financial system), this essay seeks to assess the vulnerabilities of international sport to match fixing, as motivated in part by the associated secondary criminality of tax evasion and transnational economic crime.
The fourth and concluding part of the essay turns from problems to possible solutions. The underlying premise here is that, heretofore, there has been an insularity to the way that sports organisations have both conceptualised and sought to address the match fixing threat: if we (in sport) initiate player education programmes, establish integrity units, and strictly enforce codes of conduct and sanctions, then our integrity or brand should be protected. This essay argues that, although these initiatives are important, the source and process of match fixing are beyond sport’s current capacity, as are the possible solutions.
Abstract:
We present a pilot study that uses the radiocarbon (∆14C) method to determine the source of carbon buried in the surface sediment of Lough Erne, a humic, alkaline lake in northwest Ireland. ∆14C, δ13C and δ15N values were measured from phytoplankton, dissolved inorganic carbon, dissolved organic carbon and particulate organic carbon. A novel radiocarbon method, stepped combustion [1], was used to estimate the degree of burial of terrestrial carbon in surface sediment. The ∆14C values of the low-temperature fractions were comparable to algal ∆14C, while the high-temperature fractions were 14C-depleted (older than bulk sediment). The ∆14C end-member model indicated that ~64% of carbon in surface sediment was derived from detrital terrestrial carbon. The use of ∆14C in conjunction with stepped combustion allows the quantification of the pathways of terrestrial carbon in the system, which has implications for regional and global carbon burial.
[1] McGeehin, J., Burr, G.S., Jull, A.J.T., Reines, D., Gosse, J., Davis, P.T., Muhs, D., and Southon, J.R., 2001, Stepped-combustion C-14 dating of sediment: A comparison with established techniques: Radiocarbon, v. 43, p. 255-261.
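The ~64% terrestrial contribution quoted above is the kind of figure produced by a two end-member isotope mixing model, in which the bulk sediment ∆14C is expressed as a weighted average of an algal (aquatic) end-member and a 14C-depleted terrestrial end-member. The sketch below shows that calculation; the ∆14C values used are hypothetical placeholders, not the study's measurements.

```python
def terrestrial_fraction(d14c_sediment, d14c_algal, d14c_terrestrial):
    """Two end-member mixing: fraction of sediment carbon from the terrestrial source.

    Solves d14c_sediment = f * d14c_terrestrial + (1 - f) * d14c_algal for f.
    """
    return (d14c_sediment - d14c_algal) / (d14c_terrestrial - d14c_algal)

# Hypothetical per-mil values: modern algal carbon vs old (14C-depleted) terrestrial carbon
f_terr = terrestrial_fraction(d14c_sediment=-160.0,
                              d14c_algal=30.0,
                              d14c_terrestrial=-270.0)
print(f"terrestrial fraction ~ {f_terr:.2f}")   # ~0.63 with these placeholder values
```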
Abstract:
Background: Long working hours might increase the risk of cardiovascular disease, but prospective evidence is scarce, imprecise, and mostly limited to coronary heart disease. We aimed to assess long working hours as a risk factor for incident coronary heart disease and stroke.
Methods: We identified published studies through a systematic review of PubMed and Embase from inception to Aug 20, 2014. We obtained unpublished data for 20 cohort studies from the Individual-Participant-Data Meta-analysis in Working Populations (IPD-Work) Consortium and open-access data archives. We used cumulative random-effects meta-analysis to combine effect estimates from published and unpublished data.
Findings: We included 25 studies from 24 cohorts in Europe, the USA, and Australia. The meta-analysis of coronary heart disease comprised data for 603 838 men and women who were free from coronary heart disease at baseline; the meta-analysis of stroke comprised data for 528 908 men and women who were free from stroke at baseline. Follow-up for coronary heart disease was 5·1 million person-years (mean 8·5 years), in which 4768 events were recorded, and for stroke was 3·8 million person-years (mean 7·2 years), in which 1722 events were recorded. In cumulative meta-analysis adjusted for age, sex, and socioeconomic status, compared with standard hours (35-40 h per week), working long hours (≥55 h per week) was associated with an increase in risk of incident coronary heart disease (relative risk [RR] 1·13, 95% CI 1·02-1·26; p=0·02) and incident stroke (1·33, 1·11-1·61; p=0·002). The excess risk of stroke remained unchanged in analyses that addressed reverse causation, multivariable adjustments for other risk factors, and different methods of stroke ascertainment (range of RR estimates 1·30-1·42). We recorded a dose-response association for stroke, with RR estimates of 1·10 (95% CI 0·94-1·28; p=0·24) for 41-48 working hours, 1·27 (1·03-1·56; p=0·03) for 49-54 working hours, and 1·33 (1·11-1·61; p=0·002) for 55 working hours or more per week compared with standard working hours (p for trend <0·0001).
Interpretation: Employees who work long hours have a higher risk of stroke than those working standard hours; the association with coronary heart disease is weaker. These findings suggest that more attention should be paid to the management of vascular risk factors in individuals who work long hours.
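The dose-response trend reported for stroke in the Findings can be illustrated with a simple weighted log-linear trend across the exposure categories, using the pooled relative risks quoted above. The sketch below assigns assumed midpoints to each working-hours category (including a placeholder value for the open-ended ≥55 h band) and fits log(RR) against hours by inverse-variance weighted least squares through the reference category; this is a simplification of the trend test actually used in the paper.

```python
import numpy as np

# Pooled stroke relative risks from the abstract, by weekly working-hours category.
# Category midpoints (and the value for the open-ended >=55 h band) are assumptions.
hours = np.array([37.5, 44.5, 51.5, 57.5])   # 35-40 h (reference), 41-48, 49-54, >=55
rr    = np.array([1.00, 1.10, 1.27, 1.33])
ci_lo = np.array([1.00, 0.94, 1.03, 1.11])
ci_hi = np.array([1.00, 1.28, 1.56, 1.61])

log_rr = np.log(rr)
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)   # SEs recovered from the 95% CIs

# Weighted no-intercept regression of log(RR) on hours above the reference midpoint,
# which forces the trend line through the reference category (RR = 1 at 35-40 h).
x = hours[1:] - hours[0]
y = log_rr[1:]
w = 1 / se[1:] ** 2
slope = np.sum(w * x * y) / np.sum(w * x ** 2)
print(f"estimated stroke RR per extra 10 h/week: {np.exp(10 * slope):.2f}x")
```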
Abstract:
With the rapid development of the internet-of-things (IoT), face scrambling has been proposed for privacy protection during IoT-targeted image/video distribution. Consequently, in these IoT applications, biometric verification needs to be carried out in the scrambled domain, presenting significant challenges for face recognition. Since face models become chaotic signals after scrambling/encryption, a typical solution is to utilize traditional data-driven face recognition algorithms. However, chaotic pattern recognition remains a challenging task, and in this paper we propose a new ensemble approach – Many-Kernel Random Discriminant Analysis (MK-RDA) – to discover discriminative patterns from chaotic signals. We also incorporate a salience-aware strategy into the proposed ensemble method to handle chaotic facial patterns in the scrambled domain, where random selections of features are made on semantic components via salience modelling. In our experiments, the proposed MK-RDA was tested rigorously on three human face datasets: the ORL face dataset, the PIE face dataset and the PUBFIG wild face dataset. The experimental results demonstrate that the proposed scheme can effectively handle chaotic signals and significantly improve recognition accuracy, making our method a promising candidate for secure biometric verification in emerging IoT applications.
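MK-RDA itself is the paper's contribution and its kernel construction is not reproduced here; the sketch below only illustrates the general idea it builds on, namely a random-subspace discriminant ensemble in which each member is trained on a random subset of (scrambled) features, optionally biased by a salience weighting, and predictions are combined by majority vote. The dataset, feature dimensions and salience vector are all placeholders, not the authors' setup.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

class RandomSubspaceLDA:
    """Random-subspace ensemble of LDA classifiers (a generic stand-in for MK-RDA).

    Each member sees a random subset of features; an optional salience vector
    biases the sampling towards features deemed more informative.
    """
    def __init__(self, n_members=20, subspace_size=64, salience=None, seed=0):
        self.n_members, self.subspace_size = n_members, subspace_size
        self.salience, self.rng = salience, np.random.default_rng(seed)
        self.members = []

    def fit(self, X, y):
        n_features = X.shape[1]
        p = None
        if self.salience is not None:
            p = self.salience / self.salience.sum()   # salience-aware sampling weights
        for _ in range(self.n_members):
            idx = self.rng.choice(n_features, self.subspace_size, replace=False, p=p)
            clf = LinearDiscriminantAnalysis().fit(X[:, idx], y)
            self.members.append((idx, clf))
        return self

    def predict(self, X):
        votes = np.stack([clf.predict(X[:, idx]) for idx, clf in self.members])
        # majority vote across ensemble members, per test sample
        return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

# Toy usage with random vectors standing in for scrambled face images
rng = np.random.default_rng(1)
X_train, y_train = rng.normal(size=(200, 1024)), rng.integers(0, 10, size=200)
X_test = rng.normal(size=(20, 1024))
model = RandomSubspaceLDA(n_members=10).fit(X_train, y_train)
print(model.predict(X_test)[:5])
```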
Abstract:
OBJECTIVE: To assess the efficiency of alternative monitoring services for people with ocular hypertension (OHT), a glaucoma risk factor.
DESIGN: Discrete event simulation model comparing five alternative care pathways: treatment at OHT diagnosis with minimal monitoring; biennial monitoring (primary and secondary care) with treatment if baseline predicted 5-year glaucoma risk is ≥6%; monitoring and treatment aligned to National Institute for Health and Care Excellence (NICE) glaucoma guidance (conservative and intensive).
SETTING: UK health services perspective.
PARTICIPANTS: Simulated cohort of 10 000 adults with OHT (mean intraocular pressure (IOP) 24.9 mm Hg, SD 2.4).
MAIN OUTCOME MEASURES: Costs, glaucoma detected, quality-adjusted life years (QALYs).
RESULTS: Treating at diagnosis was the least costly and least effective in avoiding glaucoma and progression. Intensive monitoring following NICE guidance was the most costly and effective. However, considering a wider cost-utility perspective, biennial monitoring was less costly and provided more QALYs than NICE pathways, but was unlikely to be cost-effective compared with treating at diagnosis (£86 717 per additional QALY gained). The findings were robust to risk thresholds for initiating monitoring but were sensitive to treatment threshold, National Health Service costs and treatment adherence.
CONCLUSIONS: For confirmed OHT, glaucoma monitoring more frequently than every 2 years is unlikely to be efficient. Primary treatment with minimal monitoring (assessing treatment responsiveness via IOP) could be considered; however, further data are needed to refine glaucoma risk prediction models and to value patient preferences for treatment. Consideration should be given to innovative and affordable service redesign focused on treatment responsiveness rather than more glaucoma testing.
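The comparison in the abstract above rests on simulating individual OHT patients through alternative monitoring pathways and accumulating costs and QALYs. The sketch below is a heavily simplified, annual-cycle individual-level simulation of two pathways (treat at diagnosis with minimal monitoring vs biennial monitoring with risk-triggered treatment); all probabilities, costs and utilities are invented placeholders, not the model's inputs, and the published model was a discrete event simulation with far richer structure.

```python
import random

def simulate_patient(treat_at_diagnosis, monitor_every, treat_prob, years=20, rng=random):
    """Simulate one OHT patient over annual cycles (a gross simplification).

    Placeholder assumptions: 2% annual risk of conversion to glaucoma (halved on
    treatment), £50 per monitoring visit, £100/year treatment cost, utility 0.85
    without and 0.80 with glaucoma, 3.5% annual discounting of QALYs.
    """
    treated, glaucoma, cost, qalys = treat_at_diagnosis, False, 0.0, 0.0
    for year in range(years):
        if monitor_every and year > 0 and year % monitor_every == 0:
            cost += 50.0                              # monitoring visit
            if not treated and rng.random() < treat_prob:
                treated = True                        # predicted risk exceeds threshold -> treat
        if not glaucoma and rng.random() < 0.02 * (0.5 if treated else 1.0):
            glaucoma = True
        cost += 100.0 if treated else 0.0
        qalys += (0.80 if glaucoma else 0.85) / 1.035 ** year
    return cost, qalys

def mean_cost_qalys(strategy, n=10_000, seed=42):
    rng = random.Random(seed)
    results = [simulate_patient(rng=rng, **strategy) for _ in range(n)]
    costs, qalys = zip(*results)
    return sum(costs) / n, sum(qalys) / n

print("treat at diagnosis, minimal monitoring:",
      mean_cost_qalys({"treat_at_diagnosis": True, "monitor_every": 0, "treat_prob": 0.0}))
print("biennial monitoring, risk-based treatment:",
      mean_cost_qalys({"treat_at_diagnosis": False, "monitor_every": 2, "treat_prob": 0.3}))
```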
Abstract:
The writings of Johann Georg Tromlitz (1725-1805) are an invaluable source of information and inspiration for today's performer and scholar. Foremost among them is the Ausführlicher und gründlicher Unterricht die Flöte zu spielen [Detailed and thorough method for playing the flute], 1791, clearly inspired by Quantz's treatise of 1752 but reflecting the author's perceived need to bring it up to date. Tromlitz's treatise was designed so that the reader could become a virtuoso without recourse to a teacher, so the information is presented in great detail in every respect. Unfortunately, this treatise is not well known among Spanish-speaking readers owing to the lack of translations, so one of the main aims of this thesis was to offer a first version for a Spanish-speaking audience. To situate it in its European context, a comparison was made with the most important flute methods of the last decade of the century, considering several fundamental aspects of performance: sound, breathing, articulation and ornamentation. The conclusions of this comparative work generated the guidelines that informed the author's interpretation of a repertoire representative of this historical period; a CD recording of the excerpts that served as examples of the application of the research findings is provided.