995 results for Landmark analysis
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, little comparison exists between the Hilbert curve and other dimensionality reduction techniques. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. The Hilbert curve, Sammon's mapping and Principal Component Analysis have been used to generate a one-dimensional space with locality-preserving properties. This work provides empirical evidence to support the use of the Hilbert curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2D network model and with a realistic network topology model with the typical power-law distribution of node connectivity found in the Internet. Nearest-neighbour analysis confirms the Hilbert curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results for the realistic network model show that there is scope for improvement and that better techniques to preserve locality information are required.
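As a rough illustration of the landmark-vector-to-identifier idea (not the implementation evaluated in the abstract above), the sketch below quantises a two-dimensional latency vector onto a grid and maps it to a Hilbert-curve index. The function and parameter names (hilbert_index, latency_vector_to_id, max_latency_ms) are hypothetical; real landmark sets are higher-dimensional and would need a d-dimensional Hilbert mapping.

```python
def hilbert_index(order, x, y):
    """Map integer grid coordinates (x, y) in [0, 2**order) to a 1-D Hilbert index."""
    n = 1 << order
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate/reflect the quadrant so the next level of recursion stays consistent.
        if ry == 0:
            if rx == 1:
                x = n - 1 - x
                y = n - 1 - y
            x, y = y, x
        s //= 2
    return d


def latency_vector_to_id(latencies_ms, order=8, max_latency_ms=500.0):
    """Quantise a 2-D landmark latency vector and map it to a Hilbert index.

    Nearby latency vectors tend to receive nearby indices, which is the
    locality-preserving property the study evaluates.
    """
    n = 1 << order
    coords = [min(int(l / max_latency_ms * (n - 1)), n - 1) for l in latencies_ms]
    return hilbert_index(order, coords[0], coords[1])


# Example: two peers with similar latencies to the landmarks typically get close indices.
print(latency_vector_to_id([35.0, 120.0]), latency_vector_to_id([38.0, 118.0]))
```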
Abstract:
Background: There is currently no early predictive marker of survival for patients receiving chemotherapy for malignant pleural mesothelioma (MPM). Tumour response may be predictive of overall survival (OS), though this has not been explored. We have therefore undertaken a combined analysis of OS, from a 42-day landmark, of 526 patients receiving systemic therapy for MPM. We also validate published progression-free survival rates (PFSRs) and a progression-free survival (PFS) prognostic-index model. Methods: Analyses included nine MPM clinical trials, incorporating six European Organisation for Research and Treatment of Cancer (EORTC) studies. OS from the landmark (day 42 post-treatment) was analysed according to tumour response. PFSR analysis data included six non-EORTC MPM clinical trials. Prognostic-index validation was performed on one non-EORTC data set with available survival data. Results: Median OS from the landmark was 12·8 months for patients with partial response (PR), 9·4 months for stable disease (SD) and 3·4 months for progressive disease (PD). Both PR and SD were associated with longer OS from the landmark compared with disease progression (both p < 0·0001). PFSRs for platinum-based combination therapies were consistent with published significant clinical activity ranges. Effective separation between PFS and OS curves provided a validation of the EORTC prognostic model, based on histology, stage and performance status. Conclusion: Response to chemotherapy is associated with significantly longer OS from the landmark in patients with MPM. © 2012 Elsevier Ltd. All rights reserved.
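To make the landmark method concrete, here is a minimal sketch of a day-42 landmark comparison of OS by response category, using the lifelines library. This is an illustration under assumptions, not the study's analysis code; the data-frame column names ('os_days', 'death', 'response') are hypothetical.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

LANDMARK_DAYS = 42  # landmark time used in the abstract


def landmark_os_by_response(df):
    """Landmark analysis of overall survival by response category at day 42.

    Expects columns (hypothetical names): 'os_days' (time from treatment start
    to death or censoring), 'death' (1 = died, 0 = censored), 'response'
    (PR / SD / PD assessed by the landmark).
    """
    # Condition on being alive and under follow-up at the landmark: drop
    # patients who died or were censored before day 42.
    at_risk = df[df["os_days"] > LANDMARK_DAYS].copy()
    # Reset the time origin to the landmark.
    at_risk["os_from_landmark"] = at_risk["os_days"] - LANDMARK_DAYS

    kmf = KaplanMeierFitter()
    for group, sub in at_risk.groupby("response"):
        kmf.fit(sub["os_from_landmark"], sub["death"], label=group)
        print(group, "median OS from landmark (days):", kmf.median_survival_time_)

    # Log-rank test across the response groups.
    result = multivariate_logrank_test(
        at_risk["os_from_landmark"], at_risk["response"], at_risk["death"]
    )
    print("log-rank p-value:", result.p_value)
```

Resetting the clock at the landmark avoids the guarantee-time bias that would arise from comparing responders and non-responders from the start of treatment.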
Abstract:
Background: The randomised phase 3 First-Line Erbitux in Lung Cancer (FLEX) study showed that the addition of cetuximab to cisplatin and vinorelbine significantly improved overall survival compared with chemotherapy alone in the first-line treatment of advanced non-small-cell lung cancer (NSCLC). The main cetuximab-related side-effect was acne-like rash. Here, we assessed the association of this acne-like rash with clinical benefit. Methods: We did a subgroup analysis of patients in the FLEX study, which enrolled patients with advanced NSCLC whose tumours expressed epidermal growth factor receptor. Our landmark analysis assessed whether the development of acne-like rash in the first 21 days of treatment (first-cycle rash) was associated with clinical outcome, on the basis of patients in the intention-to-treat population alive on day 21. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: 518 patients in the chemotherapy plus cetuximab group (290 of whom had first-cycle rash) and 540 patients in the chemotherapy alone group were alive on day 21. Patients in the chemotherapy plus cetuximab group with first-cycle rash had significantly prolonged overall survival compared with patients in the same treatment group without first-cycle rash (median 15·0 months [95% CI 12·8-16·4] vs 8·8 months [7·6-11·1]; hazard ratio [HR] 0·631 [0·515-0·774]; p<0·0001). Corresponding significant associations were also noted for progression-free survival (median 5·4 months [5·2-5·7] vs 4·3 months [4·1-5·3]; HR 0·741 [0·607-0·905]; p=0·0031) and response (rate 44·8% [39·0-50·8] vs 32·0% [26·0-38·5]; odds ratio 1·703 [1·186-2·448]; p=0·0039). Overall survival for patients without first-cycle rash was similar to that of patients who received chemotherapy alone (median 8·8 months [7·6-11·1] vs 10·3 months [9·6-11·3]; HR 1·085 [0·910-1·293]; p=0·36). The significant overall survival benefit for patients with first-cycle rash versus without was seen in all histology subgroups: adenocarcinoma (median 16·9 months [14·1-20·6] vs 9·3 months [7·7-13·2]; HR 0·614 [0·453-0·832]; p=0·0015), squamous-cell carcinoma (median 13·2 months [10·6-16·0] vs 8·1 months [6·7-12·6]; HR 0·659 [0·472-0·921]; p=0·014), and carcinomas of other histology (median 12·6 months [9·2-16·4] vs 6·9 months [5·2-11·0]; HR 0·616 [0·392-0·966]; p=0·033). Interpretation: First-cycle rash was associated with a better outcome in patients with advanced NSCLC who received cisplatin and vinorelbine plus cetuximab as a first-line treatment. First-cycle rash might be a surrogate clinical marker that could be used to tailor cetuximab treatment for advanced NSCLC to those patients who would be most likely to derive a significant benefit. Funding: Merck KGaA. © 2011 Elsevier Ltd.
Abstract:
BACKGROUND The safety and efficacy of new-generation drug-eluting stents (DES) in women with multiple atherothrombotic risk (ATR) factors are unclear. METHODS AND RESULTS We pooled patient-level data for women enrolled in 26 randomized trials. The study population was categorized based on the presence or absence of high ATR, which was defined as a history of diabetes mellitus, prior percutaneous or surgical coronary revascularization, or prior myocardial infarction. The primary end point was major adverse cardiovascular events, defined as a composite of all-cause mortality, myocardial infarction, or target lesion revascularization at 3 years of follow-up. Of 10 449 women included in the pooled database, 5333 (51%) were at high ATR. Compared with women not at high ATR, those at high ATR had a significantly higher risk of major adverse cardiovascular events (15.8% versus 10.6%; adjusted hazard ratio: 1.53; 95% confidence interval: 1.34-1.75; P=0.006) and all-cause mortality. In high-ATR women, the use of new-generation DES was associated with a significantly lower risk of 3-year major adverse cardiovascular events (adjusted hazard ratio: 0.69; 95% confidence interval: 0.52-0.92) compared with early-generation DES. The benefit of new-generation DES on major adverse cardiovascular events was uniform between high-ATR and non-high-ATR women, without evidence of interaction (P for interaction=0.14). In a landmark analysis of high-ATR women, stent thrombosis rates were comparable between DES generations in the first year, whereas between 1 and 3 years stent thrombosis risk was lower with new-generation devices. CONCLUSIONS Use of new-generation DES even in women at high ATR is associated with a benefit consistent over 3 years of follow-up and a substantial improvement in very-late thrombotic safety.
Abstract:
Chronic graft-versus-host disease (cGVHD), a common complication after stem cell transplant (SCT), has an impact on morbidity and survival. Previous classification of cGVHD has not been reproducible or prognostic for nonrelapse mortality (NRM). Recently the National Institutes of Health (NIH) consensus criteria were proposed, but the ability of this classification to predict outcome of the various subtypes of cGVHD is unknown. Patients (N = 110) undergoing an SCT for a hematologic malignancy and surviving until day 100 posttransplant from 2001 to 2003 were studied. The overall survival (OS) using a landmark analysis at day 100 was 44% versus 66% (no GVHD vs. GVHD, P = .026). The OS of patients with the various types of GVHD proposed by the NIH criteria was significantly different (P < .0001). In a univariate analysis, this was more apparent when patients with any acute features of GVHD were compared to those with classic cGVHD (3-year OS 46% vs. 68%, P = .033). The 3-year NRM for the entire cohort was 21%, and was not affected by the presence or absence of GVHD or by GVHD subtype. In a multivariable analysis, extensive cGVHD (hazard ratio [HR] 0.35, P = .015) and having any acute feature of GVHD after day 100 (HR 3.36, P = .0144) were significant independent predictors of survival. The OS with different NIH subtypes of GVHD after day 100 from SCT varies, and is superior for patients with classic cGVHD.
Abstract:
BACKGROUND Trastuzumab has established efficacy against breast cancer with overexpression or amplification of the HER2 oncogene. The standard of care is 1 year of adjuvant trastuzumab, but the optimum duration of treatment is unknown. We compared 2 years of treatment with trastuzumab with 1 year of treatment, and updated the comparison of 1 year of trastuzumab versus observation at a median follow-up of 8 years, for patients enrolled in the HERceptin Adjuvant (HERA) trial. METHODS The HERA trial is an international, multicentre, randomised, open-label, phase 3 trial comparing treatment with trastuzumab for 1 and 2 years with observation after standard neoadjuvant chemotherapy, adjuvant chemotherapy, or both in 5102 patients with HER2-positive early breast cancer. The primary endpoint was disease-free survival. The comparison of 2 years versus 1 year of trastuzumab treatment involved a landmark analysis of 3105 patients who were disease-free 12 months after randomisation to one of the trastuzumab groups, and was planned after observing at least 725 disease-free survival events. The updated intention-to-treat comparison of 1 year trastuzumab treatment versus observation alone in 3399 patients at a median follow-up of 8 years (range 0-10) is also reported. This study is registered with ClinicalTrials.gov, number NCT00045032. FINDINGS We recorded 367 events of disease-free survival in 1552 patients in the 1 year group and 367 events in 1553 patients in the 2 year group (hazard ratio [HR] 0·99, 95% CI 0·85-1·14, p=0·86). Grade 3-4 adverse events and decreases in left ventricular ejection fraction during treatment were reported more frequently in the 2 year treatment group than in the 1 year group (342 [20·4%] vs 275 [16·3%] grade 3-4 adverse events, and 120 [7·2%] vs 69 [4·1%] decreases in left ventricular ejection fraction, respectively). HRs for a comparison of 1 year of trastuzumab treatment versus observation were 0·76 (95% CI 0·67-0·86, p<0·0001) for disease-free survival and 0·76 (0·65-0·88, p=0·0005) for overall survival, despite crossover of 884 (52%) patients from the observation group to trastuzumab therapy. INTERPRETATION 2 years of adjuvant trastuzumab is not more effective than is 1 year of treatment for patients with HER2-positive early breast cancer. 1 year of treatment provides a significant disease-free and overall survival benefit compared with observation and remains the standard of care. FUNDING F Hoffmann-La Roche (Roche).
Abstract:
Background We hypothesized that in patients with stable coronary artery disease and stenosis, percutaneous coronary intervention (PCI) performed on the basis of the fractional flow reserve (FFR) would be superior to medical therapy. Methods In 1220 patients with stable coronary artery disease, we assessed the FFR in all stenoses that were visible on angiography. Patients who had at least one stenosis with an FFR of 0.80 or less were randomly assigned to undergo FFR-guided PCI plus medical therapy or to receive medical therapy alone. Patients in whom all stenoses had an FFR of more than 0.80 received medical therapy alone and were included in a registry. The primary end point was a composite of death from any cause, nonfatal myocardial infarction, or urgent revascularization within 2 years. Results The rate of the primary end point was significantly lower in the PCI group than in the medical-therapy group (8.1% vs. 19.5%; hazard ratio, 0.39; 95% confidence interval [CI], 0.26 to 0.57; P<0.001). This reduction was driven by a lower rate of urgent revascularization in the PCI group (4.0% vs. 16.3%; hazard ratio, 0.23; 95% CI, 0.14 to 0.38; P<0.001), with no significant between-group differences in the rates of death and myocardial infarction. Urgent revascularizations that were triggered by myocardial infarction or ischemic changes on electrocardiography were less frequent in the PCI group (3.4% vs. 7.0%, P=0.01). In a landmark analysis, the rate of death or myocardial infarction from 8 days to 2 years was lower in the PCI group than in the medical-therapy group (4.6% vs. 8.0%, P=0.04). Among registry patients, the rate of the primary end point was 9.0% at 2 years. Conclusions In patients with stable coronary artery disease, FFR-guided PCI, as compared with medical therapy alone, improved the outcome. Patients without ischemia had a favorable outcome with medical therapy alone. (Funded by St. Jude Medical; FAME 2 ClinicalTrials.gov number, NCT01132495.)
Abstract:
AIMS To investigate the outcomes of percutaneous coronary intervention (PCI) in bifurcation versus non-bifurcation lesions using the next-generation Resolute zotarolimus-eluting stent (R-ZES). METHODS AND RESULTS We analyzed 3-year pooled data from the RESOLUTE All-Comers trial and the RESOLUTE International registry. The R-ZES was used in 2772 non-bifurcation lesion patients and 703 bifurcation lesion patients, of whom 482 were treated with a simple-stent technique (1 stent used to treat the bifurcation lesion) and 221 with a complex bifurcation technique (2 or more stents used). The primary endpoint was 3-year target lesion failure (TLF, defined as the composite of death from cardiac causes, target vessel myocardial infarction, or clinically indicated target lesion revascularization [TLR]), and was 13.3% in bifurcation vs 11.3% in non-bifurcation lesion patients (adjusted P=.06). Landmark analysis revealed that this difference was driven by differences in the first 30 days between bifurcation vs non-bifurcation lesions (TLF, 6.6% vs 2.7%, respectively; adjusted P<.001), which included significant differences in each component of TLF and in stent thrombosis. Between 31 days and 3 years, TLF, its components, and stent thrombosis did not differ significantly between bifurcation lesions and non-bifurcation lesions (TLF, 7.7% vs 9.0%, respectively; adjusted P=.50). CONCLUSION The 3-year risk of TLF following PCI with R-ZES in bifurcation lesions was not significantly different from that in non-bifurcation lesions. However, there was an increased risk associated with bifurcation lesions during the first 30 days; beyond 30 days, bifurcation lesions and non-bifurcation lesions yielded similar 3-year outcomes.
Abstract:
Morphological differences among 6 species of marine fishes belonging to 2 subfamilies of the family Serranidae (Serraninae: Dules auriga, Diplectrum formosum, and D. radiale; Epinephelinae: Epinephelus marginatus, Mycteroperca acutirostris, and M. bonaci) were studied by the geometric morphometric method of thin-plate splines and multivariate analysis of partial-warp scores. The decomposition of shape variation into uniform and nonaffine components of shape change indicates that major differences among species are related to both components of shape variation. Significant differences were found among species with respect to the uniform components, but there is no clear separation of taxonomic groups related to these components, and species are instead separated on the basis of body height and caudal peduncle length. Non-uniform changes in body shape, in turn, clearly differentiate the species of Serraninae and Epinephelinae. These shape changes are probably related to differences in habitat and feeding habits among the species.
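For readers unfamiliar with this machinery, the following is a minimal numpy sketch of the thin-plate spline decomposition referred to above: it builds the bending-energy matrix of a reference (consensus) configuration, extracts the principal warps, and projects a specimen's deviation onto them to obtain partial-warp scores. It is not the software used in the study; kernel and scaling conventions vary between packages, and the uniform (affine) component is omitted here.

```python
import numpy as np


def bending_energy_matrix(ref):
    """Bending-energy matrix of a reference landmark configuration.

    ref: (k, 2) array of 2-D landmark coordinates. Uses the common TPS kernel
    U(r) = r^2 * log(r^2); conventions differ slightly between packages.
    """
    k = ref.shape[0]
    diff = ref[:, None, :] - ref[None, :, :]
    r2 = np.sum(diff ** 2, axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(r2 > 0, r2 * np.log(r2), 0.0)
    Q = np.hstack([np.ones((k, 1)), ref])   # affine part: columns 1, x, y
    L = np.zeros((k + 3, k + 3))
    L[:k, :k] = K
    L[:k, k:] = Q
    L[k:, :k] = Q.T
    Linv = np.linalg.inv(L)
    return Linv[:k, :k]                      # upper-left block = bending-energy matrix


def partial_warp_scores(ref, spec):
    """Project a specimen's deviation from the reference onto the principal warps.

    spec: (k, 2) configuration already superimposed (aligned) on ref.
    Scaling of the scores by eigenvalue powers (the alpha parameter) varies
    by convention and is omitted.
    """
    B = bending_energy_matrix(ref)
    eigval, eigvec = np.linalg.eigh(B)
    nonuniform = eigval > 1e-10      # drop the three (near-)zero eigenvalues (affine terms)
    warps = eigvec[:, nonuniform]    # principal warps of the reference
    dev = spec - ref                 # shape deviation from the consensus
    return (warps.T @ dev).ravel()   # partial-warp scores (x and y components)
```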
Abstract:
Two objects with homologous landmarks are said to be of the same shape if the configuration of landmarks of one object can be exactly matched with that of the other by translation, rotation/reflection, and scaling. The observations on an object are the coordinates of its landmarks with reference to a set of orthogonal coordinate axes in a space of appropriate dimension. The origin, choice of units, and orientation of the coordinate axes with respect to an object may differ from object to object. In such a case, how do we quantify the shape of an object, find the mean and variation of shape in a population of objects, compare the mean shapes in two or more different populations, and discriminate between objects belonging to two or more different shape distributions? We develop some methods that are invariant to translation, rotation, and scaling of the observations on each object and thereby provide generalizations of multivariate methods for shape analysis.
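The invariance requirement described above can be illustrated with an ordinary Procrustes superimposition: after removing translation, scale and rotation/reflection, two configurations of the same shape coincide. The sketch below uses scipy.spatial.procrustes on hypothetical landmark data; it illustrates the idea only and is not the method developed in the paper.

```python
import numpy as np
from scipy.spatial import procrustes

# Two recordings of the same five homologous landmarks, taken in different
# coordinate systems (hypothetical data).
object_a = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.0], [1.0, 2.0], [0.0, 1.0]])

# object_b: the same shape, scaled by 3, rotated by 30 degrees and translated.
theta = np.deg2rad(30.0)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
object_b = 3.0 * object_a @ rot.T + np.array([10.0, -4.0])

# Procrustes superimposition removes translation, scale and rotation/reflection,
# then reports the residual squared difference between the standardised shapes.
std_a, std_b_fitted, disparity = procrustes(object_a, object_b)
print(f"Procrustes disparity: {disparity:.2e}")   # ~0: the two objects have the same shape
```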
Abstract:
As a strategy to identify child sexual abuse, most Australian States and Territories have enacted legislation requiring teachers to report suspected cases. Some Australian State and non-State educational authorities have also created policy-based obligations to report suspected child sexual abuse. Significantly, these can be wider than non-existent or limited legislative duties, and therefore are a crucial element of the effort to identify sexual abuse. Yet, no research has explored the existence and nature of these policy-based duties. The first purpose of this paper is to report the results of a three-State study into policy-based reporting duties in State and non-State schools in Australia. In an extraordinary coincidence, while conducting the study, a case of failure to comply with reporting policy occurred with tragic consequences. This led to a rare example in Australia (and one of only a few worldwide) of a professional being prosecuted for failure to comply with a legislative duty. It also led to disciplinary proceedings against school staff. The second purpose of this paper is to describe this case and connect it with findings from our policy analysis.
Abstract:
Background: For those in the field of managing diabetic complications, the accurate diagnosis and monitoring of diabetic peripheral neuropathy (DPN) continues to be a challenge. Assessment of sub-basal corneal nerve morphology has recently shown promise as a novel ophthalmic marker for the detection of DPN. Methods: Two hundred and thirty-one individuals with diabetes with predominantly mild or no neuropathy and 61 controls underwent evaluation of diabetic neuropathy symptom score, neuropathy disability score, testing with 10 g monofilament, quantitative sensory testing (warm, cold, vibration detection) and nerve conduction studies. Corneal nerve fibre length, branch density and tortuosity were measured using corneal confocal microscopy. Differences in corneal nerve morphology between individuals with and without DPN and controls were investigated using analysis of variance and correlations were determined between corneal morphology and established tests of, and risk factors for, DPN. Results: Corneal nerve fibre length was significantly reduced in diabetic individuals with mild DPN compared with both controls (p < 0.001) and diabetic individuals without DPN (p = 0.012). Corneal nerve branch density was significantly reduced in individuals with mild DPN compared with controls (p = 0.032). Corneal nerve fibre tortuosity did not show significant differences. Corneal nerve fibre length and corneal nerve branch density showed modest correlations to most measures of neuropathy, with the strongest correlations to nerve conduction study parameters (r = 0.15 to 0.25). Corneal nerve fibre tortuosity showed only a weak correlation to the vibration detection threshold. Corneal nerve fibre length was inversely correlated to glycated haemoglobin (r = -0.24) and duration of diabetes (r = -0.20). Conclusion: Assessment of corneal nerve morphology is a non-invasive, rapid test capable of showing differences between individuals with and without DPN. Corneal nerve fibre length shows the strongest associations with other diagnostic tests of neuropathy and with established risk factors for neuropathy.
Abstract:
Purpose Over the past decade, corneal nerve morphology and corneal sensation threshold have been explored as potential surrogate markers for the evaluation of diabetic neuropathy. We present the baseline findings of a Longitudinal Assessment of Neuropathy in Diabetes using novel ophthalmic Markers (LANDMark). Methods The LANDMark Study is a 5-year, two-site, natural history (observational) study of individuals with Type 1 diabetes stratified into those with (T1W) and without (T1WO) neuropathy according to the Toronto criteria, and control subjects. All study participants undergo detailed annual assessment of neuropathy including corneal nerve parameters measured using corneal confocal microscopy and corneal sensitivity measured using non-contact corneal esthesiometry. Results 396 eligible individuals (208 in Brisbane and 188 in Manchester) were assessed: 76 T1W, 166 T1WO and 154 controls. Corneal sensation threshold (mbars) was significantly higher in T1W (1.0 ± 1.1) than T1WO (0.7 ± 0.7) and controls (0.6 ± 0.4) (P=0.002); post-hoc analysis (PHA) revealed no difference between T1WO and controls (Tukey HSD, P=0.502). Corneal nerve fiber length (mm/mm2) (CNFL) was lower in T1W (13.8 ± 6.4) than T1WO (19.1 ± 5.8) and controls (23.2 ± 6.3) (P<0.001); PHA revealed CNFL to be lower in T1W than T1WO, and lower in both of these groups than controls (P<0.001). Corneal nerve branch density (branches/mm2) (CNBD) was significantly lower in T1W (40 ± 32) than T1WO (62 ± 37) and controls (83 ± 46) (P<0.001); PHA showed CNBD was lower in T1W than T1WO, and lower in both groups than controls (P<0.001). Alcohol and cigarette consumption did not differ between groups, although age, BMI, BP, waist circumference, HbA1c, albumin-creatinine ratio, and cholesterol were slightly greater in T1W than T1WO (p<0.05). Some site differences were observed. Conclusions The LANDMark baseline findings confirm that corneal sensitivity and corneal nerve morphometry can detect differences in neuropathy status in individuals with Type 1 diabetes and healthy controls. Corneal nerve morphology is significantly abnormal even in diabetic patients ‘without neuropathy’ compared to control participants. Results of the longitudinal trial will assess the capability of these tests for monitoring change in these parameters over time as potential surrogate markers for neuropathy.
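As an illustration of the kind of three-group comparison reported above (ANOVA with Tukey HSD post-hoc tests), the sketch below compares corneal nerve fibre length across the T1W, T1WO and control groups using scipy and statsmodels. The data frame and column names ('cnfl', 'group') are assumptions for illustration, not the study's analysis code.

```python
import pandas as pd
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd


def compare_cnfl_across_groups(df):
    """One-way ANOVA plus Tukey HSD for corneal nerve fibre length (CNFL).

    Expects columns (hypothetical names): 'cnfl' (mm/mm^2) and 'group'
    ('T1W', 'T1WO' or 'control').
    """
    groups = [sub["cnfl"].values for _, sub in df.groupby("group")]
    f_stat, p_value = f_oneway(*groups)          # overall between-group effect
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4g}")

    # Post-hoc pairwise comparisons with Tukey's honestly significant difference.
    tukey = pairwise_tukeyhsd(endog=df["cnfl"], groups=df["group"], alpha=0.05)
    print(tukey.summary())
```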