482 results for Long line
Abstract:
Alcohol dependence is a debilitating disorder with current therapies displaying limited efficacy and/or compliance. Consequently, there is a critical need for improved pharmacotherapeutic strategies to manage alcohol use disorders (AUDs). Previous studies have shown that the development of alcohol dependence involves repeated cycles of binge-like ethanol intake and abstinence. Therefore, we used a model of binge-ethanol consumption (drinking-in-the-dark) in mice to test the effects of compounds known to modify the activity of neurotransmitters implicated in alcohol addiction. From this, we have identified the FDA-approved antihypertensive drug pindolol as a potential candidate for the management of AUDs. We show that the efficacy of pindolol in reducing ethanol consumption is enhanced following long-term (12-week) binge-ethanol intake, compared to short-term (4-week) intake. Furthermore, pindolol had no effect on locomotor activity or consumption of the natural reward sucrose. Because pindolol acts as a dual beta-adrenergic antagonist and 5-HT1A/1B partial agonist, we examined its effect on spontaneous synaptic activity in the basolateral amygdala (BLA), a brain region densely innervated by serotonin- and norepinephrine-containing fibres. Pindolol increased spontaneous excitatory post-synaptic current frequency in BLA principal neurons from long-term ethanol-consuming mice but not naïve mice. Additionally, this effect was blocked by the 5-HT1A/1B receptor antagonist methiothepin, suggesting that altered serotonergic activity in the BLA may contribute to the efficacy of pindolol in reducing ethanol intake following long-term exposure. Although further mechanistic investigations are required, this study demonstrates the potential of pindolol as a new treatment option for AUDs that can be fast-tracked into human clinical studies.
Abstract:
- Background Nilotinib and dasatinib are now being considered as alternative treatments to imatinib as a first-line treatment of chronic myeloid leukaemia (CML). - Objective This technology assessment reviews the available evidence for the clinical effectiveness and cost-effectiveness of dasatinib, nilotinib and standard-dose imatinib for the first-line treatment of Philadelphia chromosome-positive CML. - Data sources Databases [including MEDLINE (Ovid), EMBASE, Current Controlled Trials, ClinicalTrials.gov, the US Food and Drug Administration website and the European Medicines Agency website] were searched from the search end date of the last technology appraisal report on this topic (October 2002) to September 2011. - Review methods A systematic review of clinical effectiveness and cost-effectiveness studies; a review of surrogate relationships with survival; a review and critique of manufacturer submissions; and a model-based economic analysis. - Results Two clinical trials (dasatinib vs imatinib and nilotinib vs imatinib) were included in the effectiveness review. Survival was not significantly different for dasatinib or nilotinib compared with imatinib with the 24-month follow-up data available. The rates of complete cytogenetic response (CCyR) and major molecular response (MMR) were higher for patients receiving dasatinib than for those receiving imatinib at 12 months' follow-up (CCyR 83% vs 72%, p < 0.001; MMR 46% vs 28%, p < 0.0001). The rates of CCyR and MMR were higher for patients receiving nilotinib than for those receiving imatinib at 12 months' follow-up (CCyR 80% vs 65%, p < 0.001; MMR 44% vs 22%, p < 0.0001). An indirect comparison analysis showed no difference between dasatinib and nilotinib for CCyR or MMR rates at 12 months' follow-up (CCyR, odds ratio 1.09, 95% CI 0.61 to 1.92; MMR, odds ratio 1.28, 95% CI 0.77 to 2.16).
There is observational association evidence from imatinib studies supporting the use of CCyR and MMR at 12 months as surrogates for overall (all-cause) survival and progression-free survival in patients with CML in chronic phase. In the cost-effectiveness modelling, scenario analyses were provided to reflect the extensive structural uncertainty and different approaches to estimating overall survival (OS). First-line dasatinib is predicted to provide very poor value for money compared with first-line imatinib, with deterministic incremental cost-effectiveness ratios (ICERs) of between £256,000 and £450,000 per quality-adjusted life-year (QALY). Conversely, first-line nilotinib provided favourable ICERs at the willingness-to-pay threshold of £20,000-30,000 per QALY. - Limitations Immaturity of empirical trial data relative to life expectancy, forcing either reliance on surrogate relationships or cumulative survival/treatment duration assumptions. - Conclusions From the two trials available, dasatinib and nilotinib have a statistically significant advantage compared with imatinib as measured by MMR or CCyR. Taking into account the treatment pathways for patients with CML, i.e. assuming the use of second-line nilotinib, first-line nilotinib appears to be more cost-effective than first-line imatinib. Dasatinib was not cost-effective compared with imatinib and nilotinib if decision thresholds of £20,000 per QALY or £30,000 per QALY were used. Uncertainty in the cost-effectiveness analysis would be substantially reduced with better and more UK-specific data on the incidence and cost of stem cell transplantation in patients with chronic CML. - Funding The Health Technology Assessment Programme of the National Institute for Health Research.
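The value-for-money judgements above rest on a single arithmetic step: the incremental cost-effectiveness ratio, the extra cost divided by the extra QALYs gained. A minimal sketch of that calculation, using made-up figures rather than the assessment's model outputs:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Illustrative numbers only (not from the assessment's economic model):
# a new drug costing £60,000 more and yielding 0.25 extra QALYs gives an
# ICER of £240,000/QALY -- far above a £20,000-30,000/QALY threshold.
print(icer(160_000, 100_000, 9.25, 9.00))  # 240000.0
```

A therapy is typically judged cost-effective when its ICER falls below the decision threshold; here the threshold comparison is what drives the dasatinib versus nilotinib conclusions.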
Abstract:
Background Hyperferritinemia-cataract syndrome (HCS) is a rare Mendelian condition characterized by bilateral cataract and high levels of serum ferritin in the absence of iron overload. Methods HCS was diagnosed in three adult siblings. In two of them it was possible to assess lens changes initially in 1995 and again in 2013. Serum ferritin, iron, transferrin concentrations and transferrin saturation percentage were also measured, and the Iron Responsive Element (IRE) region of the L-ferritin gene (FTL) was studied. Results Serum ferritin concentrations were considerably elevated while serum iron, transferrin and transferrin saturation levels were within the normal range in each sibling. Cataract changes in our patients were consistent with those previously reported in the literature. Progression of the cataract, an aspect little studied in this syndrome, appeared to be quite limited in extent. The heterozygous +32G to T (-168G>T) substitution in the IRE of the FTL gene was detected in this family. Conclusions Ophthalmic and biochemical studies together with genetic testing confirmed HCS in three family members. Although the disorder has been extensively described in recent years, little is known regarding cataract evolution over time. In our cases, lens evaluations encompassed many years, identified bilateral cataract of typical morphology and supported the hypothesis that this unique clinical feature of the disease tends to be slowly progressive in nature, at least in adults.
Abstract:
This paper investigates the cointegration and causal relationships between Information and Communication Technology (ICT) and economic output in Australia using data for about five decades. The framework used in this paper is the single-sector aggregate production function, which is the first comprehensive approach of this kind to include ICT and non-ICT capital and other factors to examine long-run Granger causality. The empirical evidence points to a cointegration relationship between ICT capital and output, and implies that ICT capital Granger causes economic output and multifactor productivity, as does non-ICT capital.
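The long-run Granger-causality analysis described above amounts to asking whether lagged ICT capital improves the prediction of output beyond output's own lags. A stripped-down illustration of that F-test logic on synthetic data (not the paper's production-function framework or its Australian data):

```python
import numpy as np

def granger_f(y, x, p=2):
    """Minimal bivariate Granger F-test: does x help predict y beyond
    y's own p lags? A sketch only -- it omits the cointegration/ECM
    machinery an aggregate production-function study would use."""
    n = len(y)
    Y = y[p:]
    own = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    cross = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, own])           # restricted: y lags only
    Xu = np.hstack([ones, own, cross])    # unrestricted: plus x lags
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df = n - p - Xu.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df)

rng = np.random.default_rng(0)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(2, 300):                   # y is driven by lagged x
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()
print(granger_f(y, x) > 10)   # large F: x Granger-causes y -> True
```

A large F statistic rejects the null that the x lags add nothing, which is the sense in which the paper reports that ICT capital "Granger causes" output.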
Abstract:
This study investigated long-term use of custom-made orthopedic shoes (OS) at 1.5 years follow-up. In addition, the association between short-term outcomes and long-term use was studied. Patients from a previously published study who did use their first-ever pair of OS 3 months after delivery received another questionnaire after 1.5 years. Patients with different pathologies were included in the study (n = 269, response = 86%). Mean age was 63 ± 14 years, and 38% were male. After 1.5 years, 87% of the patients still used their OS (78% frequently [4-7 days/week] and 90% occasionally [1-3 days/week]) and 13% of the patients had ceased using their OS. Patients who were using their OS frequently after 1.5 years had significantly higher scores for 8 of 10 short-term usability outcomes (p-values ranged from <0.001 to 0.046). The largest differences between users and nonusers were found for scores on the short-term outcomes of OS fit and communication with the medical specialist and shoe technician (effect size range = 0.16 to 0.46). We conclude that patients with worse short-term usability outcomes for their OS are more likely to use their OS only occasionally or not at all at long-term follow-up.
Abstract:
Objective To determine mortality rates after a first lower limb amputation and explore the rates for different subpopulations. Methods Retrospective cohort study of all people who underwent a first amputation at or proximal to transtibial level, in an area of 1.7 million people. Analysis with Kaplan-Meier curves and Log Rank tests for univariate associations of psycho-social and health variables. Logistic regression for odds of death at 30 days, 1 year and 5 years. Results 299 people were included. Median time to death was 20.3 months (95%CI: 13.1; 27.5). 30-day mortality = 22%; odds of death 2.3 times higher in people with a history of cerebrovascular disease (95%CI: 1.2; 4.7, P = 0.016). 1-year mortality = 44%; odds of death 3.5 times higher for people with renal disease (95%CI: 1.8; 7.0, P < 0.001). 5-year mortality = 77%; odds of death 5.4 times higher for people with renal disease (95%CI: 1.8; 16.0, P = 0.003). Variation in mortality rates was most apparent across age groups, with people aged 75–84 years having better short-term outcomes than those both younger and older. Conclusions Mortality rates demonstrated the frailty of this population, with almost one quarter of people dying within 30 days, and almost half within 1 year. People with cerebrovascular disease had higher odds of death at 30 days, and those with renal disease at 1 and 5 years.
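The "odds of death X times higher" figures above are odds ratios from logistic regressions; the underlying quantity can be shown with a simple two-group calculation. The counts below are hypothetical, chosen only to illustrate the arithmetic, not taken from the study:

```python
def odds_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Odds ratio for death in an exposed vs an unexposed group."""
    odds_exp = events_exposed / (n_exposed - events_exposed)      # deaths : survivors
    odds_unexp = events_unexposed / (n_unexposed - events_unexposed)
    return odds_exp / odds_unexp

# Hypothetical counts: 30/40 deaths in the exposed group (odds 3.0)
# vs 100/300 in the unexposed group (odds 0.5) -> odds ratio 6.0.
print(odds_ratio(30, 40, 100, 300))  # 6.0
```

In a logistic regression the same quantity appears as the exponentiated coefficient on the exposure, which is how adjusted odds ratios such as the 3.5 and 5.4 reported here are obtained.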
Abstract:
Background Around the world, guidelines and clinical practice for the prevention of complications associated with central venous catheters (CVC) vary greatly. To prevent occlusion, most institutions recommend the use of heparin when the CVC is not in use. However, there is debate regarding the need for heparin and evidence to suggest normal saline may be as effective. The use of heparin is not without risk, may be unnecessary and is also associated with increased costs. Objectives To assess the clinical effects (benefits and harms) of heparin versus normal saline to prevent occlusion in long-term central venous catheters in infants, children and adolescents. Design A Cochrane systematic review of randomised controlled trials was undertaken. - Data sources: The Cochrane Vascular Group Specialised Register (including MEDLINE, CINAHL, EMBASE and AMED) and the Cochrane Register of Studies were searched. Hand searching of relevant journals and reference lists of retrieved articles was also undertaken. - Review Methods: Data were extracted and appraisal undertaken. We included studies that compared the efficacy of normal saline with heparin to prevent occlusion. We excluded temporary CVCs and peripherally inserted central catheters. Rate ratios per 1000 catheter days were calculated for two outcomes, occlusion of the CVC, and CVC-associated blood stream infection. Results Three trials with a total of 245 participants were included in this review. The three trials directly compared the use of normal saline and heparin. However, between studies, all used different protocols with various concentrations of heparin and frequency of flushes. The quality of the evidence ranged from low to very low. The estimated rate ratio for CVC occlusion per 1000 catheter days between the normal saline and heparin group was 0.75 (95% CI 0.10 to 5.51, two studies, 229 participants, very low quality evidence). 
The estimated rate ratio for CVC-associated blood stream infection was 1.48 (95% CI 0.24 to 9.37, two studies, 231 participants; low quality evidence). Conclusions It remains unclear whether heparin is necessary for CVC maintenance. More well-designed studies are required to answer this relatively simple, but clinically important, question. Ultimately, if this evidence were available, the development of evidence-based clinical practice guidelines and consistency of practice would be facilitated.
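The review's occlusion and infection comparisons are expressed as rate ratios per 1000 catheter-days. A small sketch of that calculation, with hypothetical counts chosen so the result happens to match the review's 0.75 point estimate for occlusion:

```python
def rate_per_1000_days(events, catheter_days):
    """Event rate expressed per 1000 catheter-days."""
    return 1000 * events / catheter_days

def rate_ratio(events_a, days_a, events_b, days_b):
    """Ratio of group A's rate to group B's (e.g. saline vs heparin)."""
    return rate_per_1000_days(events_a, days_a) / rate_per_1000_days(events_b, days_b)

# Hypothetical counts for illustration (not the review's pooled data):
# 6 occlusions over 4000 catheter-days vs 8 over 4000 catheter-days.
print(rate_ratio(6, 4000, 8, 4000))  # 0.75
```

Normalising by catheter-days rather than by patient counts lets studies with different follow-up durations and flushing protocols be compared on the same scale.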
Abstract:
The BeiDou system is the first global navigation satellite system in which all satellites transmit triple-frequency signals that can provide positioning, navigation, and timing independently. A benefit of triple-frequency signals is that more useful combinations can be formed, including some extrawide-lane combinations whose ambiguities can generally be fixed instantaneously without distance restriction, although narrow-lane ambiguity resolution (NL AR) still depends on the interreceiver distance or requires a long time to achieve. In this paper, we synthetically study decimeter and centimeter kinematic positioning using BeiDou triple-frequency signals. It starts with AR of two extrawide-lane signals based on the ionosphere-free or ionosphere-reduced geometry-free model. For decimeter positioning, one can immediately use two ambiguity-fixed extrawide-lane observations without pursuing NL AR. To achieve higher accuracy, NL AR is the necessary next step. Although long-baseline NL AR is still challenging, some NL ambiguities can indeed be fixed with high reliability. Partial AR for NL signals is acceptable because, as long as some NL ambiguities are fixed, positioning accuracy will certainly be improved. With the accumulation of observations, more and more NL ambiguities are fixed and the positioning accuracy continues to improve. An efficient Kalman-filtering system is established to implement the whole process. The formulated system is flexible, since additional constraints can easily be applied to enhance the model's strength.
Numerical results from a set of real triple-frequency BeiDou data on a 50 km baseline show that decimeter positioning is achievable instantaneously. With only five data epochs, 84% of NL ambiguities can be fixed, so that the real-time kinematic accuracies are 4.5, 2.5, and 16 cm for the north, east, and height components, respectively, while with 10 data epochs more than 90% of NL ambiguities are fixed and the real-time kinematic solutions are improved to centimeter level for all three coordinate components.
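The practical appeal of the extrawide-lane versus narrow-lane combinations discussed above comes down to their wavelengths: the longer the combined wavelength, the easier the integer ambiguity is to fix. Using the published BDS B1I/B2I/B3I carrier frequencies, the combination wavelengths can be computed directly (a sketch; the integer coefficients shown are the conventional EWL/WL/NL choices):

```python
# Speed of light and the published BDS B1I/B2I/B3I carrier frequencies (Hz).
C = 299_792_458.0
F1, F2, F3 = 1561.098e6, 1207.140e6, 1268.520e6

def combo_wavelength(i, j, k):
    """Wavelength of the (i, j, k) triple-frequency phase combination."""
    return C / (i * F1 + j * F2 + k * F3)

# Extrawide-lane (0, -1, 1): ~4.88 m -- long enough that its ambiguity
# can usually be fixed instantly, as the paper exploits.
print(round(combo_wavelength(0, -1, 1), 2))
# Wide-lane (1, -1, 0): ~0.85 m; narrow-lane (1, 1, 0): ~0.11 m,
# which is why NL AR is the hard final step on long baselines.
print(round(combo_wavelength(1, -1, 0), 2))
```

With a ~4.88 m wavelength, residual biases of a few decimeters still round to the correct integer, whereas the ~11 cm narrow-lane combination demands far cleaner observations, matching the staged EWL-then-NL strategy described above.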
Abstract:
Carrier phase ambiguity resolution over long baselines is challenging in BDS data processing. This is partially due to the variations of the hardware biases in BDS code signals and their dependence on elevation angle. We present an assessment of satellite-induced code bias variations in BDS triple-frequency signals and ambiguity resolution procedures involving both geometry-free and geometry-based models. First, since the elevation of a GEO satellite remains unchanged, we propose to model the single-differenced fractional cycle bias with widespread ground stations. Second, the effects of code bias variations induced by GEO, IGSO and MEO satellites on ambiguity resolution of extra-wide-lane, wide-lane and narrow-lane combinations are analyzed. Third, together with the IGSO and MEO code bias variation models, the effects of code bias variations on ambiguity resolution are examined using 30 days of data collected over baselines ranging from 500 to 2600 km in 2014. The results suggest that although the effect of code bias variations on the extra-wide-lane integer solution is almost negligible due to its long wavelength, the wide-lane integer solutions are rather sensitive to the code bias variations. Wide-lane ambiguity resolution success rates are evidently improved when code bias variations are corrected. However, the improvement in narrow-lane ambiguity resolution is not obvious, since it is based on the geometry-based model and there is only an indirect impact on the narrow-lane ambiguity solutions.
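Elevation-dependent code bias corrections of the kind analyzed above are typically applied by interpolating tabulated coefficients against the satellite's elevation angle. The sketch below shows only the interpolation mechanics; the node values are hypothetical placeholders, not the paper's estimated IGSO/MEO correction tables:

```python
import bisect

# Elevation grid (degrees) and per-node bias values (meters).
# These values are HYPOTHETICAL placeholders for illustration only;
# real processing needs coefficients estimated per satellite type/frequency.
ELEV_NODES = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90]
BIAS_METERS = [0.00, -0.05, -0.15, -0.25, -0.32, -0.38,
               -0.42, -0.45, -0.46, -0.47]

def code_bias_correction(elev_deg):
    """Piecewise-linear interpolation of the code bias at a given elevation."""
    i = bisect.bisect_right(ELEV_NODES, elev_deg) - 1
    if i >= len(ELEV_NODES) - 1:          # at or above the last node
        return BIAS_METERS[-1]
    t = (elev_deg - ELEV_NODES[i]) / (ELEV_NODES[i + 1] - ELEV_NODES[i])
    return BIAS_METERS[i] + t * (BIAS_METERS[i + 1] - BIAS_METERS[i])

print(round(code_bias_correction(35.0), 3))  # midway between -0.25 and -0.32
```

The interpolated value would be subtracted from the raw code observation before forming the wide-lane combinations, which is where the paper finds the correction matters most.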
Abstract:
Globally, lung cancer accounts for approximately 20% of all cancer-related deaths. Five-year survival is poor and rates have remained unchanged for the past four decades. There is an urgent need to identify markers of lung carcinogenesis and new targets for therapy. Given the recent successes of immune modulators in cancer therapy and the improved understanding of immune evasion by tumours, we sought to determine the carcinogenic impact of chronic TNF-α and IL-1β exposure in a normal bronchial epithelial cell line model. Following three months of culture in a chronic inflammatory environment under conditions of normoxia and hypoxia (0.5% oxygen), normal cells developed a number of key genotypic and phenotypic alterations. Important cellular features such as the proliferative, adhesive and invasive capacity of the normal cells were significantly amplified. In addition, gene expression profiles were altered in pathways associated with apoptosis, angiogenesis and invasion. The data generated in this study provide support that TNF-α, IL-1β and hypoxia promote a neoplastic phenotype in normal bronchial epithelial cells. In turn, these mediators may be of benefit for biomarker and/or immunotherapy target studies. This project provides an important inflammatory in vitro model for further immuno-oncology studies in the lung cancer setting.
Abstract:
Evidence-based policy is a means of ensuring that policy is informed by more than ideology or expedience. However, what constitutes robust evidence is highly contested. In this paper, we argue policy must draw on quantitative and qualitative data. We do this in relation to a long entrenched problem in Australian early childhood education and care (ECEC) workforce policy. A critical shortage of qualified staff threatens the attainment of broader child and family policy objectives linked to the provision of ECEC and has not been successfully addressed by initiatives to date. We establish some of the limitations of existing quantitative data sets and consider the potential of qualitative studies to inform ECEC workforce policy. The adoption of both quantitative and qualitative methods is needed to illuminate the complex nature of the work undertaken by early childhood educators, as well as the environmental factors that sustain job satisfaction in a demanding and poorly understood working environment.
Abstract:
Caveolae have been linked to diverse cellular functions and to many disease states. In this study we have used zebrafish to examine the role of caveolin-1 and caveolae during early embryonic development. During development, caveolin-1 expression is apparent in a number of tissues including Kupffer's vesicle, tailbud, intersomite boundaries, heart, branchial arches, pronephric ducts and periderm. Particularly strong expression is observed in the sensory organs of the lateral line, the neuromasts, and in the notochord, where it overlaps with expression of caveolin-3. Morpholino-mediated downregulation of Cav1α caused a dramatic inhibition of neuromast formation. Detailed ultrastructural analysis, including electron tomography of the notochord, revealed that the central regions of the notochord have the highest density of caveolae of any embryonic tissue, comparable to the highest density observed in any vertebrate tissue. In addition, Cav1α downregulation caused disruption of the notochord, an effect that was enhanced further by Cav3 knockdown. These results indicate an essential role for caveolin and caveolae in this vital structural and signalling component of the embryo.
Abstract:
Building on the launch of an early prototype at Balance Unbalance 2013, we now offer a fully realised experience of the ‘Long Time, No See?’ site-specific walking/visualisation project for conference participants to engage with on a do-it-yourself basis, either before, during or after the event. ‘Long Time, No See?’ is a new form of participatory, environmental futures project, designed for individuals and groups. It uses a smartphone APP to guide processes of individual or group walking at any chosen location, encouraging walkers to think in radical new ways about how best to prepare for ‘stormy’ environmental futures ahead. As part of their personal journeys, participants contribute site-specific micro-narratives in the form of texts, images and sounds, captured via the APP during the loosely ‘guided’ walk. These responses are then uploaded and synthesised into an ever-building audiovisual and generative artwork/‘map’ of future-thinking affinities, viewable both online at long-time-no-see.org (in Chrome) and, at the same time, as large-screen visualisations at QUT’s Cube Centre in Brisbane, Australia. The artwork therefore spans both participants’ mobile devices and laptops. If desired, outcomes can also be presented publicly in large-screen format at the conference. ‘Long Time, No See?’ has been developed over the past two years by a team of leading Australian artists, designers, urban/environmental planners and programmers.
Abstract:
Background The irreversible ErbB family blocker afatinib and the reversible EGFR tyrosine kinase inhibitor gefitinib are approved for first-line treatment of EGFR mutation-positive non-small-cell lung cancer (NSCLC). We aimed to compare the efficacy and safety of afatinib and gefitinib in this setting. Methods This multicentre, international, open-label, exploratory, randomised controlled phase 2B trial (LUX-Lung 7) was done at 64 centres in 13 countries. Treatment-naive patients with stage IIIB or IV NSCLC and a common EGFR mutation (exon 19 deletion or Leu858Arg) were randomly assigned (1:1) to receive afatinib (40 mg per day) or gefitinib (250 mg per day) until disease progression, or beyond if deemed beneficial by the investigator. Randomisation, stratified by EGFR mutation type and status of brain metastases, was done centrally using a validated number generating system implemented via an interactive voice or web-based response system with a block size of four. Clinicians and patients were not masked to treatment allocation; independent review of tumour response was done in a blinded manner. Coprimary endpoints were progression-free survival by independent central review, time-to-treatment failure, and overall survival. Efficacy analyses were done in the intention-to-treat population and safety analyses were done in patients who received at least one dose of study drug. This ongoing study is registered with ClinicalTrials.gov, number NCT01466660. Findings Between Dec 13, 2011, and Aug 8, 2013, 319 patients were randomly assigned (160 to afatinib and 159 to gefitinib). Median follow-up was 27·3 months (IQR 15·3–33·9). 
Progression-free survival (median 11·0 months [95% CI 10·6–12·9] with afatinib vs 10·9 months [9·1–11·5] with gefitinib; hazard ratio [HR] 0·73 [95% CI 0·57–0·95], p=0·017) and time-to-treatment failure (median 13·7 months [95% CI 11·9–15·0] with afatinib vs 11·5 months [10·1–13·1] with gefitinib; HR 0·73 [95% CI 0·58–0·92], p=0·0073) were significantly longer with afatinib than with gefitinib. Overall survival data are not mature. The most common treatment-related grade 3 or 4 adverse events were diarrhoea (20 [13%] of 160 patients given afatinib vs two [1%] of 159 given gefitinib), rash or acne (15 [9%] patients given afatinib vs five [3%] of those given gefitinib) and liver enzyme elevations (no patients given afatinib vs 14 [9%] of those given gefitinib). Serious treatment-related adverse events occurred in 17 (11%) patients in the afatinib group and seven (4%) in the gefitinib group. Ten (6%) patients in each group discontinued treatment due to drug-related adverse events. Fifteen (9%) fatal adverse events occurred in the afatinib group and ten (6%) in the gefitinib group. All but one of these deaths were considered unrelated to treatment; one patient in the gefitinib group died from drug-related hepatic and renal failure. Interpretation Afatinib significantly improved outcomes in treatment-naive patients with EGFR-mutated NSCLC compared with gefitinib, with a manageable tolerability profile. These data are potentially important for clinical decision making in this patient population.