859 results for Delay of Gratification


Relevance: 30.00%

Abstract:

Objective To evaluate the performance of China’s infectious disease automated alert and response system in the detection of outbreaks of hand, foot and mouth (HFM) disease. Methods We estimated size, duration and delay in reporting HFM disease outbreaks from cases notified between 1 May 2008 and 30 April 2010 and between 1 May 2010 and 30 April 2012, that is, before and after HFM disease was included in the automated alert and response system. Sensitivity, specificity and timeliness of detection of aberrations in the incidence of HFM disease outbreaks were estimated by comparing automated detections to observations of public health staff. Findings The alert and response system recorded 106 005 aberrations in the incidence of HFM disease between 1 May 2010 and 30 April 2012 – a mean of 5.6 aberrations per 100 days in each county that reported HFM disease. The response system had a sensitivity of 92.7% and a specificity of 95.0%. The mean delay between the reporting of the first case of an outbreak and detection of that outbreak by the response system was 2.1 days. Between the first and second study periods, the mean size of an HFM disease outbreak decreased from 19.4 to 15.8 cases and the mean interval between the onset and initial reporting of such an outbreak to the public health emergency reporting system decreased from 10.0 to 9.1 days. Conclusion The automated alert and response system shows good sensitivity in the detection of HFM disease outbreaks and appears to be relatively rapid. Continued use of this system should allow more effective prevention and limitation of such outbreaks in China.
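
The sensitivity and specificity above come from comparing automated alerts with outbreaks confirmed by public health staff. A minimal sketch of that kind of comparison, assuming a per-signal true/false-positive tally (the sets below are placeholders, not the study's data):

```python
def detection_performance(system_flagged, staff_confirmed, all_candidates):
    """Sensitivity and specificity of automated outbreak detection, given
    sets of candidate-signal IDs flagged by the system and confirmed as
    true outbreaks by public health staff."""
    tp = len(system_flagged & staff_confirmed)
    fn = len(staff_confirmed - system_flagged)
    fp = len(system_flagged - staff_confirmed)
    tn = len(all_candidates - system_flagged - staff_confirmed)
    return tp / (tp + fn), tn / (tn + fp)

# Placeholder IDs only; the study reports 92.7% sensitivity and 95.0% specificity.
sens, spec = detection_performance({1, 2, 3, 5}, {1, 2, 3, 4}, set(range(1, 21)))
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```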

Relevance: 30.00%

Abstract:

Background: Multipotent mesenchymal stromal cells suppress T-cell function in vitro, a property that has underpinned their use in treating clinical steroid-refractory graft-versus-host disease after allogeneic hematopoietic stem cell transplantation. However, the potential of mesenchymal stromal cells to resolve graft-versus-host disease is confounded by a paucity of pre-clinical data delineating their immunomodulatory effects in vivo. Design and Methods: We examined the influence of timing and dose of donor-derived mesenchymal stromal cells on the kinetics of graft-versus-host disease in two murine models of graft-versus-host disease (major histocompatibility complex-mismatched: UBI-GFP/BL6 [H-2b]→BALB/c [H-2d] and the sibling transplant mimic, UBI-GFP/BL6 [H-2b]→BALB.B [H-2b]) using clinically relevant conditioning regimens. We also examined the effect of mesenchymal stromal cell infusion on bone marrow and spleen cellular composition and cytokine secretion in transplant recipients. Results: Despite T-cell suppression in vitro, mesenchymal stromal cells delayed but did not prevent graft-versus-host disease in the major histocompatibility complex-mismatched model. In the sibling transplant model, however, 30% of mesenchymal stromal cell-treated mice did not develop graft-versus-host disease. The timing of administration and dose of the mesenchymal stromal cells influenced their effectiveness in attenuating graft-versus-host disease, such that a low dose of mesenchymal stromal cells administered early was more effective than a high dose of mesenchymal stromal cells given late. Compared to control-treated mice, mesenchymal stromal cell-treated mice had significant reductions in serum and splenic interferon-γ, an important mediator of graft-versus-host disease. Conclusions: Mesenchymal stromal cells appear to delay death from graft-versus-host disease by transiently altering the inflammatory milieu and reducing levels of interferon-γ. Our data suggest that both the timing of infusion and the dose of mesenchymal stromal cells likely influence these cells’ effectiveness in attenuating graft-versus-host disease.

Relevance: 30.00%

Abstract:

Background We hypothesised that alternating inhibitors of the vascular endothelial growth factor receptor (VEGFR) and mammalian target of rapamycin pathways would delay the development of resistance in advanced renal cell carcinoma (aRCC). Patients and methods A single-arm, two-stage, multicentre, phase 2 trial was conducted to determine the activity, feasibility, and safety of 12-week cycles of sunitinib (50 mg daily, 4 weeks on / 2 weeks off) alternating with everolimus (10 mg daily, 5 weeks on / 1 week off), continued until disease progression or prohibitive toxicity, in favourable- or intermediate-risk aRCC. The primary end point was the proportion alive and progression-free at 6 months (PFS6m). The secondary end points were feasibility, tumour response, overall survival (OS), and adverse events (AEs). The correlative objective was to assess biomarkers and correlate them with clinical outcome. Results We recruited 55 eligible participants from September 2010 to August 2012. Demographics: mean age 61, 71% male, favourable risk 16%, intermediate risk 84%. Cycle 2 commenced within 14 weeks for 80% of participants; 64% received ≥22 weeks of alternating therapy; 78% received ≥22 weeks of any treatment. PFS6m was 29/55 (53%; 95% confidence interval [CI] 40% to 66%). The tumour response rate was 7/55 (13%; 95% CI 4% to 22%, all partial responses). After a median follow-up of 20 months, 47 of 55 (86%) had progressed, with a median progression-free survival of 8 months (95% CI 5–10), and 30 of 55 (55%) had died, with a median OS of 17 months (95% CI 12–undefined). AEs were consistent with those expected for each single agent. No convincing prognostic biomarkers were identified. Conclusions The EVERSUN regimen was feasible and safe, but its activity did not meet pre-specified values to warrant further research. This supports the current approach of continuing anti-VEGF therapy until progression or prohibitive toxicity before changing treatment.
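
The reported PFS6m of 29/55 (53%; 95% CI 40% to 66%) is consistent with a simple binomial (Wald) interval; the quick check below assumes that method, although the trial statisticians may have used a different interval:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Simple Wald binomial confidence interval for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

p, lo, hi = wald_ci(29, 55)
print(f"PFS6m = {p:.0%} (95% CI {lo:.0%} to {hi:.0%})")  # ~53% (40% to 66%)
```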

Relevance: 30.00%

Abstract:

Background Strand-specific RNAseq data are now more common in RNAseq projects, and visualizing RNAseq data has become an important part of analysing sequencing data. The most widely used visualization tool is the UCSC genome browser, which introduced the custom track concept, enabling researchers to simultaneously visualize gene expression at a particular locus from multiple experiments. The objective of our software tool is to provide a user-friendly interface for the visualization of RNAseq datasets. Results This paper introduces a visualization tool (RNASeqBrowser) that incorporates and extends the functionality of the UCSC genome browser. For example, RNASeqBrowser simultaneously displays read coverage, SNPs, InDels and raw read tracks with other BED and wiggle tracks, all dynamically built from the BAM file. Paired reads are also connected in the browser to enable easier identification of novel exon/intron borders and chimaeric transcripts. Strand-specific RNAseq data are also supported: RNASeqBrowser displays reads above (positive-strand transcripts) or below (negative-strand transcripts) a central line. Finally, RNASeqBrowser was designed for ease of use for users with few bioinformatic skills, and incorporates the features of many genome browsers into one platform. Conclusions The main features of RNASeqBrowser are: (1) RNASeqBrowser integrates the UCSC genome browser and NGS visualization tools such as IGV. It extends the functionality of the UCSC genome browser by adding several new types of tracks to show NGS data such as individual raw reads, SNPs and InDels. (2) RNASeqBrowser can dynamically generate RNA secondary structure, which is useful for identifying non-coding RNA such as miRNA. (3) Overlaying NGS wiggle data is helpful in displaying differential expression and is simple to implement in RNASeqBrowser. (4) NGS data accumulate a large number of raw reads, so RNASeqBrowser collapses exact duplicate reads to reduce visualization space; normal PCs can show many windows of individual raw reads without much delay. (5) Multiple pop-up windows of individual raw reads provide users with more viewing space, avoiding the approach of existing tools (such as IGV) that squeeze all raw reads into one window; this is helpful for visualizing multiple datasets simultaneously. RNASeqBrowser and its manual are freely available at http://www.australianprostatecentre.org/research/software/rnaseqbrowser or http://sourceforge.net/projects/rnaseqbrowser/
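
A minimal sketch of the kind of strand-aware, duplicate-collapsed coverage computation from a BAM file described above, written here with the pysam library for illustration; it is not RNASeqBrowser's actual implementation:

```python
import pysam
from collections import defaultdict

def strand_coverage(bam_path, contig, start, end):
    """Per-base read coverage split by strand, collapsing exact duplicate
    alignments (same start, CIGAR and orientation) so they count once."""
    fwd, rev = defaultdict(int), defaultdict(int)
    seen = set()
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam.fetch(contig, start, end):
            if read.is_unmapped or read.is_secondary:
                continue
            key = (read.reference_start, read.cigarstring, read.is_reverse)
            if key in seen:          # collapse exact duplicates
                continue
            seen.add(key)
            track = rev if read.is_reverse else fwd
            for pos in read.get_reference_positions():
                if start <= pos < end:
                    track[pos] += 1
    return fwd, rev

# Example: fwd, rev = strand_coverage("sample.bam", "chr1", 1_000_000, 1_001_000)
```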

Relevance: 30.00%

Abstract:

Due to ever-increasing climate instability, the number of natural disasters affecting society and communities is expected to increase globally in the future, which will result in a growing number of casualties and damage to property and infrastructure. Such damage poses crucial challenges for the recovery of interdependent critical infrastructures. Post-disaster reconstruction is a complex undertaking as it is not only closely linked to the well-being and essential functioning of society, but also requires a large financial commitment. Management of critical infrastructure during post-disaster recovery needs to be underpinned by a holistic recognition that the recovery of each individual infrastructure system (e.g. energy, water, transport and information and communication technology) can be affected by the interdependencies that exist between these different systems. A fundamental characteristic of these interdependencies is that failure of one critical infrastructure system can result in the failure of other interdependent infrastructures, leading to a cascade of failures, which can impede post-disaster recovery and delay the subsequent reconstruction process. Consequently, there is a critical need to develop a holistic strategy for assessing the influence of infrastructure interdependencies, and for incorporating these interdependencies into a post-disaster recovery strategy. This paper discusses four key dimensions of interdependencies that need to be considered in post-disaster reconstruction planning. Using key concepts and sub-concepts derived from the notion of interdependency, the paper examines how critical infrastructure interdependencies affect the recovery processes of damaged infrastructures.

Relevance: 30.00%

Abstract:

Traffic law enforcement sanctions can impact on road user behaviour through general and specific deterrence mechanisms. The manner in which specific deterrence can influence recidivist behaviour can be conceptualised in different ways. While any reduction in speeding will have road safety benefits, the ways in which a ‘reduction’ is determined deserve greater methodological attention and have implications for countermeasure evaluation more generally. The primary aim of this research was to assess the specific deterrent impact of the 2003 increases in penalties for speeding offences in Queensland, Australia, on two cohorts of drivers detected for speeding prior to and after the penalty changes. Since the literature is relatively silent on how to assess recidivism in the speeding context, the secondary research aim was to contribute to the literature regarding ways to conceptualise and measure specific deterrence in the speeding context. We propose a novel way of operationalising four measures which reflect different ways in which a specific deterrence effect could be conceptualised: (1) the proportion of offenders who re-offended in the follow-up period; (2) the overall frequency of re-offending in the follow-up period; (3) the length of delay to re-offence among those who re-offended; and (4) the average number of re-offences during the follow-up period among those who re-offended. Consistent with expectations, results suggested an absolute deterrent effect of the penalty changes, as evidenced by significant reductions in the proportion of drivers who re-offended and the overall frequency of re-offending, although effect sizes were small. Contrary to expectations, however, there was no evidence of a marginal specific deterrent effect among those who re-offended, with a significant reduction in the length of time to re-offence and no significant change in the average number of offences committed. Additional exploratory analyses investigating potential influences of the severity of the index offence, offence history, and method of detection revealed mixed results. Access to additional data from various sources suggested that the main findings were not influenced by changes in speed enforcement activity, public awareness of penalty changes, or driving exposure during the study period. Study limitations and recommendations for future research are discussed with a view to promoting more extensive evaluations of penalty changes and better understanding of how such changes may impact on motorists’ perceptions of enforcement and sanctions, as well as on recidivist behaviour.
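
A minimal sketch of how the four recidivism measures could be computed from offence records using pandas; the data layout and column names are assumptions for illustration, not the study's dataset:

```python
import pandas as pd

def recidivism_measures(cohort: pd.DataFrame) -> dict:
    """Four specific-deterrence measures for a cohort over a fixed follow-up
    period. Assumed layout: one row per driver per re-offence, with columns
    'driver_id', 'index_date' (index offence) and 'offence_date' (re-offence,
    NaT for drivers with no re-offence in the follow-up period)."""
    n_drivers = cohort["driver_id"].nunique()
    reoffences = cohort.dropna(subset=["offence_date"])
    per_driver = reoffences.groupby("driver_id").agg(
        n_reoffences=("offence_date", "count"),
        first_reoffence=("offence_date", "min"),
        index_date=("index_date", "first"),
    )
    delay = (per_driver["first_reoffence"] - per_driver["index_date"]).dt.days
    return {
        "proportion_reoffended": len(per_driver) / n_drivers,        # measure 1
        "total_reoffences": int(per_driver["n_reoffences"].sum()),   # measure 2
        "mean_days_to_reoffence": float(delay.mean()),               # measure 3
        "mean_reoffences_per_reoffender":
            float(per_driver["n_reoffences"].mean()),                # measure 4
    }

# Example: compare the pre-change and post-change cohorts.
# pre, post = recidivism_measures(pre_cohort_df), recidivism_measures(post_cohort_df)
```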

Relevance: 30.00%

Abstract:

This paper investigates communication protocols for relaying sensor data from animal tracking applications back to base stations. While Delay Tolerant Networks (DTNs) are well suited to such challenging environments, most existing protocols do not consider the available energy, which is particularly important when tracking devices can harvest energy. This limits both the network lifetime and the delivery probability in energy-constrained applications, to the point where routing performance becomes worse than using no routing at all. Our work shows that substantial improvement in data yields can be achieved through simple yet efficient energy-aware strategies. Conceptually, there is a need to balance the energy spent on sensing, data mulling, and direct delivery of packets to the destination. We use empirical traces collected in a flying fox (fruit bat) tracking project and show that simple threshold-based energy-aware strategies yield up to 20% higher delivery rates. Furthermore, these results generalize well for a wide range of operating conditions.
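
A minimal sketch of a threshold-based, energy-aware forwarding rule of the kind evaluated above; the thresholds and battery model are illustrative assumptions, not the strategy parameters used in the paper:

```python
def should_forward(battery_level, packet_is_own, own_threshold=0.6,
                   mule_threshold=0.8):
    """Energy-aware DTN forwarding decision for a tracking node.
    - Deliver the node's own sensor data whenever the battery allows it.
    - Only spend energy mulling (carrying/forwarding) other nodes' data
      when the battery is comfortably above a higher threshold, so that
      harvesting can keep up and sensing is never starved."""
    if packet_is_own:
        return battery_level >= own_threshold
    return battery_level >= mule_threshold

# Example: a node at 70% charge delivers its own packets but declines to mule.
print(should_forward(0.7, packet_is_own=True))   # True
print(should_forward(0.7, packet_is_own=False))  # False
```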

Relevance: 30.00%

Abstract:

The importance of developing effective disaster management strategies has significantly grown as the world continues to be confronted with unprecedented disastrous events. Factors such as climate instability, recent urbanization along with rapid population growth in many cities around the world have unwittingly exacerbated the risks of potential disasters, leaving a large number of people and infrastructure exposed to new forms of threats from natural disasters such as flooding, cyclones, and earthquakes. With disasters on the rise, effective recovery planning of the built environment is becoming imperative as it is not only closely related to the well-being and essential functioning of society, but it also requires significant financial commitment. In the built environment context, post-disaster reconstruction focuses essentially on the repair and reconstruction of physical infrastructures. The reconstruction and rehabilitation efforts are generally performed in the form of collaborative partnerships that involve multiple organisations, enabling the restoration of interdependencies that exist between infrastructure systems such as energy, water (including wastewater), transport, and telecommunication systems. These interdependencies are major determinants of vulnerabilities and risks encountered by critical infrastructures and therefore have significant implications for post-disaster recovery. When disrupted by natural disasters, such interdependencies have the potential to promote the propagation of failures between critical infrastructures at various levels, and thus can have dire consequences on reconstruction activities. This paper outlines the results of a pilot study on how elements of infrastructure interdependencies have the potential to impede the post-disaster recovery effort. Using a set of unstructured interview questionnaires, plausible arguments provided by seven respondents revealed that during post-disaster recovery, critical infrastructures are mutually dependent on each other’s uninterrupted availability, both physically and through a host of information and communication technologies. Major disruption to their physical and cyber interdependencies could lead to cascading failures, which could delay the recovery effort. Thus, the existing interrelationship between critical infrastructures requires that the entire interconnected network be considered when managing reconstruction activities during the post-disaster recovery period.

Relevance: 30.00%

Abstract:

Provision of network infrastructure to meet rising network peak demand is increasing the cost of electricity. Addressing this demand is a major imperative for Australian electricity agencies. The network peak demand model reported in this paper provides a quantified decision support tool and a means of understanding the key influences and impacts on network peak demand. An investigation of the system factors impacting residential consumers’ peak demand for electricity was undertaken in Queensland, Australia. Technical factors, such as the customers’ location, housing construction and appliances, were combined with social factors, such as household demographics, culture, trust and knowledge, and Change Management Options (CMOs) such as tariffs, price, managed supply, etc., in a conceptual ‘map’ of the system. A Bayesian network was used to quantify the model and provide insights into the major influential factors and their interactions. The model was also used to examine the reduction in network peak demand under different market-based and government interventions in various customer locations of interest, and to investigate the relative importance of instituting programs that build trust and knowledge through well-designed customer-industry engagement activities. The Bayesian network was implemented via a spreadsheet with a tick-box interface. The model combined available data from industry-specific and public sources with relevant expert opinion. The results revealed that the most effective intervention strategies involve combining particular CMOs with associated education and engagement activities. The model demonstrated the importance of designing interventions that take into account the interactions of the various elements of the socio-technical system. The options that provided the greatest impact on peak demand were Off-Peak Tariffs, Managed Supply and increases in the price of electricity. The impact on peak demand reduction differed for each of the locations and highlighted that household numbers, demographics and the different climates were significant factors. The model also presented possible network peak demand reductions that would delay any upgrade of networks, resulting in savings for Queensland utilities and ultimately for households. The use of this systems approach, applying Bayesian networks to assist the management of peak demand in different modelled locations in Queensland, provided insights into the most important elements in the system and the intervention strategies that could be tailored to the targeted customer segments.
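
A minimal sketch of how such a socio-technical Bayesian network can be quantified and queried, here with the pgmpy library (depending on the pgmpy version, the model class may be named differently); the variables, states and probabilities are invented placeholders rather than the model reported here:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Toy structure: an off-peak tariff and customer engagement influence peak demand.
model = BayesianNetwork([("Tariff", "PeakDemand"), ("Engagement", "PeakDemand")])

cpd_tariff = TabularCPD("Tariff", 2, [[0.5], [0.5]])       # 0 = flat, 1 = off-peak
cpd_engage = TabularCPD("Engagement", 2, [[0.7], [0.3]])   # 0 = low, 1 = high
cpd_demand = TabularCPD(
    "PeakDemand", 2,
    # P(PeakDemand | Tariff, Engagement) -- placeholder numbers only
    [[0.2, 0.4, 0.5, 0.7],   # low peak demand
     [0.8, 0.6, 0.5, 0.3]],  # high peak demand
    evidence=["Tariff", "Engagement"], evidence_card=[2, 2],
)
model.add_cpds(cpd_tariff, cpd_engage, cpd_demand)
assert model.check_model()

# Query: how does an off-peak tariff plus high engagement shift peak demand?
inference = VariableElimination(model)
print(inference.query(["PeakDemand"], evidence={"Tariff": 1, "Engagement": 1}))
```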

Relevance: 30.00%

Abstract:

The window of opportunity is a concept critical to rheumatoid arthritis treatment. Early treatment changes the outcome of rheumatoid arthritis treatment, in that response rates are higher with earlier disease-modifying anti-rheumatic drug treatment and damage is substantially reduced. Axial spondyloarthritis is an inflammatory axial disease encompassing both nonradiographic axial spondyloarthritis and established ankylosing spondylitis. In axial spondyloarthritis, studies of magnetic resonance imaging as well as tumor necrosis factor inhibitor treatment and withdrawal studies all suggest that early effective suppression of inflammation has the potential to reduce radiographic damage. This potential would suggest that the concept of a window of opportunity is relevant not only to rheumatoid arthritis but also to axial spondyloarthritis. The challenge now remains to identify high-risk patients early and to commence treatment without delay. Developments in risk stratification include new classification criteria, identification of clinical risk factors, biomarkers, genetic associations, potential antibody associations and an ankylosing spondylitis-specific microbiome signature. Further research needs to focus on the evidence for early intervention and the early identification of high-risk individuals.

Relevance: 30.00%

Abstract:

Background Genetic testing is recommended when the probability of a disease-associated germline mutation exceeds 10%. Germline mutations are found in approximately 25% of individuals with phaeochromocytoma (PCC) or paraganglioma (PGL); however, genetic heterogeneity for PCC/PGL means many genes may require sequencing. A phenotype-directed iterative approach may limit costs but may also delay diagnosis, and will not detect mutations in genes not previously associated with PCC/PGL. Objective To assess whether whole exome sequencing (WES) was efficient and sensitive for mutation detection in PCC/PGL. Methods Whole exome sequencing was performed on blinded samples from eleven individuals with PCC/PGL and known mutations. Illumina TruSeq™ (Illumina Inc, San Diego, CA, USA) was used for exome capture of seven samples, and NimbleGen SeqCap EZ v3.0 (Roche NimbleGen Inc, Basel, Switzerland) for five samples (one sample was repeated). Massive parallel sequencing was performed on multiplexed samples. Variants were called from the sequencing data using the Genome Analysis Toolkit and annotated using ANNOVAR. Data were assessed for coding variants in RET, NF1, VHL, SDHD, SDHB, SDHC, SDHA, SDHAF2, KIF1B, TMEM127, EGLN1 and MAX. Target capture of five exome capture platforms was compared. Results Six of seven mutations were detected using Illumina TruSeq™ exome capture. All five mutations were detected using the NimbleGen SeqCap EZ v3.0 platform, including the mutation missed by Illumina TruSeq™ capture. Target capture for exons in known PCC/PGL genes differs substantially between platforms. Exome sequencing was inexpensive (<$A800 per sample for reagents) and rapid (results <5 weeks from sample reception). Conclusion Whole exome sequencing is sensitive, rapid and efficient for the detection of PCC/PGL germline mutations. However, capture platform selection is critical to maximize sensitivity.
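
A minimal sketch of the panel-filtering step described above, restricting annotated coding variants to the twelve PCC/PGL genes; the file layout and column names follow common ANNOVAR table output but are assumptions rather than the study's exact pipeline:

```python
import csv

PCC_PGL_GENES = {
    "RET", "NF1", "VHL", "SDHD", "SDHB", "SDHC", "SDHA",
    "SDHAF2", "KIF1B", "TMEM127", "EGLN1", "MAX",
}

def panel_variants(annotated_tsv, gene_column="Gene.refGene",
                   func_column="ExonicFunc.refGene"):
    """Yield rows of a tab-delimited annotation table whose gene annotation
    falls in the PCC/PGL panel and whose exonic function is non-synonymous.
    Column names are assumptions about the annotation output format."""
    with open(annotated_tsv, newline="") as handle:
        for row in csv.DictReader(handle, delimiter="\t"):
            genes = set(row.get(gene_column, "").split(";"))
            coding_change = row.get(func_column, "") not in ("", ".", "synonymous SNV")
            if genes & PCC_PGL_GENES and coding_change:
                yield row

# Example:
# for variant in panel_variants("sample.hg19_multianno.txt"):
#     print(variant["Gene.refGene"], variant["ExonicFunc.refGene"])
```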

Relevance: 30.00%

Abstract:

The restructuring of the crop agriculture industry over the past two decades has enabled patent holders to exclude, prevent and deter others from using certain research tools, and to delay or block further follow-on inventions.

Relevance: 30.00%

Abstract:

Earlier studies have shown that the speed of information transmission developed radically during the 19th century. The fast development was mainly due to the change from sailing ships and horse-drawn coaches to steamers and railways, as well as the telegraph. The speed of information transmission has normally been measured by calculating the duration between writing and receiving a letter, or between an important event and the time when the news was published elsewhere. As overseas mail was generally carried by ships, the history of communications and maritime history are closely related. This study also brings a postal-historical aspect to the academic discussion. Additionally, another new aspect is included. In business enterprises, information flows generally consisted of multiple transactions. Although fast one-way information was often crucial, e.g. news of a changing market situation, at least equally important was the possibility to react rapidly. To examine the development of business information transmission, the duration of mail transport has been measured by a systematic and commensurable method, using consecutive information circles per year as the principal tool for measurement. The study covers a period of six decades, several of the world's most important trade routes and different mail-carrying systems operated by merchant ships, sailing packets and several nations' steamship services. The main sources have been the sailing data of mail-carrying ships and the correspondence of several merchant houses in England. As the world's main trade routes had their specific historical backgrounds with different businesses, interests and needs, the systems for information transmission did not develop similarly or simultaneously. It was a process lasting several decades, initiated by the idea of organizing sailings in a regular line system. The evolution proceeded generally as follows: originally there was a more or less irregular system, then a regular system and finally a more frequent regular system of mail services. The trend was from sail to steam, but both these means of communication improved following the same scheme. Faster sailings alone did not radically improve the number of consecutive information circles per year if the communication was not frequent enough. Neither did improved frequency advance the information circulation if the trip was very long or if the sailings overlapped instead of complementing each other. The speed of information transmission could be improved by speeding up the voyage itself (technological improvements, minimizing the waiting time at ports of call, etc.) but especially by organizing sailings so that the recipients had the possibility to reply to arriving mail without unnecessary delay. It took two to three decades before the mail-carrying shipping companies were able to organize their sailings in an optimal way. Strategic shortcuts over isthmuses (e.g. Panama, Suez), together with cooperation between steamships and railways, enabled the most effective improvements in global communications before the introduction of the telegraph.
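
A small worked example of the "consecutive information circles per year" measure described above, assuming one circle is a complete out-and-back exchange (outbound transport, reply handling, return transport); the durations are illustrative, not figures from the study:

```python
def information_circles_per_year(outbound_days, reply_handling_days, return_days):
    """Consecutive information circles per year: how many complete
    question-and-answer exchanges fit into one year, given one-way
    transport times and the time the correspondent needs to reply."""
    circle_days = outbound_days + reply_handling_days + return_days
    return 365 / circle_days

# Illustrative: a 30-day outbound passage, a 5-day turnaround and a 35-day
# return passage allow only ~5.2 complete exchanges per year.
print(round(information_circles_per_year(30, 5, 35), 1))
```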

Relevance: 30.00%

Abstract:

New antiretroviral drugs that offer large genetic barriers to resistance, such as the recently approved inhibitors of HIV-1 protease, tipranavir and darunavir, present promising weapons to avert the failure of current therapies for HIV infection. Optimal treatment strategies with the new drugs, however, are yet to be established. A key limitation is the poor understanding of the process by which HIV surmounts large genetic barriers to resistance. Extant models of HIV dynamics are predicated on the predominance of deterministic forces underlying the emergence of resistant genomes. In contrast, stochastic forces may dominate, especially when the genetic barrier is large, and delay the emergence of resistant genomes. We develop a mathematical model of HIV dynamics under the influence of an antiretroviral drug to predict the waiting time for the emergence of genomes that carry the requisite mutations to overcome the genetic barrier of the drug. We apply our model to describe the development of resistance to tipranavir in in vitro serial passage experiments. Model predictions of the times of emergence of different mutant genomes with increasing resistance to tipranavir are in quantitative agreement with experiments, indicating that our model captures the dynamics of the development of resistance to antiretroviral drugs accurately. Further, model predictions provide insights into the influence of underlying evolutionary processes such as recombination on the development of resistance, and suggest guidelines for drug design: drugs that offer large genetic barriers to resistance with resistance sites tightly localized on the viral genome and exhibiting positive epistatic interactions maximally inhibit the emergence of resistant genomes.
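
A heavily simplified stochastic sketch of the waiting-time question discussed above: how many generations pass before a genome carrying all k mutations needed to cross a genetic barrier first appears. It ignores intermediate mutants, fitness effects and recombination, so it illustrates the stochastic-waiting-time concept rather than the paper's model:

```python
import math
import random

def waiting_time_generations(n_genomes=1e5, mu=3e-5, k=2, rng=random):
    """Random waiting time (in generations) until at least one genome carrying
    all k required mutations is produced, assuming each of n_genomes new
    genomes per generation independently acquires each required mutation with
    probability mu (no intermediate mutants, no recombination). The waiting
    time is geometric with per-generation success probability p."""
    p = 1.0 - (1.0 - mu ** k) ** n_genomes
    u = rng.random()
    return max(1, math.ceil(math.log(1.0 - u) / math.log(1.0 - p)))

# Larger genetic barriers (k) push the expected waiting time up sharply.
for k in (1, 2, 3):
    waits = [waiting_time_generations(k=k) for _ in range(1000)]
    print(k, round(sum(waits) / len(waits), 1))
```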

Relevance: 30.00%

Abstract:

In this paper, we generalize the existing rate-one space-frequency (SF) and space-time-frequency (STF) code constructions. The objective of this exercise is to provide a systematic design of full-diversity STF codes with high coding gain. Under this generalization, STF codes are formulated as linear transformations of data. Conditions on these linear transforms are then derived so that the resulting STF codes achieve full diversity and high coding gain with a moderate decoding complexity. Many of these conditions involve channel parameters like the delay profile (DP) and temporal correlation. When these quantities are not available at the transmitter, the design of codes that exploit full diversity on channels with arbitrary DP and temporal correlation is considered. A complete characterization of a class of such robust codes is provided and their bit error rate (BER) performance is evaluated. On the other hand, when the channel DP and temporal correlation are available at the transmitter, the linear transforms are optimized to maximize the coding gain of full-diversity STF codes. The BER performance of such optimized codes is shown to be better than that of existing codes.
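
A toy numeric sketch of the "STF code as a linear transformation of data" formulation described above: a block of data symbols is rotated by a unitary transform and spread across transmit antennas and subcarriers. The phase-rotated DFT used here is an illustrative placeholder; the conditions for full diversity and maximal coding gain derived in the paper depend on the channel's delay profile and temporal correlation, which this sketch does not model:

```python
import numpy as np

def stf_codeword(data, n_tx=2, n_sub=2):
    """Toy rate-one STF-style mapping: apply a unitary linear transform
    (a phase-rotated DFT) to a block of data symbols, then arrange the
    transformed symbols across transmit antennas (rows) and subcarriers
    (columns) for one OFDM symbol period."""
    L = n_tx * n_sub
    data = np.asarray(data, dtype=complex)
    assert data.size == L, "one codeword consumes n_tx * n_sub symbols"
    dft = np.fft.fft(np.eye(L)) / np.sqrt(L)                    # unitary DFT matrix
    rot = np.diag(np.exp(1j * np.pi * np.arange(L) / (2 * L)))  # diagonal phase rotation
    transformed = dft @ rot @ data                              # linear transform of data
    return transformed.reshape(n_tx, n_sub)

# Example: map four QPSK symbols onto 2 antennas x 2 subcarriers.
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
print(stf_codeword(qpsk))
```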