663 results for Response prediction


Relevance: 20.00%

Abstract:

Genomic instability underlies the transformation of host cells toward malignancy, promotes the development of invasion and metastasis and shapes the response of established cancer to treatment. In this review, we discuss recent advances in our understanding of genomic stability in squamous cell carcinoma of the head and neck (HNSCC), with an emphasis on DNA repair pathways. HNSCC is characterized by distinct profiles in genome stability between similarly staged cancers that are reflected in risk, treatment response and outcomes. Defective DNA repair generates chromosomal derangement that can cause subsequent alterations in gene expression, and is a hallmark of progression toward carcinoma. Variable functionality of an increasing spectrum of repair gene polymorphisms is associated with increased cancer risk, while aetiological factors such as human papillomavirus, tobacco and alcohol induce significantly different behaviour in the resulting malignancies, underpinned by differences in genomic stability. Targeted inhibition of signalling receptors has proven to be a clinically validated therapy, and protein expression of other DNA repair and signalling molecules associated with cancer behaviour could potentially provide a more refined clinical model for prognosis and treatment prediction. Development and expansion of current genomic stability models is furthering our understanding of HNSCC pathophysiology and uncovering new, promising treatment strategies. © 2013 Glenn Jenkins et al.

Relevance: 20.00%

Abstract:

This paper presents the response of pile foundations to ground shocks induced by surface explosions, using fully coupled, non-linear dynamic computer simulation techniques together with different material models for the explosive, air, soil and pile. It uses the Arbitrary Lagrangian-Eulerian (ALE) coupling formulation with appropriate material state parameters and equations. Blast wave propagation in soil, horizontal pile deformation and pile damage are presented to facilitate failure evaluation of piles. The effects of end restraint at the pile head and of the number and spacing of piles within a group on their blast response and potential failure are investigated. The techniques developed and applied in this paper, and its findings, provide valuable information on the blast response and failure evaluation of piles and will guide their future analysis and design.

Relevance: 20.00%

Abstract:

Dermal wound repair involves complex interactions between cells, cytokines and mechanics to close injuries to the skin. In particular, we investigate the contribution of fibroblasts, myofibroblasts, TGFβ, collagen and local tissue mechanics to wound repair in the human dermis. We develop a morphoelastic model where a realistic representation of tissue mechanics is key, and a fibrocontractive model that involves a reasonable approximation to the true kinetics of the important bioactive species. We use each of these descriptions to elucidate the mechanisms that generate pathologies such as hypertrophic scars, contractures and keloids. We find that for hypertrophic scar and contracture development, factors regulating the myofibroblast phenotype are critical, with heightened myofibroblast activation, reduced myofibroblast apoptosis or prolonged inflammation all predicted as mediators for scar hypertrophy and contractures. Prevention of these pathologies is predicted when myofibroblast apoptosis is induced, myofibroblast activation is blocked or TGFβ is neutralised. To investigate keloid invasion, we develop a caricature representation of the fibrocontractive model and find that TGFβ spread is the driving factor behind keloid growth. Blocking activation of TGFβ is found to cause keloid regression. Thus, we recommend myofibroblasts and TGFβ as targets for clinicians when developing intervention strategies for prevention and cure of fibrotic scars.
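
To make the kinetic argument concrete, here is a minimal toy sketch, emphatically not the authors' morphoelastic or fibrocontractive equations: a three-variable ODE caricature (fibroblasts, myofibroblasts, TGFβ) in which lowering the myofibroblast apoptosis rate leaves a larger persistent myofibroblast population, the qualitative mechanism implicated above in scar hypertrophy and contracture. All parameter values are invented.

```python
# Toy caricature of fibrocontractive kinetics (not the paper's model):
# fibroblasts n, myofibroblasts m, TGF-beta c. Activation of fibroblasts is
# driven by TGF-beta, myofibroblasts undergo apoptosis at rate d_m, and
# TGF-beta is secreted by myofibroblasts and decays. Parameters are invented.
from scipy.integrate import solve_ivp

def rhs(t, y, k_act=0.5, d_m=0.2, s_c=0.3, d_c=1.0, r_n=0.1, K=1.0):
    n, m, c = y
    dn = r_n * n * (1 - n / K) - k_act * c * n   # fibroblast growth minus activation
    dm = k_act * c * n - d_m * m                 # activation minus myofibroblast apoptosis
    dc = s_c * m - d_c * c                       # TGF-beta secretion and decay
    return [dn, dm, dc]

# Compare normal apoptosis with reduced apoptosis (a caricature of hypertrophy).
for d_m in (0.2, 0.05):
    sol = solve_ivp(rhs, (0, 200), [1.0, 0.0, 0.1], args=(0.5, d_m))
    print(f"apoptosis rate {d_m}: final myofibroblast level {sol.y[1, -1]:.3f}")
```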

Relevance: 20.00%

Abstract:

In our rejoinder to Don Weatherburn's paper, “Law and Order Blues”, we do not take issue with his advocacy of the need to take crime seriously and to foster a more rational approach to the problems it poses. Differences do emerge, however, (1) over his claim that he is willing to do so whilst we (in our different ways) are not; and (2) over the question of what this involves. Of particular concern is the way in which his argument proceeds by a combination of simple misrepresentation of the positions it seeks to disparage, and silence concerning issues of real substance where intellectual debate and exchange would be welcome and useful. Our paper challenges, in turn, the misrepresentation of Indermaur's analysis of trends in violent crime, the misrepresentation of Hogg and Brown's Rethinking Law and Order, the misrepresentation of the findings of some of the research into the effectiveness of punitive policies, and the silence on sexual assault in “Law and Order Blues”. We suggest that this silence on sexual assault reflects a more widespread unwillingness to acknowledge the methodological problems that arise in the measurement of crime, because such problems severely limit the extent to which confident assertions can be made about prevalence and trends.

Relevance: 20.00%

Abstract:

The Chinese government should be commended for its open, concerted, and rapid response to the recent H7N9 influenza outbreak. However, the first known case was not reported until 48 days after disease onset [1]. Although the difficulties in detecting the virus and the lack of suitable diagnostic methods have been the focus of discussion [2], systematic limitations that may have contributed to this delay have hardly been discussed. The detection speed of surveillance systems is limited by the highly structured nature of information flow and the hierarchical organisation of these systems. Flu surveillance usually relies on notification to a central authority of laboratory-confirmed cases, or on presentations to sentinel practices for flu-like illness. Each step in this pathway presents a bottleneck at which information and time can be lost; this limitation must be dealt with...

Relevance: 20.00%

Abstract:

Due to the health impacts caused by exposure to air pollutants in urban areas, monitoring and forecasting of air quality parameters have become an important topic in atmospheric and environmental research. Knowledge of the dynamics and complexity of air pollutant behaviour has made artificial intelligence models a useful tool for more accurate pollutant concentration prediction. This paper focuses on an innovative method of daily air pollution prediction that combines a Support Vector Machine (SVM) as the predictor with Partial Least Squares (PLS) as a data selection tool, based on measured CO concentrations. CO concentrations from the Rey monitoring station in the south of Tehran, from Jan. 2007 to Feb. 2011, were used to test the effectiveness of this method. Hourly CO concentrations were predicted using the SVM and the hybrid PLS–SVM models. Similarly, daily CO concentrations were predicted from the same four years of measured data. The results demonstrate that both models have good prediction ability; however, the hybrid PLS–SVM model is more accurate. In the analysis presented in this paper, statistical measures including the relative mean error, the root mean squared error and the mean absolute relative error were employed to compare the performance of the models. After size reduction, the errors decrease and the coefficients of determination increase, from 56–81% for the SVM model to 65–85% for the hybrid PLS–SVM model. It was also found that the hybrid PLS–SVM model required less computational time than the SVM model, as expected, supporting the more accurate and faster prediction ability of the hybrid PLS–SVM model.
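
As a rough sketch of the hybrid idea, assuming scikit-learn's PLSRegression as the data-reduction step feeding a support vector regressor, the snippet below compares an SVM on raw features with a PLS–SVM combination on synthetic data; the features, target and hyperparameters are placeholders, not the paper's configuration.

```python
# Hedged sketch of the hybrid approach: PLS scores as reduced inputs to an
# SVM regressor, compared against an SVM on the raw features. The data,
# lag structure and hyperparameters are placeholders, not the paper's setup.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
# Placeholder design matrix, e.g. lagged CO readings and meteorological inputs.
X = rng.normal(size=(2000, 24))
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=2000)   # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Plain SVM on all 24 features.
svm_only = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
pred_svm = svm_only.predict(X_te)

# Hybrid: reduce to a few PLS components first, then fit the SVM on the scores.
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
svm_on_pls = SVR(kernel="rbf", C=10.0).fit(pls.transform(X_tr), y_tr)
pred_hybrid = svm_on_pls.predict(pls.transform(X_te))

for name, pred in [("SVM", pred_svm), ("PLS-SVM", pred_hybrid)]:
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.3f}, R2={r2_score(y_te, pred):.3f}")
```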

Relevance: 20.00%

Abstract:

A victim of phishing emails could be subjected to monetary loss and identity theft. This paper investigates the different types of phishing email victims, with the goal of strengthening such victims' defences. To obtain this kind of information, an experiment in which a phishing email was sent to participants was conducted. Quantitative and qualitative methods were used to collect users' information, and a model for detecting deception was employed to understand victims' behaviour. This paper reports the qualitative results. The findings suggest that victims of phishing emails do not always exhibit the same vulnerability. Being a victim results from three weaknesses in the detection process: (1) lack of knowledge; (2) a weak confirmation channel; and (3) a high propensity towards risk-taking. It is therefore suggested that users be provided with suitable confirmation channels and be more risk-averse in their behaviour so that they do not fall victim to phishing emails.

Relevance: 20.00%

Abstract:

The aim of this work is to develop a demand-side-response model that assists electricity consumers exposed to the market price to independently and proactively manage air-conditioning peak electricity demand. The main contribution of this research is to show how consumers can optimize the energy cost of the air-conditioning load considering several cases, e.g. a normal price, a price spike, and the probability of a price spike. The model also investigates how air conditioning can apply a pre-cooling method when there is a substantial risk of a price spike. The results indicate the potential of the scheme to achieve financial benefits for consumers and to target the best economic performance for electricity generation, distribution and transmission. The model was tested with Queensland electricity market data from the Australian Energy Market Operator and Brisbane temperature data from the Bureau of Statistics for hot days from 2011 to 2012.
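
A toy illustration of the pre-cooling decision under price-spike risk (all prices, energies and the spike probability are invented, not values from the paper): pre-cool when the expected saving in the spike-prone peak hour outweighs the extra off-peak energy cost.

```python
# Toy expected-cost comparison for an air-conditioning pre-cooling decision.
# All numbers (prices, energy, spike probability) are illustrative placeholders.
normal_price = 0.08        # $/kWh in the pre-cooling (off-peak) hour
peak_price = 0.12          # $/kWh in the peak hour without a spike
spike_price = 13.00        # $/kWh if a price spike occurs in the peak hour
p_spike = 0.05             # assumed probability of a spike in the peak hour

peak_energy = 3.0          # kWh the AC would draw in the peak hour without pre-cooling
precool_energy = 2.0       # extra kWh drawn earlier to pre-cool the space
reduced_peak_energy = 1.0  # kWh still needed in the peak hour after pre-cooling

expected_peak_price = (1 - p_spike) * peak_price + p_spike * spike_price

cost_no_precool = peak_energy * expected_peak_price
cost_precool = precool_energy * normal_price + reduced_peak_energy * expected_peak_price

print(f"expected cost without pre-cooling: ${cost_no_precool:.2f}")
print(f"expected cost with pre-cooling:    ${cost_precool:.2f}")
print("pre-cool" if cost_precool < cost_no_precool else "do not pre-cool")
```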

Relevance: 20.00%

Abstract:

In hypercompetition, firms that are agile, i.e. better at sensing and responding to customer requirements, tend to be more successful and achieve supernormal profits. Despite the widely accepted importance of customer agility, research on this construct is limited, and the limited research has predominantly focussed on the firm's perspective of agility. However, we propose that customers are better positioned to determine how well a firm is responding to their requirements (i.e. a firm's customer agility). Taking the customers' standpoint, we address the issue of sense-and-respond alignment from two perspectives: matching and mediating. Based on data collected from customers in a field study, we tested hypotheses pertaining to the two forms of alignment using polynomial regression and response surface methodology. The results provide a good explanation of the role of both forms of alignment in customer satisfaction. Implications for research and practice are discussed.
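
As a rough illustration of the analysis step named above, the sketch below fits the quadratic polynomial regression that response surface methodology builds on, with satisfaction regressed on sensing (S), responding (R), their squares and their interaction; the variable names and synthetic data are hypothetical, not the study's survey instrument.

```python
# Sketch of the quadratic polynomial regression commonly paired with
# response surface analysis: SAT ~ S + R + S^2 + S*R + R^2.
# The data frame below is synthetic; in the study these would be survey scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "S": rng.normal(size=n),   # perceived sensing of customer requirements
    "R": rng.normal(size=n),   # perceived responding to those requirements
})
# Synthetic satisfaction: highest when sensing and responding are aligned (matching view).
df["SAT"] = 3 - (df["S"] - df["R"]) ** 2 + 0.3 * rng.normal(size=n)

model = smf.ols("SAT ~ S + R + I(S**2) + I(S*R) + I(R**2)", data=df).fit()
print(model.params)   # the five surface coefficients used to assess (mis)alignment effects
```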

Relevance: 20.00%

Abstract:

MapReduce frameworks such as Hadoop are well suited to handling large sets of data that can be processed separately and independently, with canonical applications in information retrieval and sales record analysis. Rapid advances in sequencing technology have ensured an explosion in the availability of genomic data, with a consequent rise in the importance of large-scale comparative genomics, often involving operations and data relationships that deviate from the classical MapReduce structure. This work examines the application of Hadoop to patterns of this nature, using as our focus a well-established workflow for identifying promoters (binding sites for regulatory proteins) across multiple gene regions and organisms, coupled with the unifying step of assembling these results into a consensus sequence. Our approach demonstrates the utility of Hadoop for problems of this nature, showing how the tyranny of the "dominant decomposition" can be at least partially overcome. It also demonstrates how load balance and the granularity of parallelism can be optimized by pre-processing that splits and reorganizes input files, allowing a wide range of related problems to be brought under the same computational umbrella.
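
The following is a schematic, single-process Python emulation of the decomposition described above (not the actual Hadoop workflow): the map stage emits candidate promoter hits keyed by gene region, the shuffle groups them, and the reduce stage assembles a simple majority-vote consensus per region. The motif matching and example records are placeholders.

```python
# Single-process emulation of the MapReduce decomposition sketched above.
# mapper: (organism, region, sequence) -> (region, candidate site)
# reducer: (region, [sites]) -> per-position majority consensus
# Motif matching here is a trivial placeholder for the real promoter search.
from collections import Counter, defaultdict

def mapper(record):
    organism, region, seq = record
    motif = "TATA"                                     # placeholder motif, not the real model
    for i in range(len(seq) - len(motif) + 1):
        if seq[i:i + len(motif)] == motif:
            yield region, seq[i:i + len(motif) + 4]    # emit hit plus flanking bases, keyed by region

def shuffle(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped.items()

def reducer(region, sites):
    width = min(len(s) for s in sites)
    consensus = "".join(Counter(s[i] for s in sites).most_common(1)[0][0]
                        for i in range(width))
    return region, consensus

records = [
    ("org1", "geneA", "CCGTATAAAAGGC"),
    ("org2", "geneA", "TTGTATAAATCGA"),
    ("org1", "geneB", "ACTATAGGGCTTA"),
]
pairs = (kv for rec in records for kv in mapper(rec))
for region, sites in shuffle(pairs):
    print(reducer(region, sites))
```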

Relevance: 20.00%

Abstract:

This study assessed the revised Behavioural Inhibition System (BIS), as conceptualised by Gray and McNaughton’s (2000) revised RST, by exposing participants to a loss-framed road safety message (emphasising the negative consequences of speeding behaviour) and a high performance motor vehicle promotional advertisement. Licensed young drivers (N = 40, aged 17–25 years) were randomly allocated to view either the message or both the message and advertisement. Participants then completed a computerised lexical decision task prior to completing three personality measures: Corr-Cooper RST-PQ, CARROT and Q-Task. It was predicted that those with a stronger BIS would demonstrate greater processing of these mixed message cues compared to weaker BIS individuals, and that this BIS effect would only be observed in the mixed cues condition (due to simultaneous activation of the incentive and punishment systems). Preliminary findings will be discussed in the context of the influence of personality traits on health message processing.

Relevance: 20.00%

Abstract:

Objectives: The goal of this article is to examine whether or not the results of the Queensland Community Engagement Trial (QCET), a randomized controlled trial that tested the impact of procedural justice policing on citizen attitudes toward police, were affected by different types of nonresponse bias. Method: We use two methods (the Cochrane and Elffers methods) to explore nonresponse bias. First, we assess the impact of the low response rate by examining the effects of nonresponse-group differences between the experimental and control conditions and the pooled variance under different scenarios. Second, we assess the degree to which item response rates are influenced by the control and experimental conditions. Results: Our analysis of the QCET data suggests that our substantive findings are not influenced by the low response rate in the trial. The results are robust even under extreme conditions, and the statistical significance of the results would only be compromised in cases where the pooled variance was much larger for the nonresponse group and the difference between experimental and control conditions was greatly diminished. We also find that there were no biases in the item response rates across the experimental and control conditions. Conclusion: RCTs that involve field survey responses, like QCET, are potentially compromised by low response rates and by the way item response rates might be influenced by the control or experimental conditions. Our results show that the QCET findings were not sensitive to the overall low response rate across the experimental and control conditions, and the item response rates were not significantly different across the experimental and control groups. Overall, our analysis suggests that the results of QCET are robust and any biases in the survey responses do not significantly influence the main experimental findings.
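
As a toy of the kind of scenario analysis described (not the Cochrane or Elffers procedures themselves), the sketch below asks how much larger the nonresponders' variance, and how much smaller their treatment-control difference, would have to be before a significant effect among respondents is washed out; all numbers are invented, not QCET estimates.

```python
# Toy sensitivity check in the spirit described above: combine observed
# respondent estimates with assumed nonresponder scenarios and see when
# the treatment-control difference loses statistical significance.
# All numbers are invented for illustration; they are not the QCET estimates.
import numpy as np
from scipy import stats

n_resp, n_nonresp = 200, 800     # respondents vs. hypothetical nonresponders per arm
diff_resp, sd_resp = 0.30, 1.0   # observed treatment-control difference among respondents

for var_inflation in (1.0, 2.0, 4.0):
    for diff_nonresp in (0.30, 0.10, 0.0):
        n = n_resp + n_nonresp
        # Pool respondent and assumed nonrespondent strata within each arm.
        diff = (n_resp * diff_resp + n_nonresp * diff_nonresp) / n
        pooled_var = (n_resp * sd_resp**2 + n_nonresp * var_inflation * sd_resp**2) / n
        se = np.sqrt(2 * pooled_var / n)   # SE of the difference between two arms of size n
        z = diff / se
        p = 2 * stats.norm.sf(abs(z))
        print(f"inflation={var_inflation}, nonresp diff={diff_nonresp}: z={z:.2f}, p={p:.3f}")
```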

Relevance: 20.00%

Abstract:

This article presents new theoretical and empirical evidence on the forecasting ability of prediction markets. We develop a model that predicts that the time until expiration of a prediction market should negatively affect the accuracy of prices as a forecasting tool in the direction of a ‘favourite/longshot bias’. That is, high-likelihood events are underpriced, and low-likelihood events are over-priced. We confirm this result using a large data set of prediction market transaction prices. Prediction markets are reasonably well calibrated when time to expiration is relatively short, but prices are significantly biased for events farther in the future. When time value of money is considered, the miscalibration can be exploited to earn excess returns only when the trader has a relatively low discount rate.
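
A small worked example of the exploitation condition implied above, with invented numbers: a miscalibrated contract is only worth buying if its discounted expected payout exceeds its price, so the apparent edge disappears as the trader's discount rate rises.

```python
# Toy check of when miscalibration is exploitable after discounting.
# price: market price of a contract paying $1 if the event occurs.
# true_p: assumed true (calibrated) probability; t: years to expiration.
# All values are illustrative, not estimates from the paper's data set.
def excess_return(price, true_p, t, discount_rate):
    pv_payout = true_p / (1 + discount_rate) ** t   # discounted expected payout
    return pv_payout / price - 1                    # expected return over the holding period

# A favourite under-priced by the longshot bias, one year from expiration:
for r in (0.02, 0.10, 0.25):
    print(f"discount rate {r:.0%}: excess return {excess_return(0.80, 0.90, 1.0, r):+.1%}")
```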

Relevance: 20.00%

Abstract:

It is increasingly apparent that sea-level data (e.g. microfossil transfer functions, dated coral microatolls and direct observations from satellites and tide gauges) vary temporally and spatially at regional to local scales, thus limiting our ability to model future sea-level rise for many regions. Understanding sea-level response at ‘far-field’ locations at regional scales is fundamental for formulating more relevant sea-level rise susceptibility models within these regions under future global change projections. Fossil corals and reefs in particular are valuable tools for reconstructing past sea levels and possible environmental phase shifts beyond the temporal constraints of instrumental records. This study used abundant surface geochronological data based on in situ subfossil corals and precise elevation surveys to determine previous sea level in Moreton Bay, eastern Australia, a far-field site. A total of 64 U-Th dates show that relative sea level was at least 1.1 m above modern lowest astronomical tide (LAT) from at least ~6600 cal. yr BP. Furthermore, a rapid, synchronous demise in coral reef growth occurred in Moreton Bay ~5800 cal. yr BP, coinciding with reported reef hiatus periods in other areas around the Indo-Pacific region. Evaluating past reef growth patterns and phases allows for a better interpretation of anthropogenic forcing versus natural environmental/climatic cycles that affect reef formation and demise at all scales, and may allow better prediction of reef response to future global change.