733 results for RIGHT-CENSORED DATA


Relevance: 20.00%

Abstract:

The FLEX study demonstrated that the addition of cetuximab to chemotherapy significantly improved overall survival in the first-line treatment of patients with advanced non-small cell lung cancer (NSCLC). In the FLEX intention-to-treat (ITT) population, we investigated the prognostic significance of particular baseline characteristics. Individual patient data from the treatment arms of the ITT population of the FLEX study were combined. Univariable and multivariable Cox regression models were used to investigate variables with potential prognostic value. The ITT population comprised 1125 patients. In the univariable analysis, longer median survival times were apparent for females compared with males (12.7 vs 9.3 months); patients with an Eastern Cooperative Oncology Group performance status (ECOG PS) of 0 compared with 1 compared with 2 (13.5 vs 10.6 vs 5.9 months); never smokers compared with former smokers compared with current smokers (14.6 vs 11.1 vs 9.0 months); Asians compared with Caucasians (19.5 vs 9.6 months); patients with adenocarcinoma compared with squamous cell carcinoma (12.4 vs 9.3 months); and those with metastases to one site compared with two sites compared with three or more sites (12.4 vs 9.8 vs 6.4 months). Age (<65 vs ≥65 years), tumor stage (IIIB with pleural effusion vs IV) and percentage of tumor cells expressing EGFR (<40% vs ≥40%) were not identified as possible prognostic factors in relation to survival time. In multivariable analysis, a stepwise selection procedure identified age (<65 vs ≥65 years), gender, ECOG PS, smoking status, region, tumor histology, and number of organs involved as independent factors of prognostic value. In summary, in patients with advanced NSCLC enrolled in the FLEX study, and consistent with previous analyses, particular patient and disease characteristics at baseline were shown to be independent factors of prognostic value. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. © 2012 Elsevier Ireland Ltd.
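
As an illustration of the kind of univariable and multivariable Cox regression described above, here is a minimal sketch using the Python lifelines library on right-censored survival data. The file and column names are hypothetical stand-ins, not the FLEX data.

```python
# Sketch: Cox proportional hazards on right-censored data (lifelines).
# 'os_months' is follow-up time; 'event' is 1 for death, 0 for censored.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("flex_like_cohort.csv")  # hypothetical file

# Hypothetical 0/1 indicators and small-integer covariates.
covariates = ["age_ge_65", "female", "ecog_ps", "current_smoker",
              "former_smoker", "asian", "adenocarcinoma", "n_metastatic_sites"]

cph = CoxPHFitter()
cph.fit(df[["os_months", "event"] + covariates],
        duration_col="os_months", event_col="event")
cph.print_summary()  # per-covariate hazard ratios, CIs, p-values
```

A stepwise selection procedure, as in the multivariable analysis, would wrap repeated fits of this kind, adding or dropping covariates according to a significance criterion.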

Relevance: 20.00%

Abstract:

Background: Findings from the phase 3 First-Line ErbituX in lung cancer (FLEX) study showed that the addition of cetuximab to first-line chemotherapy significantly improved overall survival compared with chemotherapy alone (hazard ratio [HR] 0·871, 95% CI 0·762-0·996; p=0·044) in patients with advanced non-small-cell lung cancer (NSCLC). To define patients benefiting most from cetuximab, we studied the association of tumour EGFR expression level with clinical outcome in FLEX study patients. Methods: We used prospectively collected tumour EGFR expression data to generate an immunohistochemistry score for FLEX study patients on a continuous scale of 0-300. We used response data to select an outcome-based discriminatory threshold immunohistochemistry score for EGFR expression of 200. Treatment outcome was analysed in patients with low (immunohistochemistry score <200) and high (≥200) tumour EGFR expression. The primary endpoint in the FLEX study was overall survival. We analysed patients from the FLEX intention-to-treat (ITT) population. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: Tumour EGFR immunohistochemistry data were available for 1121 of 1125 (99·6%) patients from the FLEX study ITT population. High EGFR expression was scored for 345 (31%) evaluable patients and low for 776 (69%) patients. For patients in the high EGFR expression group, overall survival was longer in the chemotherapy plus cetuximab group than in the chemotherapy alone group (median 12·0 months [95% CI 10·2-15·2] vs 9·6 months [7·6-10·6]; HR 0·73, 0·58-0·93; p=0·011), with no meaningful increase in side-effects. We recorded no corresponding survival benefit for patients in the low EGFR expression group (median 9·8 months [8·9-12·2] vs 10·3 months [9·2-11·5]; HR 0·99, 0·84-1·16; p=0·88). A treatment interaction test assessing the difference in the HRs for overall survival between the EGFR expression groups suggested a predictive value for EGFR expression (p=0·044). Interpretation: High EGFR expression is a tumour biomarker that can predict survival benefit from the addition of cetuximab to first-line chemotherapy in patients with advanced NSCLC. Assessment of EGFR expression could offer a personalised treatment approach in this setting. Funding: Merck KGaA. © 2012 Elsevier Ltd.
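
A treatment-by-biomarker interaction of the kind tested here can be sketched by adding a product term to a Cox model. The sketch below again uses the lifelines library; the file and column names are hypothetical, and this is not the FLEX analysis code.

```python
# Sketch: does the treatment hazard ratio differ by EGFR expression group?
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("trial_data.csv")  # hypothetical file
# Dichotomise the 0-300 immunohistochemistry score at the threshold of 200.
df["egfr_high"] = (df["ihc_score"] >= 200).astype(int)
df["tx_x_egfr"] = df["cetuximab"] * df["egfr_high"]  # interaction term

cph = CoxPHFitter()
cph.fit(df[["os_months", "event", "cetuximab", "egfr_high", "tx_x_egfr"]],
        duration_col="os_months", event_col="event")
# The p-value on 'tx_x_egfr' is the treatment interaction test: it asks
# whether the cetuximab effect differs between expression groups.
cph.print_summary()
```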

Relevance: 20.00%

Abstract:

Background: The expansion of cell colonies is driven by a delicate balance of several mechanisms including cell motility, cell-to-cell adhesion and cell proliferation. New approaches that can be used to independently identify and quantify the role of each mechanism will help us understand how each mechanism contributes to the expansion process. Standard mathematical modelling approaches to describe such cell colony expansion typically neglect cell-to-cell adhesion, despite the fact that cell-to-cell adhesion is thought to play an important role. Results: We use a combined experimental and mathematical modelling approach to determine the cell diffusivity, D, cell-to-cell adhesion strength, q, and cell proliferation rate, λ, in an expanding colony of MM127 melanoma cells. Using a circular barrier assay, we extract several types of experimental data and use a mathematical model to independently estimate D, q and λ. In our first set of experiments, we suppress cell proliferation and analyse three different types of data to estimate D and q. We find that standard types of data, such as the area enclosed by the leading edge of the expanding colony and more detailed cell density profiles throughout the expanding colony, do not provide sufficient information to uniquely identify D and q. We find that additional data relating to the degree of cell-to-cell clustering is required to provide independent estimates of q, and in turn D. In our second set of experiments, where proliferation is not suppressed, we use data describing temporal changes in cell density to determine the cell proliferation rate. In summary, we find that our experiments are best described using the range D = 161–243 µm² hour⁻¹, q = 0.3–0.5 (low to moderate strength) and λ = 0.0305–0.0398 hour⁻¹, and with these parameters we can accurately predict the temporal variations in the spatial extent and cell density profile throughout the expanding melanoma cell colony. Conclusions: Our systematic approach to identify the cell diffusivity, cell-to-cell adhesion strength and cell proliferation rate highlights the importance of integrating multiple types of data to accurately quantify the factors influencing the spatial expansion of melanoma cell colonies.
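
The paper's model couples cell diffusion, adhesion and logistic proliferation. As a rough, simplified sketch that omits the adhesion term q entirely, a one-dimensional Fisher-KPP finite-difference simulation using the reported D and λ ranges looks like this:

```python
# Simplified 1D reaction-diffusion (Fisher-KPP) sketch; illustrates how
# D and the proliferation rate set the spreading of a colony, but it is
# NOT the paper's full model (no cell-to-cell adhesion term q).
import numpy as np

D = 200.0        # cell diffusivity, um^2/hour (reported range 161-243)
lam = 0.035      # proliferation rate, /hour (reported range 0.0305-0.0398)
K = 1.0          # carrying capacity (scaled cell density)

dx, dt = 10.0, 0.1   # um, hours; dt < dx**2 / (2*D) for stability
x = np.arange(0.0, 6000.0, dx)
c = np.where(np.abs(x - 3000.0) < 500.0, K, 0.0)   # initial colony

for _ in range(int(48.0 / dt)):                    # simulate 48 hours
    lap = (np.roll(c, 1) - 2.0 * c + np.roll(c, -1)) / dx**2
    c = c + dt * (D * lap + lam * c * (1.0 - c / K))

print("colony width (um):", dx * np.count_nonzero(c > 0.5 * K))
```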

Relevance: 20.00%

Abstract:

The use of hedonic models to estimate the effects of various factors on house prices is well established. This paper examines a number of international hedonic house price models that seek to quantify the effect of infrastructure charges on new house prices. This work is an important factor in the housing affordability debate, with many governments in high-growth areas operating user-pays infrastructure charging policies in tandem with housing affordability objectives, with no empirical evidence on the impact of one on the other. This research finds there is little consistency between existing models or the data sets utilised. Model specification appears dependent upon data availability rather than sound theoretical grounding, which may lead to a lack of external validity.
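
For readers unfamiliar with the technique, a hedonic house price model is typically an ordinary least squares regression of (log) price on property attributes. A minimal sketch with statsmodels follows; the data file and variable names are hypothetical, not taken from any of the reviewed studies.

```python
# Sketch: semi-log hedonic price regression with an infrastructure-charge
# regressor. All column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("new_house_sales.csv")      # hypothetical data set
df["log_price"] = np.log(df["sale_price"])

# The coefficient on 'infra_charge' estimates the marginal effect of
# infrastructure charges on (log) new-house prices, holding the other
# attributes fixed.
model = smf.ols("log_price ~ infra_charge + land_area + bedrooms"
                " + bathrooms + dist_cbd", data=df).fit()
print(model.summary())
```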

Relevance: 20.00%

Abstract:

Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. One possible shortcoming is the difficulty of resolving organelles that display similar behavior during a protocol designed to provide partial enrichment. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients are alone unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
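
The data-combination idea can be sketched simply: concatenate each protein's abundance profile from the two gradients into one feature vector and classify with a non-linear SVM. The sketch below uses scikit-learn on synthetic stand-ins for LOPIT profiles; the array shapes and signal structure are assumptions for illustration only.

```python
# Sketch: marker-protein classification on each gradient alone vs the
# combined (concatenated) profiles. Data are synthetic placeholders.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, n_classes = 600, 6
labels = rng.integers(0, n_classes, n)       # six organelle classes

# Each class has a characteristic (noisy) profile in each gradient.
centers1 = rng.random((n_classes, 12))
centers2 = rng.random((n_classes, 10))
grad1 = centers1[labels] + rng.normal(0, 0.25, (n, 12))
grad2 = centers2[labels] + rng.normal(0, 0.25, (n, 10))

clf = SVC(kernel="rbf", C=10.0, gamma="scale")
for name, X in [("gradient 1", grad1), ("gradient 2", grad2),
                ("combined", np.hstack([grad1, grad2]))]:
    print(name, cross_val_score(clf, X, labels, cv=5).mean().round(2))
```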

Relevance: 20.00%

Abstract:

This thesis describes the development of a robust and novel prototype to address data quality problems relating to outlier data. It thoroughly investigates the problems associated with detecting, assessing and determining the severity of outlier data, and proposes granule-mining based alternative techniques to significantly improve the effectiveness of mining and assessing outlier data.
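
The granule-mining techniques themselves are not detailed in this abstract. As a generic point of comparison only, the sketch below scores outliers with a standard method (isolation forest) from scikit-learn; it is not the thesis's approach.

```python
# Sketch: generic outlier scoring on synthetic data (NOT granule mining).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (200, 3)),   # bulk of the records
               rng.normal(8.0, 1.0, (5, 3))])    # a few outlying records

iso = IsolationForest(random_state=1).fit(X)
scores = -iso.score_samples(X)                   # higher = more anomalous
print("top-5 outlier indices:", np.argsort(scores)[-5:])
```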

Relevance: 20.00%

Abstract:

Acoustic sensing is a promising approach to scaling faunal biodiversity monitoring. Scaling the analysis of audio collected by acoustic sensors is a big data problem. Standard approaches for dealing with big acoustic data include automated recognition and crowd-based analysis. Automatic methods are fast at processing but hard to rigorously design, whilst manual methods are accurate but slow. In particular, manual methods of acoustic data analysis are constrained by a 1:1 time relationship between the data and its analysts: the inherent need to listen to the audio data. This paper demonstrates how the efficiency of crowd-sourced sound analysis can be increased by an order of magnitude through the visual inspection of audio visualised as spectrograms. Experimental data suggest that an analysis speedup of 12× is obtainable for suitable types of acoustic analysis, given that only spectrograms are shown.
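
Rendering audio as a spectrogram for visual rather than aural inspection is straightforward; a minimal sketch with scipy and matplotlib follows. The WAV file path is hypothetical.

```python
# Sketch: render sensor audio as a spectrogram for visual scanning.
import numpy as np
import matplotlib.pyplot as plt
from scipy.io import wavfile
from scipy.signal import spectrogram

fs, audio = wavfile.read("sensor_recording.wav")  # hypothetical file
if audio.ndim > 1:
    audio = audio[:, 0]                           # use one channel if stereo

f, t, Sxx = spectrogram(audio, fs=fs, nperseg=512)
plt.pcolormesh(t, f, 10.0 * np.log10(Sxx + 1e-12), shading="gouraud")
plt.xlabel("Time (s)")
plt.ylabel("Frequency (Hz)")
plt.title("Spectrogram for visual scanning")
plt.show()
```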

Relevance: 20.00%

Abstract:

Early-career engineering academics are encouraged to join and contribute to established research groups at the leading edge of their discipline. This is often facilitated by various staff development and support programs. Given that academics are often appointed primarily on the basis of their research skills and outputs, such an approach is justified and is likely to result in advancing the individual academic's career. It also enhances their capacity to attract competitive research funding, while contributing to the overall research performance of their institution, with further potential for an increased share of government funding. In contrast, there is much less clarity of direction or availability of support mechanisms for academics in their role as teachers. Following a general induction to teaching and learning at their institution, they would commonly think about preparing some lecture materials, whether for delivery in a face-to-face or on-line modality. Typically they would look for new references and textbooks to act as a guide for preparing the content. They would probably find out how the course has been taught before, and what laboratory facilities and experiments have been used. In all of these and other related tasks, the majority of newly appointed academics are guided strongly by their own experiences as students, rather than any firm knowledge of pedagogical principles. At a time of increased demands on academics' time, and high expectations of performance and productivity in both research and teaching, it is essential to examine possible actions to support academics in enhancing their teaching performance in effective and efficient ways. Many resources have been produced over the years in engineering schools around the world, at very high intellectual and monetary cost. In Australia, the last few years have seen a surge in the number of ALTC/OLT projects and fellowships addressing a range of engineering education issues and providing many resources. There are concerns, however, regarding the extent to which these resources are being effectively utilised. Why are academics still re-inventing the wheel and creating their own versions of teaching resources and pedagogical practice? Why do they spend so much of their precious time in such an inefficient way? A symposium examining the above issues was conducted at the AAEE2012 conference, and some pointers to possible responses to the above questions were obtained. These are explored in this paper and supplemented by the responses to a survey of a group of engineering education leaders on some of the aspects of these research questions. The outcomes of the workshop and survey results have been analysed in view of the literature and the ALTC/OLT-sponsored learning and teaching projects and resources. Other factors are discussed, including how such resources can be found, how their quality might be evaluated, and how assessment may be appropriately incorporated, again using readily available resources. This study found a strong resonance between resource reuse and work on technology acceptance (Davis, 1989), suggesting that technology adoption models could be used to encourage resource sharing. Efficient use of outstanding learning materials is an enabling approach.
The paper provides some insights on the factors affecting the re-use of available resources, and makes some recommendations and suggestions on how the issue of resources re-use might be incorporated in the process of applying and completing engineering education projects.

Relevance: 20.00%

Abstract:

Between 2001 and 2005, the US airline industry faced financial turmoil while the European airline industry entered a period of substantive deregulation. Consequently, this opened up opportunities for low-cost carriers to become more competitive in the market. To assess airline performance and identify the sources of efficiency in the immediate aftermath of these events, we employ a bootstrap data envelopment analysis (DEA) truncated regression approach. The results suggest that at the time the mainstream airlines needed to significantly reorganize and rescale their operations to remain competitive. In the second-stage analysis, the results indicate that private ownership, status as a low-cost carrier, and improvements in weight load contributed to better organizational efficiency.
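
The first stage of such an approach computes DEA efficiency scores by linear programming. A minimal input-oriented CCR DEA sketch with scipy follows; the bootstrap and second-stage truncated regression of the paper's method are omitted, and the input/output data are synthetic placeholders.

```python
# Sketch: input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

# Rows are airlines (DMUs); inputs might be labour and fuel, the output
# passenger-km. Values are hypothetical.
X = np.array([[4.0, 2.0], [6.0, 3.0], [8.0, 5.0]])   # inputs
Y = np.array([[2.0], [3.0], [3.5]])                  # outputs
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

for i in range(n):
    # Variables [theta, lambda_1..lambda_n]; minimise theta subject to
    # sum_j lambda_j * x_j <= theta * x_i and sum_j lambda_j * y_j >= y_i.
    c = np.r_[1.0, np.zeros(n)]
    A_ub = np.vstack([np.hstack([-X[i][:, None], X.T]),
                      np.hstack([np.zeros((s, 1)), -Y.T])])
    b_ub = np.r_[np.zeros(m), -Y[i]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    print(f"airline {i}: technical efficiency = {res.x[0]:.3f}")
```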

Relevance: 20.00%

Abstract:

Digital human modeling (DHM) systems have undergone significant development in recent years. They have become increasingly important in the fields of ergonomic workplace design, product development, product usability, ergonomic research, ergonomic education, audiovisual marketing and the entertainment industry. They help to design ergonomic products as well as healthy and safe socio-technical work systems. In the domain of scientific DHM systems, no industry-specific standard interfaces are defined which could facilitate the exchange of 3D solid body data, anthropometric data or motion data. The focus of this article is to provide an overview of requirements for a reliable data exchange between different DHM systems in order to identify suitable file formats. Examples from the literature are discussed in detail. Methods: As a first step, a literature review is conducted on existing studies and file formats for exchanging data between different DHM systems. The file formats found can be structured into different categories: static 3D solid body data exchange, anthropometric data exchange, motion data exchange and comprehensive data exchange. Each file format is discussed and advantages as well as disadvantages for the DHM context are pointed out. Case studies are furthermore presented which show first approaches to exchanging data between DHM systems. Lessons learnt are briefly summarized. Results: A selection of suitable file formats for data exchange between DHM systems is determined from the literature review.

Relevance: 20.00%

Abstract:

Exposure control or case-control methodologies are common techniques for estimating crash risks; however, they require either observational data on control cases or exogenous exposure data, such as vehicle-kilometres travelled. This study proposes an alternative methodology for estimating the crash risk of road user groups, whilst controlling for exposure under a variety of roadway, traffic and environmental factors, by using readily available police-reported crash data. In particular, the proposed method employs a combination of a log-linear model and the quasi-induced exposure technique to identify significant interactions among a range of roadway, environmental and traffic conditions and to estimate the associated crash risks. The proposed methodology is illustrated using a set of police-reported crash data from January 2004 to June 2009 on roadways in Queensland, Australia. Exposure-controlled crash risks of motorcyclists involved in multi-vehicle crashes at intersections were estimated under various combinations of variables such as posted speed limit, intersection control type, intersection configuration, and lighting condition. Results show that the crash risk of motorcycles at three-legged intersections is high if the posted speed limits along the approaches are greater than 60 km/h. The crash risk at three-legged intersections is also high when they are unsignalized. Dark lighting conditions appear to increase the crash risk of motorcycles at signalized intersections, but the problem of night-time conspicuity of motorcyclists at intersections is lessened on approaches with lower speed limits. This study demonstrates that this combined methodology is a promising tool for gaining new insights into the crash risks of road user groups, and is transferable to other road users.
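
A log-linear model over a crash contingency table is commonly fitted as a Poisson GLM on cell counts. The sketch below uses statsmodels; the variable names echo the paper's factors but the data file is hypothetical, and this is not the authors' code.

```python
# Sketch: log-linear (Poisson) model on cross-classified crash counts.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# One row per contingency-table cell: crash-involvement counts
# cross-classified by road-user group and intersection conditions.
tab = pd.read_csv("crash_cells.csv")  # hypothetical aggregated file

model = smf.glm(
    "count ~ motorcycle * (speed_limit + control_type + legs + lighting)",
    data=tab, family=sm.families.Poisson()).fit()
# Significant motorcycle-by-condition interaction terms flag the
# combinations under which motorcyclists are over-represented relative
# to the quasi-induced exposure baseline.
print(model.summary())
```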

Relevance: 20.00%

Abstract:

Purpose: Changes in pupil size and shape are relevant for peripheral imagery by affecting aberrations and how much light enters and/or exits the eye. The purpose of this study is to model the pattern of pupil shape across the complete horizontal visual field and to show how the pattern is influenced by refractive error. Methods: Right eyes of thirty participants were dilated with 1% cyclopentolate and images were captured using a modified COAS-HD aberrometer alignment camera along the horizontal visual field to ±90°. A two-lens relay system enabled fixation at targets mounted on the wall 3 m from the eye. Participants placed their heads on a rotatable chin rest and eye rotations were kept to less than 30°. Best-fit elliptical dimensions of pupils were determined. Ratios of minimum to maximum axis diameters were plotted against visual field angle. Results: Participants' data were well fitted by cosine functions, with maxima at −1° to −9° in the temporal visual field and widths 9% to 15% greater than predicted by the cosine of the field angle φ. Mean functions were 0.99cos[(φ + 5.3)/1.121] (R² = 0.99) for the whole group and 0.99cos[(φ + 6.2)/1.126] (R² = 0.99) for the 13 emmetropes. The function peak became less temporal, and the width became smaller, with increasing myopia. Conclusion: Off-axis pupil shape changes are well described by a cosine function which is both decentered by a few degrees and about 12% flatter than the cosine of the viewing angle, with minor influences of refraction.
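
Fitting a decentred, flattened cosine of this form is a routine non-linear least-squares task. A minimal sketch with scipy follows; the data array below is synthetic, generated from the reported whole-group function rather than from the study's measurements.

```python
# Sketch: fit ratio = a * cos((phi + phi0) / w) to pupil axis-ratio data.
import numpy as np
from scipy.optimize import curve_fit

def pupil_ratio(phi, a, phi0, w):
    """Decentred, flattened cosine; angles in degrees."""
    return a * np.cos(np.radians((phi + phi0) / w))

phi = np.linspace(-90.0, 90.0, 19)                  # field angles (deg)
# Synthetic "data" generated from the reported whole-group function:
ratio = 0.99 * np.cos(np.radians((phi + 5.3) / 1.121))

popt, _ = curve_fit(pupil_ratio, phi, ratio, p0=[1.0, 0.0, 1.0])
a, phi0, w = popt
print(f"a = {a:.2f}, decentration = {phi0:.1f} deg, flattening = {w:.3f}")
```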

Relevance: 20.00%

Abstract:

This study considered the problem of predicting survival based on three alternative models: a single Weibull, a mixture of Weibulls and a cure model. Instead of the common procedure of choosing a single "best" model, where "best" is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as "best", suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise on goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival.
Keywords: Bayesian modelling; Bayesian model averaging; cure model; Markov chain Monte Carlo; mixture model; survival analysis; Weibull distribution
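
The core of a BIC-based model-averaging step can be sketched in a few lines: each candidate model's BIC is converted into an approximate posterior model probability, and predictions are then weighted accordingly. The log-likelihoods and sample size below are hypothetical numbers for illustration.

```python
# Sketch: BIC-based Bayesian model averaging weights.
import numpy as np

def bic(loglik, n_params, n_obs):
    """Bayesian information criterion: -2*logL + k*log(n)."""
    return -2.0 * loglik + n_params * np.log(n_obs)

n_obs = 150                               # hypothetical sample size
models = {                                # (max log-likelihood, #parameters)
    "single Weibull":  (-420.0, 2),
    "Weibull mixture": (-411.0, 5),
    "cure model":      (-414.0, 3),
}
bics = np.array([bic(ll, k, n_obs) for ll, k in models.values()])

# exp(-BIC/2) approximates the marginal likelihood, so the normalised
# values approximate posterior model probabilities (equal model priors).
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()
for name, weight in zip(models, w):
    print(f"{name}: BMA weight = {weight:.3f}")
# The BMA survival prediction is then S(t) = sum_k w_k * S_k(t).
```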

Relevance: 20.00%

Abstract:

Bluetooth technology is being increasingly used, among Automated Vehicle Identification systems, to retrieve important information about urban networks. Because the movement of Bluetooth-equipped vehicles can be monitored throughout the network of Bluetooth sensors, this technology represents an effective means to acquire accurate time-dependent Origin-Destination information. In order to obtain reliable estimations, however, a number of issues need to be addressed through data filtering and correction techniques. Some of the main challenges inherent to Bluetooth data are, first, that Bluetooth sensors may fail to detect all of the nearby Bluetooth-enabled vehicles; as a consequence, the exact journey for some vehicles may become a latent pattern that needs to be estimated. Second, sensors that are in close proximity to each other may have overlapping detection areas, making the task of retrieving the correct travelled path even more challenging. The aim of this paper is twofold: to give an overview of the issues inherent to Bluetooth technology, through the analysis of the data available from the Bluetooth sensors in Brisbane; and to propose a method for retrieving the itineraries of individual Bluetooth-equipped vehicles. We argue that estimating these latent itineraries accurately is a crucial step toward the retrieval of accurate dynamic Origin-Destination matrices.
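
One simplified way to fill in gaps left by missed detections is to route each vehicle through a sensor graph along the fastest path between its observed detections. The sketch below uses networkx with hypothetical sensor sites and travel times; it illustrates the idea of latent-itinerary reconstruction, not the paper's specific method.

```python
# Sketch: approximate a latent itinerary by the fastest path between
# consecutive Bluetooth detections in a sensor graph.
import networkx as nx

# Nodes are sensor sites; edge weights are typical travel times (min).
G = nx.DiGraph()
G.add_weighted_edges_from([("A", "B", 4.0), ("B", "C", 3.0),
                           ("A", "D", 6.0), ("D", "C", 6.5)],
                          weight="travel_time")

detections = ["A", "C"]        # sensor B missed this vehicle
itinerary = []
for u, v in zip(detections, detections[1:]):
    leg = nx.shortest_path(G, u, v, weight="travel_time")
    itinerary.extend(leg if not itinerary else leg[1:])
print("estimated itinerary:", itinerary)   # ['A', 'B', 'C']
```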