906 results for Data Interpretation, Statistical
Abstract:
Purpose Knowledge-based urban development needs to be sustainable and therefore requires ecological planning strategies to ensure a better quality of its services. The purpose of this paper is to present an innovative approach for monitoring the sustainability of urban services and to help policy-making authorities revise current planning and development practices towards more effective solutions. The paper introduces a new assessment tool, the Micro-level Urban-ecosystem Sustainability IndeX (MUSIX), which provides a quantitative measure of urban sustainability in a local context. Design/methodology/approach A multi-method research approach was employed in the construction of the MUSIX. Qualitative research was conducted through an interpretive and critical literature review to develop the theoretical framework and select indicators. Quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application. Findings/results MUSIX was tested in a pilot study site and provided information on the main environmental impacts arising from rapid urban development and population growth. On that basis, key ecological planning strategies were recommended to guide the preparation and assessment of development and local area plans. Research limitations/implications This study provides fundamental information that assists developers, planners and policy-makers in investigating the multidimensional nature of sustainability at the local level by capturing environmental pressures and their driving forces in highly developed urban areas. Originality/value This study measures the sustainability of urban development plans by providing data analysis and interpretation of results in a new spatial data unit.
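The abstract does not specify how MUSIX aggregates its indicators, so the following is only a generic composite-index sketch: hypothetical indicators are min-max normalised, direction-adjusted, and combined with illustrative weights into a 0-1 score for one spatial unit. All indicator names, ranges and weights are invented for illustration.

```python
# Generic composite-index sketch: NOT the published MUSIX method.
# Indicator names, ranges and weights are invented for illustration.
def normalise(value, lo, hi):
    """Min-max normalise a raw indicator value to 0-1."""
    return (value - lo) / (hi - lo)

# (value, min, max) per hypothetical indicator for one spatial unit
indicators = {"impervious_surface": (0.7, 0.0, 1.0),
              "green_cover": (0.3, 0.0, 1.0)}
weights = {"impervious_surface": 0.5, "green_cover": 0.5}

# Higher green cover is desirable; higher imperviousness is not, so invert it.
scores = {"impervious_surface": 1 - normalise(*indicators["impervious_surface"]),
          "green_cover": normalise(*indicators["green_cover"])}
index = sum(weights[k] * scores[k] for k in scores)
print(round(index, 2))
```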
Abstract:
During the current (1995-present) eruptive phase of the Soufrière Hills volcano on Montserrat, voluminous pyroclastic flows entered the sea off the eastern flank of the island, resulting in the deposition of well-defined submarine pyroclastic lobes. Previously reported bathymetric surveys documented the sequential construction of these deposits, but could not image their internal structure, the morphology or extent of their base, or their interaction with the underlying sediments. We show, by combining these bathymetric data with new high-resolution three-dimensional (3D) seismic data, that the sequence of previously detected pyroclastic deposits from different phases of the ongoing eruptive activity is still well preserved. A detailed interpretation of the 3D seismic data reveals the absence of significant (>3 m) basal erosion in the distal extent of the submarine pyroclastic deposits. We also identify a previously unrecognized seismic unit directly beneath the stack of recent lobes. We propose three hypotheses for the origin of this seismic unit, but prefer an interpretation that the deposit is the result of the subaerial flank collapse that formed the English's Crater scarp on the Soufrière Hills volcano. The 1995-recent volcanic activity on Montserrat accounts for a significant portion of the sediments on the southeast slope of the island, in places forming deposits more than 60 m thick, which implies that pyroclastic flows have significant potential to build volcanic island edifices.
Abstract:
Nitrous oxide emissions from soil are known to be spatially and temporally volatile. Reliable estimation of emissions over a given time and space depends on measuring with sufficient intensity, but deciding on the number of measuring stations and the frequency of observation can be vexing. The question also arises of whether low-frequency manual observations provide results comparable to high-frequency automated sampling. Data collected from a replicated field experiment were studied intensively with the intention of giving statistically robust guidance on these issues. In the experiment, nitrous oxide soil-to-air flux was monitored within 10 m by 2.5 m plots, by automated closed chambers at an average sampling interval of 3 h and by manual static chambers at an average sampling interval of three days, over sixty days. Trends in flux over time observed by the static chambers were mostly within the auto-chamber bounds of experimental error. Cumulative nitrous oxide emissions as measured by each system were also within error bounds. Under the temporal response pattern in this experiment, no significant loss of information was observed after culling the data to simulate results under various low-frequency scenarios. Within the confines of this experiment, observations from the manual chambers were not spatially correlated above distances of 1 m. Statistical power therefore improved with increased replicates per treatment or chambers per replicate. Careful after-action review of experimental data can deliver savings for future work.
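The data-culling simulation described above can be sketched as follows, using synthetic flux values rather than the experiment's data: a cumulative emission estimate from the full 3 h series is compared with one from a subsample taken every three days.

```python
# Synthetic sketch of the data-culling comparison; flux values are made up.
import numpy as np

rng = np.random.default_rng(0)
hours = np.arange(0, 60 * 24, 3)              # 60 days at 3 h intervals
flux = np.abs(rng.normal(10, 4, hours.size))  # synthetic flux values

def cumulative_emission(t, f):
    """Trapezoidal integration of flux over time."""
    return float(np.sum((f[1:] + f[:-1]) / 2 * np.diff(t)))

full = cumulative_emission(hours, flux)
# Cull to one observation every 3 days (72 h / 3 h = every 24th point)
idx = np.arange(0, hours.size, 24)
culled = cumulative_emission(hours[idx], flux[idx])
print(f"high-frequency estimate: {full:.0f}")
print(f"culled (3-day) estimate: {culled:.0f}")
```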
Abstract:
The promise of ‘big data’ has generated a great deal of interest in the development of new approaches to research in the humanities and social sciences, as well as a range of important critical interventions warning against an unquestioned rush to ‘big data’. Drawing on experience gained in developing innovative ‘big data’ approaches to social media research, this paper examines some of the repercussions for the scholarly research and publication practices of those researchers who do pursue the path of ‘big data’–centric investigation in their work. As researchers import the tools and methods of highly quantitative, statistical analysis from the ‘hard’ sciences into computational, digital humanities research, must they also subscribe to the language and assumptions underlying such ‘scientificity’? If so, how does this affect the choices made in gathering, processing, analysing, and disseminating the outcomes of digital humanities research? In particular, is there a need to rethink the forms and formats of publishing scholarly work in order to enable the rigorous scrutiny and replicability of research outcomes?
Abstract:
This thesis explored the knowledge and reasoning of young children in solving novel statistical problems, and the influence of problem context and design on their solutions. It found that young children's statistical competencies are underestimated, and that problem design and context facilitated children's application of a wide range of knowledge and reasoning skills, none of which had been taught. A qualitative design-based research method, informed by the Models and Modeling perspective (Lesh & Doerr, 2003), underpinned the study. Data modelling activities incorporating picture story books were used to contextualise the problems. Children applied real-world understanding to problem solving, including attribute identification, categorisation and classification skills. Intuitive and metarepresentational knowledge, together with inductive and probabilistic reasoning, was used to make sense of data, and a beginning awareness of statistical variation and informal inference was visible.
Abstract:
Spectroscopic studies of complex clinical fluids have led to a more holistic approach to their chemical analysis becoming popular and widely employed. The efficient and effective interpretation of multidimensional spectroscopic data relies on many chemometric techniques, one group of which comprises so-called correlation analysis methods. Typical of these techniques are two-dimensional correlation analysis and statistical total correlation spectroscopy (STOCSY). Whilst the former has largely been applied to optical spectroscopic analysis, STOCSY was developed and has been applied almost exclusively in NMR metabonomic studies. Using a 1H NMR study of human blood plasma from subjects recovering from exhaustive exercise trials, the basic concepts and applications of these techniques are examined. Typical information from their application to NMR-based metabonomics is presented, and their value in aiding interpretation of NMR data obtained from biological systems is illustrated. Major energy metabolites are identified in the NMR spectra, and the dynamics of their appearance in and removal from plasma during exercise recovery are illustrated and discussed. The complementary nature of two-dimensional correlation analysis and statistical total correlation spectroscopy is highlighted.
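A minimal STOCSY-style sketch, using a synthetic spectral matrix rather than the study's plasma data: correlating a chosen "driver" variable against all others highlights variables that co-vary across samples, as peaks belonging to the same metabolite do.

```python
# STOCSY-style correlation sketch on a synthetic spectral matrix
# (samples x chemical-shift variables); not the study's plasma data.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_vars = 40, 500
concentration = rng.uniform(0.5, 2.0, n_samples)   # hidden metabolite level
spectra = rng.normal(0.0, 0.05, (n_samples, n_vars))
for peak in (120, 340):                            # two peaks of one metabolite
    spectra[:, peak] += concentration

driver = 120
# Pearson correlation of the driver variable with every variable
corr = np.corrcoef(spectra, rowvar=False)[driver]
top_two = sorted(np.argsort(corr)[-2:].tolist())
print(top_two)  # the driver itself plus its covarying partner peak
```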
Abstract:
Background The implementation of the Australian Consumer Law in 2011 highlighted the need for better use of injury data to improve the effectiveness and responsiveness of product safety (PS) initiatives. In the PS system, resources are allocated to different priority issues using risk assessment tools. The rapid exchange of information (RAPEX) tool for prioritising hazards, developed by the European Commission, is currently being adopted in Australia. Injury data is required as a basic input to the RAPEX tool in the risk assessment process. One of the challenges in utilising injury data in the PS system is the complexity of translating detailed clinically coded data into broad categories such as those used in the RAPEX tool. Aims This study aims to translate hospital burns data into a simplified format by mapping the International Statistical Classification of Diseases and Related Health Problems, Tenth Revision, Australian Modification (ICD-10-AM) burn codes into RAPEX severity rankings, and to use these rankings to identify priority areas in childhood product-related burns data. Methods ICD-10-AM burn codes were mapped into four levels of severity using the RAPEX guide table by assigning rankings from 1 to 4, in order of increasing severity. RAPEX rankings were determined by the thickness and surface area of the burn (BSA), with information extracted from the fourth character of T20-T30 codes for burn thickness and the fourth and fifth characters of T31 codes for the BSA. Following the mapping process, secondary data analysis of 2008-2010 Queensland Hospital Admitted Patient Data Collection (QHAPDC) paediatric data was conducted to identify priority areas in product-related burns. Results The application of RAPEX rankings to QHAPDC burn data showed that approximately 70% of paediatric burns in Queensland hospitals were categorised under RAPEX levels 1 and 2, 25% under RAPEX levels 3 and 4, and the remaining 5% were unclassifiable. In the PS system, priority is given to issues categorised under RAPEX levels 3 and 4. Analysis of external cause codes within these levels showed that flammable materials (for children aged 10-15 years) and hot substances (for children aged <2 years) were the most frequently identified products. Discussion and conclusions The mapping of ICD-10-AM burn codes into RAPEX rankings showed a favourable degree of compatibility between the two classification systems, suggesting that ICD-10-AM coded burn data can be simplified to support PS initiatives more effectively. Additionally, the secondary data analysis showed that only 25% of all admitted burn cases in Queensland were severe enough to trigger a PS response.
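The code-to-rank translation can be illustrated in outline. The published RAPEX guide table is not reproduced in the abstract, so the combination rule below is a placeholder; only the character positions follow the description above (fourth character of T20-T30 codes for burn thickness; for simplicity only the fourth character of T31 codes is used here, as the decile band of body surface area burned).

```python
# Illustrative only: rank assignments below are placeholders, NOT the
# published RAPEX guide table. Codes are written without the dot.

def burn_thickness(code: str) -> int:
    """Fourth character of a T20-T30 code: degree of burn."""
    return int(code[3])          # e.g. "T242" -> 2 (second degree)

def bsa_band(t31_code: str) -> int:
    """Fourth character of a T31 code: decile band of surface area burned."""
    return int(t31_code[3])      # e.g. "T312" -> 20-29% BSA

def rapex_rank(thickness: int, bsa_decile: int) -> int:
    # Placeholder combination rule: deeper burns and larger areas
    # push the rank towards 4.
    return min(thickness + (1 if bsa_decile >= 2 else 0), 4)

print(rapex_rank(burn_thickness("T242"), bsa_band("T312")))
```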
Abstract:
Background: The randomised phase 3 First-Line Erbitux in Lung Cancer (FLEX) study showed that the addition of cetuximab to cisplatin and vinorelbine significantly improved overall survival compared with chemotherapy alone in the first-line treatment of advanced non-small-cell lung cancer (NSCLC). The main cetuximab-related side-effect was acne-like rash. Here, we assessed the association of this acne-like rash with clinical benefit. Methods: We did a subgroup analysis of patients in the FLEX study, which enrolled patients with advanced NSCLC whose tumours expressed epidermal growth factor receptor. Our landmark analysis assessed if the development of acne-like rash in the first 21 days of treatment (first-cycle rash) was associated with clinical outcome, on the basis of patients in the intention-to-treat population alive on day 21. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: 518 patients in the chemotherapy plus cetuximab group (290 of whom had first-cycle rash) and 540 patients in the chemotherapy alone group were alive on day 21. Patients in the chemotherapy plus cetuximab group with first-cycle rash had significantly prolonged overall survival compared with patients in the same treatment group without first-cycle rash (median 15·0 months [95% CI 12·8-16·4] vs 8·8 months [7·6-11·1]; hazard ratio [HR] 0·631 [0·515-0·774]; p<0·0001). Corresponding significant associations were also noted for progression-free survival (median 5·4 months [5·2-5·7] vs 4·3 months [4·1-5·3]; HR 0·741 [0·607-0·905]; p=0·0031) and response (rate 44·8% [39·0-50·8] vs 32·0% [26·0-38·5]; odds ratio 1·703 [1·186-2·448]; p=0·0039). Overall survival for patients without first-cycle rash was similar to that of patients who received chemotherapy alone (median 8·8 months [7·6-11·1] vs 10·3 months [9·6-11·3]; HR 1·085 [0·910-1·293]; p=0·36).
The significant overall survival benefit for patients with first-cycle rash versus without was seen in all histology subgroups: adenocarcinoma (median 16·9 months [14·1-20·6] vs 9·3 months [7·7-13·2]; HR 0·614 [0·453-0·832]; p=0·0015), squamous-cell carcinoma (median 13·2 months [10·6-16·0] vs 8·1 months [6·7-12·6]; HR 0·659 [0·472-0·921]; p=0·014), and carcinomas of other histology (median 12·6 months [9·2-16·4] vs 6·9 months [5·2-11·0]; HR 0·616 [0·392-0·966]; p=0·033). Interpretation: First-cycle rash was associated with a better outcome in patients with advanced NSCLC who received cisplatin and vinorelbine plus cetuximab as a first-line treatment. First-cycle rash might be a surrogate clinical marker that could be used to tailor cetuximab treatment for advanced NSCLC to those patients who would be most likely to derive a significant benefit. Funding: Merck KGaA. © 2011 Elsevier Ltd.
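The landmark approach used in this analysis can be sketched with synthetic patient records (not FLEX data): group membership is fixed by events in the first 21 days, and only patients alive at the day-21 landmark enter the survival comparison, which avoids guarantee-time bias.

```python
# Synthetic records illustrating a day-21 landmark analysis.
LANDMARK = 21
patients = [
    {"id": 1, "rash_day": 10,   "survival_days": 400},
    {"id": 2, "rash_day": None, "survival_days": 15},   # died before day 21
    {"id": 3, "rash_day": None, "survival_days": 250},
]
# Only patients alive at the landmark are compared
at_risk = [p for p in patients if p["survival_days"] >= LANDMARK]
with_rash = [p for p in at_risk
             if p["rash_day"] is not None and p["rash_day"] <= LANDMARK]
without_rash = [p for p in at_risk if p not in with_rash]
print(len(at_risk), len(with_rash), len(without_rash))
```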
Abstract:
Background: Findings from the phase 3 FLEX study showed that the addition of cetuximab to cisplatin and vinorelbine significantly improved overall survival, compared with cisplatin and vinorelbine alone, in the first-line treatment of EGFR-expressing, advanced non-small-cell lung cancer (NSCLC). We investigated whether candidate biomarkers were predictive for the efficacy of chemotherapy plus cetuximab in this setting. Methods: Genomic DNA extracted from formalin-fixed paraffin-embedded (FFPE) tumour tissue of patients enrolled in the FLEX study was screened for KRAS codon 12 and 13 and EGFR kinase domain mutations with PCR-based assays. In FFPE tissue sections, EGFR copy number was assessed by dual-colour fluorescence in-situ hybridisation and PTEN expression by immunohistochemistry. Treatment outcome was investigated according to biomarker status in all available samples from patients in the intention-to-treat population. The primary endpoint in the FLEX study was overall survival. The FLEX study, which is ongoing but not recruiting participants, is registered with ClinicalTrials.gov, number NCT00148798. Findings: KRAS mutations were detected in 75 of 395 (19%) tumours and activating EGFR mutations in 64 of 436 (15%). EGFR copy number was scored as increased in 102 of 279 (37%) tumours and PTEN expression as negative in 107 of 303 (35%). Comparisons of treatment outcome between the two groups (chemotherapy plus cetuximab vs chemotherapy alone) according to biomarker status provided no indication that these biomarkers were of predictive value. 
Activating EGFR mutations were identified as indicators of good prognosis, with patients in both treatment groups whose tumours carried such mutations having improved survival compared with those whose tumours did not (chemotherapy plus cetuximab: median 17·5 months [95% CI 11·7-23·4] vs 8·5 months [7·1-10·8], hazard ratio [HR] 0·52 [0·32-0·84], p=0·0063; chemotherapy alone: 23·8 months [15·2-not reached] vs 10·0 months [8·7-11·0], HR 0·35 [0·21-0·59], p<0·0001). Expression of PTEN seemed to be a potential indicator of good prognosis, with patients whose tumours expressed PTEN having improved survival compared with those whose tumours did not, although this finding was not significant (chemotherapy plus cetuximab: median 11·4 months [8·6-13·6] vs 6·8 months [5·9-12·7], HR 0·80 [0·55-1·16], p=0·24; chemotherapy alone: 11·0 months [9·2-12·6] vs 9·3 months [7·6-11·9], HR 0·77 [0·54-1·10], p=0·16). Interpretation: The efficacy of chemotherapy plus cetuximab in the first-line treatment of advanced NSCLC seems to be independent of each of the biomarkers assessed. Funding: Merck KGaA. © 2011 Elsevier Ltd.
Abstract:
Background: Findings from the phase 3 First-Line ErbituX in lung cancer (FLEX) study showed that the addition of cetuximab to first-line chemotherapy significantly improved overall survival compared with chemotherapy alone (hazard ratio [HR] 0·871, 95% CI 0·762-0·996; p=0·044) in patients with advanced non-small-cell lung cancer (NSCLC). To define patients benefiting most from cetuximab, we studied the association of tumour EGFR expression level with clinical outcome in FLEX study patients. Methods: We used prospectively collected tumour EGFR expression data to generate an immunohistochemistry score for FLEX study patients on a continuous scale of 0-300. We used response data to select an outcome-based discriminatory threshold immunohistochemistry score for EGFR expression of 200. Treatment outcome was analysed in patients with low (immunohistochemistry score <200) and high (≥200) tumour EGFR expression. The primary endpoint in the FLEX study was overall survival. We analysed patients from the FLEX intention-to-treat (ITT) population. The FLEX study is registered with ClinicalTrials.gov, number NCT00148798. Findings: Tumour EGFR immunohistochemistry data were available for 1121 of 1125 (99·6%) patients from the FLEX study ITT population. High EGFR expression was scored for 345 (31%) evaluable patients and low for 776 (69%) patients. For patients in the high EGFR expression group, overall survival was longer in the chemotherapy plus cetuximab group than in the chemotherapy alone group (median 12·0 months [95% CI 10·2-15·2] vs 9·6 months [7·6-10·6]; HR 0·73, 0·58-0·93; p=0·011), with no meaningful increase in side-effects. We recorded no corresponding survival benefit for patients in the low EGFR expression group (median 9·8 months [8·9-12·2] vs 10·3 months [9·2-11·5]; HR 0·99, 0·84-1·16; p=0·88). 
A treatment interaction test assessing the difference in the HRs for overall survival between the EGFR expression groups suggested a predictive value for EGFR expression (p=0·044). Interpretation: High EGFR expression is a tumour biomarker that can predict survival benefit from the addition of cetuximab to first-line chemotherapy in patients with advanced NSCLC. Assessment of EGFR expression could offer a personalised treatment approach in this setting. Funding: Merck KGaA. © 2012 Elsevier Ltd.
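The 0-300 immunohistochemistry scale described above is consistent with an H-score (the sum, over staining intensities 1-3, of intensity multiplied by the percentage of cells at that intensity). The sketch below assumes that scoring rule, which the abstract does not spell out, and applies the study's threshold of 200.

```python
# Sketch under the assumption that the 0-300 scale is an H-score; the
# abstract states only the scale and the discriminatory threshold of 200.
def h_score(pct_by_intensity):
    """pct_by_intensity maps staining intensity (1-3) to % of cells."""
    return sum(i * pct for i, pct in pct_by_intensity.items())

def expression_group(score, threshold=200):
    return "high" if score >= threshold else "low"

score = h_score({1: 10, 2: 20, 3: 60})   # 10 + 40 + 180 = 230
print(score, expression_group(score))
```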
Abstract:
Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. A possible shortcoming is resolving organelles that display similar behaviour during a protocol designed to provide partial enrichment. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients are alone unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
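The benefit of combining separations can be illustrated with synthetic profiles (not LOPIT data): per-protein abundance profiles from two gradients are concatenated before classification, so a classifier can exploit whatever separation either gradient provides. A simple nearest-centroid classifier stands in here for the support vector machine and PLS-DA used in the study.

```python
# Synthetic illustration of combining two gradient separations.
import numpy as np

rng = np.random.default_rng(2)
n, fracs = 30, 8

def gradient(shift):
    """Synthetic abundance profiles for two organelle classes, one gradient."""
    return np.vstack([rng.normal(0.0, 1.0, (n, fracs)),
                      rng.normal(shift, 1.0, (n, fracs))])

g1, g2 = gradient(0.6), gradient(0.6)   # each gradient alone resolves poorly
combined = np.hstack([g1, g2])          # concatenated per-protein profiles
labels = np.array([0] * n + [1] * n)

def centroid_accuracy(X, y):
    """Fraction of profiles closer to their own class centroid."""
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    pred = (np.linalg.norm(X - c1, axis=1)
            < np.linalg.norm(X - c0, axis=1)).astype(int)
    return float((pred == y).mean())

print(f"gradient 1 alone: {centroid_accuracy(g1, labels):.2f}")
print(f"combined:         {centroid_accuracy(combined, labels):.2f}")
```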
Abstract:
Talk of Big Data seems to be everywhere. Indeed, the apparently value-free concept of ‘data’ has seen a spectacular broadening of popular interest, shifting from the dry terminology of labcoat-wearing scientists to the buzzword du jour of marketers. In the business world, data is increasingly framed as an economic asset of critical importance, a commodity on a par with scarce natural resources (Backaitis, 2012; Rotella, 2012). It is social media that has most visibly brought the Big Data moment to media and communication studies, and beyond it, to the social sciences and humanities. Social media data is one of the most important areas of the rapidly growing data market (Manovich, 2012; Steele, 2011). Massive valuations are attached to companies that directly collect and profit from social media data, such as Facebook and Twitter, as well as to resellers and analytics companies like Gnip and DataSift. The expectation attached to the business models of these companies is that their privileged access to data and the resulting valuable insights into the minds of consumers and voters will make them irreplaceable in the future. Analysts and consultants argue that advanced statistical techniques will allow the detection of ongoing communicative events (natural disasters, political uprisings) and the reliable prediction of future ones (electoral choices, consumption)...
Abstract:
Public health research consistently demonstrates the salience of neighbourhood as a determinant of both health-related behaviours and outcomes across the human life course. This paper will report on the findings from a mixed-methods Brisbane-based study that explores how mothers with primary school children from both high and low socioeconomic suburbs use the local urban environment for the purpose of physical activity. First, we present findings from an innovative methodology using the geographic information systems (GIS) embedded in social media platforms on mobile phones to track locations, resource use, distances travelled, and modes of transport of the families in real time; and second, we report on qualitative data that provides insight into reasons for differential use of the environment by both groups. Spatial/mapping and statistical data showed that while the mothers from both groups demonstrated similar daily routines, the mothers from the high SEP suburb engaged in increased levels of physical activity, travelled less frequently and less distance by car, and walked more for transport. The qualitative data revealed differences in the psychosocial processes and characteristics of the households and neighbourhoods of the respective groups, with mothers in the lower SEP suburb reporting more stress, higher conflict, and lower quality relationships with neighbours.
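A distance-travelled measure of the kind described above could be derived from time-stamped GPS fixes with the haversine formula; the coordinates below are made up for illustration and do not come from the study.

```python
# Haversine sketch with made-up coordinates; the study's GIS platform and
# actual tracking data are not reproduced here.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical walking track around inner Brisbane
track = [(-27.4698, 153.0251), (-27.4750, 153.0300), (-27.4800, 153.0260)]
total = sum(haversine_km(*a, *b) for a, b in zip(track, track[1:]))
print(f"{total:.2f} km")
```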
Abstract:
The Department of Culture and the Arts undertook the first mapping of Perth’s creative industries in 2007 in partnership with the City of Perth and the Departments of Industry and Resources and the Premier and Cabinet. The 2013 Creative Industries Statistical Analysis for Western Australia report has updated the mapping with the 2011 Census employment data to provide invaluable information for the State’s creative industries, their peak associations and potential investors. The report maps sector employment numbers and growth between the 2006 and 2011 Census in the areas of music, visual and performing arts, film, TV and radio, advertising and marketing, software and digital content, publishing, and architecture and design, which includes designer fashion.