55 results for Multivariate data analysis


Relevance:

90.00%

Publisher:

Abstract:

Recent technological advances have increased the quantity of movement data being recorded. While valuable knowledge can be gained by analysing such data, its sheer volume creates challenges. Geovisual analytics, which helps the human cognition process by using tools to reason about data, offers powerful techniques to resolve these challenges. This paper introduces such a geovisual analytics environment for exploring movement trajectories, which provides visualisation interfaces, based on the classic space-time cube. Additionally, a new approach, using the mathematical description of motion within a space-time cube, is used to determine the similarity of trajectories and forms the basis for clustering them. These techniques were used to analyse pedestrian movement. The results reveal interesting and useful spatiotemporal patterns and clusters of pedestrians exhibiting similar behaviour.
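A toy illustration of the idea: within a space-time cube, each trajectory is a sequence of (x, y, t) points, and a simple distance between equally sampled trajectories can seed a clustering step. This is a minimal sketch, not the paper's actual motion-based similarity measure; all names and values are illustrative.

```python
import numpy as np

def trajectory_distance(a, b):
    """Mean Euclidean distance between two trajectories sampled at the
    same time steps, each point an (x, y, t) position in the cube."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.mean(np.linalg.norm(a - b, axis=1)))

# Two hypothetical pedestrians walking parallel paths, one metre apart
p1 = [(0, 0, 0), (1, 0, 1), (2, 0, 2)]
p2 = [(0, 1, 0), (1, 1, 1), (2, 1, 2)]
print(trajectory_distance(p1, p2))  # 1.0
```

A pairwise matrix of such distances could then be fed to any standard clustering routine.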

Relevance:

90.00%

Publisher:

Abstract:

Background: Patient reported outcome measures (PROMs) are used to evaluate lifestyle interventions but little is known about differences between patients returning valid and invalid responses, or of potential for bias in evaluations. We aimed to examine the characteristics of patients who returned valid responses to lifestyle questionnaires compared to those whose responses were invalid for evaluating lifestyle change.
Methods: We conducted a secondary data analysis from the SPHERE Study, a trial of an intervention to improve outcomes for patients with coronary heart disease in primary care. Postal questionnaires were used to assess physical activity (Godin) and diet (DINE) among study participants at baseline and 18-month follow-up. Three binary response variables were generated for analysis: (1) valid Godin score; (2) valid DINE Fibre score; and (3) valid DINE Total Fat score. Multivariate analysis comprised generalised estimating equation regression to examine the association of patients' characteristics with their return of valid responses at both timepoints.
Results: Overall, 92.1% of participants (832/903) returned questionnaires at both baseline and 18 months. Relatively fewer valid Godin scores were returned by those who left school aged <15 years (36.5%) than aged 18 and over (50.5%), manual workers (39.5%) than non-manual (49.5%), and those with an elevated cholesterol (>5 mmol) (34.7%) than those with a lower cholesterol (44.4%), but multivariate analysis identified that only school leaving age (p = 0.047) was of statistical significance. Relatively fewer valid DINE scores were returned by manual than non-manual workers (fibre: 80.8% v 86.8%; fat: 71.2% v 80.0%), smokers (fibre: 72.6% v 84.7%; fat: 67.5% v 76.9%), patients with diabetes (fibre: 75.9% v 82.9%; fat: 66.9% v 75.8%) and those with cholesterol >5 mmol (fat: 68.2% v 76.2%), but multivariate analysis showed statistical significance only for smoking (fibre: p = 0.013; fat: p = 0.045), diabetes (fibre: p = 0.039; fat: p = 0.047), and cholesterol (fat: p = 0.039).
Conclusions: Our findings illustrate the importance of detailed reporting of research methods, with clear information about response rates, respondents and valid outcome data. Outcome measures which are relevant to a study population should be chosen carefully. The impact of methods of outcome measurement and valid response rates in evaluating healthcare requires further study.

Relevance:

90.00%

Publisher:

Abstract:

We describe the career of John Birks as a pioneering scientist who has, over a career spanning five decades, transformed palaeoecology from a largely descriptive to a rigorous quantitative science relevant to contemporary questions in ecology and environmental change. We review his influence on students and colleagues not only at Cambridge and Bergen Universities, his places of primary employment, but also on individuals and research groups in Europe and North America. We also introduce the collection of papers that we have assembled in his honour. The papers are written by his former students and close colleagues and span many of the areas of palaeoecology to which John himself has made major contributions. These include the relationship between ecology and palaeoecology, late-glacial and Holocene palaeoecology, ecological succession, climate change and vegetation history, the role of palaeoecological techniques in reconstructing and understanding the impact of human activity on terrestrial and freshwater ecosystems and numerical analysis of multivariate palaeoecological data.

Relevance:

90.00%

Publisher:

Abstract:

Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January - April and August - December inclusive, and typically has allocations of 10 nights per month. 

Aims. We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1). 

Methods. PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å between 3345-9995 Å. A subset of the brighter science targets are selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm, resolutions 23-33 Å) and imaging with broadband JHKs filters. 

Results. This first data release (SSDR1) contains flux calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and we released calibrated optical spectra and classifications publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 replace those previously released spectra: they have more reliable and quantifiable flux calibrations, correction for telluric absorption, and are made available in standard ESO Phase 3 formats. We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ∼15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available which, in principle, allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high accuracy absolute spectrophotometry, but synthetic photometry with accompanying JHKs imaging can improve this. Whenever possible, reduced SOFI images are provided to allow this. 

Conclusions. Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification and access to reliable pipeline processed data products has allowed early science papers in the first few months of the survey.

Relevance:

90.00%

Publisher:

Abstract:

This paper outlines a forensic method for analysing the energy, environmental and comfort performance of a building. The method has been applied to a recently developed event space in an Irish public building, which was evaluated using on-site field studies, data analysis, building simulation and occupant surveying. The method allows for consideration of both the technological and anthropological aspects of the building in use and for the identification of unsustainable operational practice and emerging problems. The forensic analysis identified energy savings of up to 50%, enabling a more sustainable, lower-energy operational future for the building. The building forensic analysis method presented in this paper is now planned for use in other public and commercial buildings.

Relevance:

90.00%

Publisher:

Abstract:

Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum problem (closure) and the inherently multivariate relative information conveyed by compositional data. Well known, for instance, is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method that can answer the required geological or geochemical question whilst maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
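One standard log-ratio device of the kind the abstract alludes to is the centred log-ratio (clr) transform, which expresses each component relative to the sample's geometric mean and thereby removes the constant-sum constraint before mapping or statistical analysis. The element names and values below are illustrative.

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform: log of each part divided by the
    geometric mean of the whole composition."""
    x = np.asarray(composition, dtype=float)
    gmean = np.exp(np.mean(np.log(x), axis=-1, keepdims=True))
    return np.log(x / gmean)

# A single hypothetical sample (weight %): SiO2, Al2O3, Fe2O3, CaO
sample = [70.0, 15.0, 10.0, 5.0]
z = clr(sample)
print(z)
print(z.sum())  # clr coordinates sum to zero by construction
```

Multivariate methods (PCA, clustering, regression) are then run on the clr coordinates rather than on the raw percentages.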

Relevance:

90.00%

Publisher:

Abstract:

Perfect information is seldom available to man or machines due to uncertainties inherent in real-world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information, and it is necessary for GIS to develop tools and techniques to manage these uncertainties. There is widespread agreement in the GIS community that although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by a lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.

Relevance:

90.00%

Publisher:

Abstract:

A first stage collision database is assembled which contains electron-impact excitation, ionization, and recombination rate coefficients for B, B+, B2+, B3+, and B4+. The first stage database is constructed using the R-matrix with pseudostates, time-dependent close-coupling, and perturbative distorted-wave methods. A second stage collision database is then assembled which contains generalized collisional-radiative ionization, recombination, and power loss rate coefficients as a function of both temperature and density. The second stage database is constructed by solution of the collisional-radiative equations in the quasi-static equilibrium approximation using the first stage database. Both collision database stages reside in electronic form at the IAEA Labeled Atomic Data Interface (ALADDIN) database and the Atomic Data Analysis Structure (ADAS) open database.
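The quasi-static step can be illustrated with a toy rate matrix: steady-state level populations satisfy M n = 0 with the populations normalised to one. This is a three-level sketch with arbitrary rates, not the ADAS implementation.

```python
import numpy as np

# Toy rate matrix M (s^-1) for three levels: off-diagonal M[i, j] is the
# rate into level i from level j; each diagonal entry is minus the total
# rate out of that level, so every column sums to zero.
M = np.array([
    [-3.0,  1.0,  0.5],
    [ 2.0, -2.5,  1.0],
    [ 1.0,  1.5, -1.5],
])

# Quasi-static equilibrium: solve M n = 0 subject to sum(n) = 1 by
# replacing one (redundant) rate equation with the normalisation.
A = M.copy()
A[-1, :] = 1.0
b = np.array([0.0, 0.0, 1.0])
n = np.linalg.solve(A, b)
print(n)  # steady-state fractional populations
```

In a real collisional-radiative model the entries of M would themselves be functions of electron temperature and density, built from the first stage rate coefficients.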

Relevance:

90.00%

Publisher:

Abstract:

Emerging web applications like cloud computing, Big Data and social networks have created the need for powerful data centres hosting hundreds of thousands of servers. Currently, data centres are based on general purpose processors that provide high flexibility but lack the energy efficiency of customized accelerators. VINEYARD aims to develop an integrated platform for energy-efficient data centres based on new servers with novel, coarse-grain and fine-grain, programmable hardware accelerators. It will also build a high-level programming framework allowing end-users to seamlessly utilize these accelerators in heterogeneous computing systems by employing typical data-centre programming frameworks (e.g. MapReduce, Storm, Spark, etc.). This programming framework will further allow the hardware accelerators to be swapped in and out of the heterogeneous infrastructure so as to offer high flexibility and energy efficiency. VINEYARD will foster the expansion of the soft-IP core industry, currently limited to embedded systems, into the data-centre market. VINEYARD plans to demonstrate the advantages of its approach in three real use cases: (a) a bio-informatics application for high-accuracy brain modeling, (b) two critical financial applications, and (c) a big-data analysis application.

Relevance:

90.00%

Publisher:

Abstract:

Objective: To determine the risk indicators associated with root caries experience in a cohort of independently living older adults in Ireland.
Methods: The data reported in the present study were obtained from a prospective longitudinal study conducted on the risk factors associated with root caries incidence in a cohort of independently living older adults (n=334). Each subject underwent an oral examination, performed by a single calibrated examiner, to determine the root caries index and other clinical variables. Questionnaires were used to collect data on oral hygiene habits, diet, smoking and alcohol habits and education level. A regression analysis with the outcome variable of root caries experience (no/yes) was conducted.
Results: A total of 334 older adults with a mean age of 69.1 years were examined. 53.3% had at least one filled or decayed root surface. The median root caries index was 3.13 (IQR 0.00, 13.92). The results from the multivariate regression analysis indicated that individuals with poor plaque control (OR 9.59, 95%CI 3.84-24.00), xerostomia (OR 18.49, 95%CI 2.00-172.80), two or more teeth with coronal decay (OR 4.50, 95% CI 2.02-10.02) and 37 or more exposed root surfaces (OR 5.48, 95% CI 2.49-12.01) were more likely to have been affected by root caries.
Conclusions: The prevalence of root caries was high in this cohort. This study suggests a correlation between root caries and the variables poor plaque control, xerostomia, coronal decay (≥2 teeth affected) and exposed root surfaces (≥37). The significance of these risk indicators and the resulting prediction model should be further evaluated in a prospective study of root caries incidence.
Clinical Significance: Identification of risk indicators for root caries in independently living older adults would facilitate dental practitioners to identify those who would benefit most from interventions aimed at prevention.