42 results for INTEGRATIVE DATA-ANALYSIS


Relevance:

90.00%

Publisher:

Abstract:

Context. Comet 67P/Churyumov-Gerasimenko is the target of the European Space Agency Rosetta spacecraft rendezvous mission. Detailed physical characterisation of the comet before arrival is important for mission planning as well as providing a test bed for ground-based observing and data-analysis methods.

Aims: To conduct a long-term observational programme to characterize the physical properties of the nucleus of the comet, via ground-based optical photometry, and to combine our new data with all available nucleus data from the literature.

Methods: We applied aperture photometry techniques on our imaging data and combined the extracted rotational lightcurves with data from the literature. Optical lightcurve inversion techniques were applied to constrain the spin state of the nucleus and its broad shape. We performed a detailed surface thermal analysis with the shape model and optical photometry by incorporating both into the new Advanced Thermophysical Model (ATPM), along with all available Spitzer 8-24 μm thermal-IR flux measurements from the literature.

Results: A convex triangular-facet shape model was determined with axial ratios b/a = 1.239 and c/a = 0.819. These values can vary by as much as 7% in each axis and still result in a statistically significant fit to the observational data. Our best spin state solution has Psid = 12.76137 ± 0.00006 h, and a rotational pole orientated at ecliptic coordinates λ = 78° (±10°), β = +58° (±10°). The nucleus phase darkening behaviour was measured and best characterized using the IAU HG system. Best-fit parameters are G = 0.11 ± 0.12 and HR(1,1,0) = 15.31 ± 0.07. Our shape model combined with the ATPM can satisfactorily reconcile all optical and thermal-IR data, with the fit to the Spitzer 24 μm data taken in February 2004 being exceptionally good. We derive a range of mutually consistent physical parameters for each thermal-IR data set, including effective radius, geometric albedo, surface thermal inertia and roughness fraction.

Conclusions: The overall nucleus dimensions are well constrained and strongly imply a broad nucleus shape more akin to comet 9P/Tempel 1 than to the highly elongated or "bi-lobed" nuclei seen for comets 103P/Hartley 2 or 8P/Tuttle. The derived low thermal inertia of […]
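
As an illustration of the phase-darkening fit, the sketch below evaluates the standard two-parameter IAU H,G phase function (the Bowell et al. 1989 approximation) at the fitted values quoted above; the Φ basis-function constants are the standard published ones, not values from this paper.

```python
import numpy as np

# Fitted values quoted in the abstract above.
G = 0.11    # slope parameter (0.11 +/- 0.12)
H = 15.31   # absolute R-band magnitude H_R(1,1,0) (15.31 +/- 0.07)

def phi(alpha_rad, a, b):
    """Basis function Phi_i(alpha) = exp(-a * tan(alpha/2)**b)."""
    return np.exp(-a * np.tan(alpha_rad / 2.0) ** b)

def reduced_magnitude(alpha_deg, h=H, g=G):
    """Reduced magnitude H(alpha) of the IAU H,G system at solar phase
    angle alpha in degrees (Bowell et al. 1989 approximation)."""
    a = np.radians(alpha_deg)
    return h - 2.5 * np.log10((1.0 - g) * phi(a, 3.33, 0.63)
                              + g * phi(a, 1.87, 1.22))

# Example: phase darkening between 5 and 15 degrees phase angle.
print(reduced_magnitude(15.0) - reduced_magnitude(5.0))  # magnitudes of darkening
```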

Relevance:

90.00%

Publisher:

Abstract:

Recent technological advances have increased the quantity of movement data being recorded. While valuable knowledge can be gained by analysing such data, its sheer volume creates challenges. Geovisual analytics, which supports human cognition with tools for reasoning about data, offers powerful techniques to resolve these challenges. This paper introduces such a geovisual analytics environment for exploring movement trajectories, which provides visualisation interfaces based on the classic space-time cube. Additionally, a new approach, using the mathematical description of motion within a space-time cube, is used to determine the similarity of trajectories and forms the basis for clustering them. These techniques were used to analyse pedestrian movement. The results reveal interesting and useful spatiotemporal patterns and clusters of pedestrians exhibiting similar behaviour.
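
The abstract does not give the exact similarity measure; the sketch below shows one minimal way a space-time-cube distance between trajectories could be computed, assuming trajectories already resampled to common time stamps, with the time-axis scaling left as a free parameter.

```python
import numpy as np

def stc_distance(traj_a, traj_b, time_scale=1.0):
    """Mean Euclidean distance between two trajectories embedded in a
    space-time cube. Each trajectory is an (n, 3) array of (x, y, t)
    samples at the same n time stamps; time_scale maps the time axis
    into spatial units (a free parameter of the cube, assumed here)."""
    a = np.array(traj_a, dtype=float)   # copies, so inputs are untouched
    b = np.array(traj_b, dtype=float)
    a[:, 2] *= time_scale
    b[:, 2] *= time_scale
    return float(np.linalg.norm(a - b, axis=1).mean())

# A matrix of pairwise stc_distance values can then be fed to any
# standard clustering routine to group similar pedestrian trajectories.
```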

Relevance:

90.00%

Publisher:

Abstract:

Research over the past two decades on the Holocene sediments from the tide-dominated west side of the lower Ganges delta has focussed on constraining the sedimentary environment through grain size distributions (GSD). GSD has traditionally been assessed through probability density function (PDF) methods (e.g. log-normal and log skew-Laplace functions), but these approaches do not acknowledge the compositional nature of the data, which may compromise lithofacies interpretations. The use of PDF approaches in GSD analysis poses a series of challenges for the development of lithofacies models, such as equifinal distribution coefficients and the obscuring of empirical data variability. In this study a methodological framework for characterising GSD is presented through compositional data analysis (CODA) combined with a multivariate statistical framework. This provides a statistically more robust analysis of the fine tidal-estuary sediments of the West Bengal Sundarbans than the alternative PDF approaches.
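
A core ingredient of CODA is the log-ratio transformation; the sketch below shows the centred log-ratio (clr) transform applied to grain-size class percentages. The zero-replacement step is a simplistic assumption, not necessarily the treatment used in the study.

```python
import numpy as np

def clr(composition, zero_repl=1e-5):
    """Centred log-ratio transform of one composition, e.g. percentages
    of grain-size classes. Components must be strictly positive, so
    zeros are replaced by a small constant first (a simplistic choice)."""
    x = np.array(composition, dtype=float)
    x[x == 0.0] = zero_repl
    x = x / x.sum()                 # close the composition to sum to 1
    g = np.exp(np.log(x).mean())    # geometric mean of the parts
    return np.log(x / g)            # clr coordinates sum to ~0

# Example: clay / silt / sand percentages of a single sediment sample.
print(clr([12.0, 55.0, 33.0]))
```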

Relevance:

90.00%

Publisher:

Abstract:

Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January - April and August - December inclusive, and typically has allocations of 10 nights per month. 

Aims. We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1). 

Methods. PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å between 3345-9995 Å. A subset of the brighter science targets is selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm, resolutions 23-33 Å) and imaging with broadband JHKs filters.

Results. This first data release (SSDR1) contains flux-calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and we released calibrated optical spectra and classifications publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 supersede those rapidly released spectra: they have more reliable and quantifiable flux calibrations, are corrected for telluric absorption, and are made available in standard ESO Phase 3 formats. We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ∼15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available and can, in principle, allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high-accuracy absolute spectrophotometry, but synthetic photometry with the accompanying JHKs imaging can improve this. Whenever possible, reduced SOFI images are provided to allow this.
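
The synthetic-photometry refinement mentioned above can be illustrated with the usual rescaling step: compute the spectrum's magnitude through a filter curve and scale it to match the measured photometry. The sketch below is generic; the function names, filter inputs and zero-point are assumptions, not parts of the PESSTO pipeline.

```python
import numpy as np

def synthetic_flux(wl, flux, filt_wl, filt_tr):
    """Photon-weighted mean flux of a spectrum through a filter
    transmission curve (all arrays in consistent wavelength units)."""
    tr = np.interp(wl, filt_wl, filt_tr, left=0.0, right=0.0)
    return np.trapz(flux * tr * wl, wl) / np.trapz(tr * wl, wl)

def rescale_to_photometry(wl, flux, filt_wl, filt_tr, m_phot, f_zero):
    """Scale a spectrum so that its synthetic magnitude through the
    filter matches the measured magnitude m_phot; f_zero is the
    filter's zero-point flux (placeholder input, survey-specific)."""
    m_syn = -2.5 * np.log10(synthetic_flux(wl, flux, filt_wl, filt_tr) / f_zero)
    return flux * 10.0 ** (-0.4 * (m_phot - m_syn))
```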

Conclusions. Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification, and access to reliable pipeline-processed data products, enabled early science papers within the first few months of the survey.

Relevance:

90.00%

Publisher:

Abstract:

This paper outlines a forensic method for analysing the energy, environmental and comfort performance of a building. The method has been applied to a recently developed event space in an Irish public building, which was evaluated using on-site field studies, data analysis, building simulation and occupant surveying. The method allows for consideration of both the technological and anthropological aspects of the building in use and for the identification of unsustainable operational practice and emerging problems. The forensic analysis identified energy savings of up to 50%, enabling a more sustainable, lower-energy operational future for the building. The building forensic analysis method presented in this paper is now planned for use in other public and commercial buildings.

Relevance:

90.00%

Publisher:

Abstract:

Perfect information is seldom available to humans or machines due to uncertainties inherent in real-world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information, and it is necessary for GIS to develop tools and techniques to manage these uncertainties. There is widespread agreement in the GIS community that, although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by a lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.
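
The abstract does not enumerate the taxonomy's categories; the sketch below only encodes the broad distinction it draws (vague/ambiguous versus imprecise/inaccurate/incomplete) as a data structure a pipeline could dispatch on. The category and model names are illustrative, not the paper's taxonomy.

```python
from enum import Enum

class UncertaintySource(Enum):
    """The two broad sources named above; the paper's taxonomy is
    finer-grained, so this split is purely illustrative."""
    VAGUE = "vague or ambiguous classes and boundaries"
    IMPRECISE = "imprecise, inaccurate or incomplete measurements"

# Illustrative mapping from source of uncertainty to candidate model
# families; the actual pairing is the subject of the proposed taxonomy.
SUITABLE_MODELS = {
    UncertaintySource.VAGUE: ["fuzzy sets", "rough sets"],
    UncertaintySource.IMPRECISE: ["probabilistic error models", "intervals"],
}
```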

Relevance:

90.00%

Publisher:

Abstract:

A substantial proportion of aetiological risks for many cancers and chronic diseases remain unexplained. Using geochemical soil and stream water samples collected as part of the Tellus Project studies, current research is investigating naturally occurring background levels of potentially toxic elements (PTEs) in soils and stream sediments and their possible relationship with progressive chronic kidney disease (CKD). The Tellus geological mapping project, Geological Survey Northern Ireland, collected soil sediment and stream water samples on a grid of one sample site every 2 km² across the rural areas of Northern Ireland, resulting in more than 6800 soil sampling locations and more than 5800 locations for stream water sampling. Accumulation of several PTEs, including arsenic, cadmium, chromium, lead and mercury, has been linked with adverse human health effects and implicated in renal function decline. The hypothesis is that long-term exposure will result in cumulative exposure to PTEs and act as a risk factor for cancer- and diabetes-related CKD and its progression. The ‘bioavailable’ fraction of total PTE soil concentration depends on the ‘bioaccessible’ proportion through an exposure pathway. Recent work has explored this bioaccessible fraction for a range of PTEs across Northern Ireland. In this study the compositional nature of the multivariate geochemical PTE variables and bioaccessible data is explored to augment the investigation into the potential relationship between PTEs, bioaccessibility and disease data.

Relevance:

90.00%

Publisher:

Abstract:

A first stage collision database is assembled which contains electron-impact excitation, ionization, and recombination rate coefficients for B, B⁺, B²⁺, B³⁺, and B⁴⁺. The first stage database is constructed using the R-matrix with pseudostates, time-dependent close-coupling, and perturbative distorted-wave methods. A second stage collision database is then assembled which contains generalized collisional-radiative ionization, recombination, and power loss rate coefficients as a function of both temperature and density. The second stage database is constructed by solution of the collisional-radiative equations in the quasi-static equilibrium approximation using the first stage database. Both collision database stages reside in electronic form at the IAEA Labeled Atomic Data Interface (ALADDIN) database and the Atomic Data Analysis Structure (ADAS) open database.
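
In the quasi-static equilibrium approximation, the charge-state balance at a given temperature and density reduces to the ratio of the generalized ionization and recombination coefficients. The sketch below computes equilibrium fractional abundances from such coefficients; the numerical values are made up for illustration, the real tabulated data being what resides in ALADDIN and ADAS.

```python
import numpy as np

def equilibrium_fractions(S, alpha):
    """Quasi-static equilibrium charge-state fractions for an isonuclear
    sequence, given generalized collisional-radiative ionization
    coefficients S[z] (z -> z+1) and recombination coefficients
    alpha[z] (z+1 -> z) at one temperature-density point, using the
    equilibrium condition n_{z+1}/n_z = S[z]/alpha[z]."""
    ratios = np.asarray(S, dtype=float) / np.asarray(alpha, dtype=float)
    n = np.concatenate(([1.0], np.cumprod(ratios)))  # relative to neutral
    return n / n.sum()

# Made-up coefficients (cm^3 s^-1) for the five ionization steps B -> B5+.
S     = [3e-9, 1e-9, 4e-10, 5e-11, 8e-12]
alpha = [2e-11, 4e-11, 8e-11, 1e-10, 3e-10]
print(equilibrium_fractions(S, alpha))   # fractions of B, B+, ..., B5+
```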

Relevance:

90.00%

Publisher:

Abstract:

Emerging web applications like cloud computing, Big Data and social networks have created the need for powerful data centres hosting hundreds of thousands of servers. Currently, these data centres are based on general-purpose processors that provide high flexibility but lack the energy efficiency of customized accelerators. VINEYARD aims to develop an integrated platform for energy-efficient data centres based on new servers with novel, coarse-grain and fine-grain, programmable hardware accelerators. It will also build a high-level programming framework allowing end-users to seamlessly utilize these accelerators in heterogeneous computing systems by employing typical data-centre programming frameworks (e.g. MapReduce, Storm, Spark, etc.). This programming framework will further allow the hardware accelerators to be swapped in and out of the heterogeneous infrastructure so as to offer high flexibility and energy efficiency. VINEYARD will foster the expansion of the soft-IP core industry, currently limited to embedded systems, into the data-centre market. VINEYARD plans to demonstrate the advantages of its approach in three real use cases: (a) a bio-informatics application for high-accuracy brain modeling, (b) two critical financial applications, and (c) a big-data analysis application.
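
Such a framework implies a dispatch layer that routes a task to an accelerated implementation when one is registered and falls back to a CPU path otherwise. The sketch below shows that idea in miniature; every name here is hypothetical, not a VINEYARD API.

```python
# Hypothetical sketch of transparent accelerator dispatch; none of these
# names are actual VINEYARD APIs.
class KernelRegistry:
    def __init__(self):
        self._impls = {}  # kernel name -> {backend name: callable}

    def register(self, kernel, backend, fn):
        self._impls.setdefault(kernel, {})[backend] = fn

    def run(self, kernel, *args, prefer=("fpga", "gpu", "cpu")):
        """Route a task to the first registered backend in preference
        order, so accelerators can be swapped in and out transparently."""
        impls = self._impls.get(kernel, {})
        for backend in prefer:
            if backend in impls:
                return impls[backend](*args)
        raise LookupError(f"no implementation registered for {kernel!r}")

registry = KernelRegistry()
registry.register("wordcount", "cpu", lambda text: len(text.split()))
print(registry.run("wordcount", "accelerators swapped in and out"))
```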

Relevance:

90.00%

Publisher:

Abstract:

Background: Staff who provide end-of-life care to children not only have to deal with their own sense of loss, but also that of bereaved families. There is a dearth of knowledge on how they cope with these challenges.
Aim: The aim of this review is to explore the experiences of health care professionals who provide end-of-life care to children in order to inform the development of interventions to support them, thereby improving the quality of paediatric care for both children and their families.
Data sources: Searches included CINAHL, MEDLINE, Web of Science, EMBASE, PsycINFO, and The Cochrane Library in June 2015, with no date restrictions. Additional literature was uncovered by searching the reference lists of relevant studies and by contacting experts in the field of paediatric palliative care.
Design: This was a systematic mixed studies review. Study selection, appraisal and data extraction were conducted by two independent researchers. Integrative thematic analysis was used to synthesise the data.
Results: Sixteen qualitative, six quantitative, and eight mixed-method studies were identified, covering healthcare professionals in a range of settings. Key themes included the rewards and challenges of providing end-of-life care to children, the impact on staff's personal and professional lives, coping strategies, and key approaches to supporting staff in their role.
Conclusions: Education focusing on the unique challenges of providing end-of-life care to children and the importance of self-care, along with timely multidisciplinary debriefing, are key strategies for improving healthcare staff's experiences and, as such, the quality of care they provide.

Relevance:

90.00%

Publisher:

Abstract:

Background: There is increasing interest in how culture may affect the quality of healthcare services, and previous research has shown that ‘treatment culture’ (of which there are three categories: resident centred, ambiguous and traditional) in a nursing home may influence prescribing of psychoactive medications.
Objective: The objective of this study was to explore and understand treatment culture in prescribing of psychoactive medications for older people with dementia in nursing homes.
Method: Six nursing homes (two from each treatment culture category) participated in this study. Qualitative data were collected through semi-structured interviews with nursing home staff and general practitioners (GPs), which sought to determine participants’ views on prescribing and administration of psychoactive medication, and their understanding of treatment culture and its potential influence on prescribing of psychoactive drugs. Following verbatim transcription, the data were analysed and themes were identified, facilitated by NVivo and discussion within the research team.
Results: Interviews took place with five managers, seven nurses, 13 care assistants and two GPs. Four themes emerged: the characteristics of the setting, the characteristics of the individual, relationships and decision making. The characteristics of the setting were exemplified by views of the setting, daily routines and staff training. The characteristics of the individual were demonstrated by views on the personhood of residents and staff attitudes. Relationships varied between staff within and outside the home. These relationships appeared to influence decision making about prescribing of medications. The data analysis found that each home exhibited traits indicative of its assigned treatment culture.
Conclusion: Nursing home treatment culture appeared to be influenced by four main themes. Modification of these factors may lead to a shift towards a more flexible, resident-centred culture and a reduction in prescribing and use of psychoactive medication.

Relevance:

90.00%

Publisher:

Abstract:

Tide gauge data are identified as legacy data, given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water level variation through tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide gauge records cover the 20th century, so the Belfast (UK) Harbour tide gauge would be a strategic long-term (110 years) record if the full paper-based records (marigrams) were digitally restructured to allow consistent data analysis. This paper presents the methodology for extracting a consistent time series of observed water levels from the 5 different Belfast Harbour tide-gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly, relative to any manual x–y digitisation of the signal. Restructuring variable-length tidal data sets into a consistent daily, monthly and annual file format was undertaken with project-developed software: Merge&Convert and MergeHYD allow consistent water level sampling both at 60 min (the past standard) and at 10 min intervals, the latter enhancing surge measurement. Belfast tide-gauge data have been rectified, validated and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
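
The resampling of an irregular digitised marigram signal onto fixed 60 min or 10 min grids can be sketched with pandas as below; the linear time-interpolation choice is an assumption for illustration, not necessarily what Merge&Convert or MergeHYD implement.

```python
import pandas as pd

def resample_levels(times, levels, interval="10min"):
    """Resample an irregularly spaced digitised tide-gauge signal onto a
    fixed grid ('60min' for the past standard, '10min' for surge work)
    using time-weighted linear interpolation."""
    s = pd.Series(levels, index=pd.to_datetime(times)).sort_index()
    grid = pd.date_range(s.index[0].ceil(interval), s.index[-1], freq=interval)
    # Interpolate at the grid points, then keep only the regular grid.
    return s.reindex(s.index.union(grid)).interpolate(method="time").reindex(grid)
```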