46 results for Movement Data Analysis
Abstract:
Purpose – The purpose of this paper is to present an analysis of media representation of business ethics within 62 international newspapers to explore the longitudinal and contextual evolution of business ethics and associated terminology. Levels of coverage and contextual analysis of the content of the articles are used as surrogate measures of the penetration of business ethics concepts into society. Design/methodology/approach – This paper uses a text mining application based on two samples of data: analysis of 62 national newspapers in 21 countries from 1990 to 2008; analysis of the content of two samples of articles containing the term business ethics (comprising 100 newspaper articles spread over an 18-year period from a sample of US and UK newspapers). Findings – The paper demonstrates increased coverage of sustainability topics within the media over the last 18 years, associated with events such as the Rio Summit. Whilst some peaks are associated with business ethics scandals, the overall coverage remains steady. There is little apparent use in the media of concepts such as corporate citizenship. The academic community and company ethical codes appear to adopt a wider definition of business ethics, more akin to that associated with sustainability, than the focus taken by the media, especially in the USA. Coverage demonstrates clear regional bias, and contextual analysis of the articles in the UK and USA also shows interesting parallels and divergences in the media representation of business ethics. Originality/value – A promising avenue to explore how the evolution of sustainability issues, including business ethics, can be tracked within a societal context.
Abstract:
A recently generalized theory of perceptual guidance (general tau theory) was used to analyse coordination in skilled movement. The theory posits that (i) guiding movement entails controlling the closure of spatial and/or force gaps between effectors and goals, by sensing and regulating the taus of the gaps (the time-to-closure at the current closure rate), (ii) a principal way of coordinating movements is keeping the taus of different gaps in constant ratio (known as tau-coupling), and (iii) intrinsically paced movements are guided and coordinated by tau-coupling onto a tau-guide, tau(g), generated in the nervous system and described by the equation tau(g) = 0.5(t - T^2/t), where T is the duration of the body movement and t is the time from the start of the movement. Kinematic analysis of hand-to-mouth movements by human adults, with eyes open or closed, indicated that hand guidance was achieved by maintaining, during 80-85% of the movement, the tau-couplings tau(alpha)-tau(t) and tau(t)-tau(g), where tau(t) is the tau of the hand-mouth gap, tau(alpha) is the tau of the angular gap to be closed by steering the hand, and tau(g) is the intrinsic tau-guide.
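To make the tau-guide relation concrete, here is a minimal numerical sketch (Python is assumed, as the source contains no code): it evaluates tau(g) = 0.5(t - T^2/t) for an arbitrary movement duration T and checks that a hypothetical gap, constructed to tau-couple onto the guide, keeps the two taus in a constant ratio k. The gap trajectory and the value of k are invented for illustration, not taken from the study.

    import numpy as np

    # Intrinsic tau-guide of general tau theory: tau_g(t) = 0.5 * (t - T**2 / t),
    # where T is the movement duration and t the time since movement onset.
    def tau_guide(t, T):
        return 0.5 * (t - T**2 / t)

    # tau of a gap x(t): x / x_dot, i.e. time-to-closure at the current closure rate.
    def tau_of_gap(x, dt):
        return x / np.gradient(x, dt)

    T = 1.0                                  # assumed movement duration (s)
    t = np.linspace(0.01 * T, 0.99 * T, 200) # avoid t = 0, where tau_g is undefined
    dt = t[1] - t[0]

    # Hypothetical gap built to tau-couple onto the guide with ratio k:
    # tau_gap = k * tau_g holds when x(t) is proportional to (T**2 - t**2) ** (1 / k).
    k = 0.7
    gap = (T**2 - t**2) ** (1.0 / k)

    ratio = tau_of_gap(gap, dt) / tau_guide(t, T)
    print("tau-coupling ratio (should stay near k = 0.7):", ratio[10:-10].round(2))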
Abstract:
An optimal search theory, the so-called Lévy-flight foraging hypothesis(1), predicts that predators should adopt search strategies known as Lévy flights where prey is sparse and distributed unpredictably, but that Brownian movement is sufficiently efficient for locating abundant prey(2-4). Empirical studies have generated controversy because the accuracy of statistical methods that have been used to identify Lévy behaviour has recently been questioned(5,6). Consequently, whether foragers exhibit Lévy flights in the wild remains unclear. Crucially, moreover, it has not been tested whether observed movement patterns across natural landscapes having different expected resource distributions conform to the theory's central predictions. Here we use maximum-likelihood methods to test for Lévy patterns in relation to environmental gradients in the largest animal movement data set assembled for this purpose. Strong support was found for Lévy search patterns across 14 species of open-ocean predatory fish (sharks, tuna, billfish and ocean sunfish), with some individuals switching between Lévy and Brownian movement as they traversed different habitat types. We tested the spatial occurrence of these two principal patterns and found Lévy behaviour to be associated with less productive waters (sparser prey) and Brownian movements to be associated with productive shelf or convergence-front habitats (abundant prey). These results are consistent with the Lévy-flight foraging hypothesis(1,7), supporting the contention(8,9) that organism search strategies naturally evolved in such a way that they exploit optimal Lévy patterns.
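As a rough illustration of the maximum-likelihood approach mentioned above (and not the authors' actual pipeline), the Python sketch below fits a power-law (Lévy-like) exponent and a shifted-exponential (Brownian-like) model to step lengths above a chosen minimum and compares them by AIC. The synthetic step lengths, the choice of x_min and the two model forms are assumptions made for the example.

    import numpy as np

    def fit_power_law(steps, x_min):
        # MLE for a power-law (Levy-like) tail f(x) ~ x**(-mu), x >= x_min.
        x = steps[steps >= x_min]
        n = x.size
        mu = 1.0 + n / np.sum(np.log(x / x_min))
        loglik = n * np.log((mu - 1.0) / x_min) - mu * np.sum(np.log(x / x_min))
        return mu, loglik

    def fit_exponential(steps, x_min):
        # MLE for a shifted exponential (Brownian-like) step-length model.
        x = steps[steps >= x_min]
        lam = 1.0 / np.mean(x - x_min)
        loglik = x.size * np.log(lam) - lam * np.sum(x - x_min)
        return lam, loglik

    # Synthetic step lengths drawn from a Pareto tail with mu = 2 (illustration only).
    rng = np.random.default_rng(0)
    x_min = 1.0
    mu_true = 2.0
    steps = x_min * (1.0 - rng.random(5000)) ** (-1.0 / (mu_true - 1.0))

    mu, ll_pl = fit_power_law(steps, x_min)
    lam, ll_exp = fit_exponential(steps, x_min)

    # One free parameter each, so AIC = 2 - 2 * loglik; the lower AIC is preferred.
    print(f"mu = {mu:.2f}  AIC(power law) = {2 - 2 * ll_pl:.1f}  AIC(exponential) = {2 - 2 * ll_exp:.1f}")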
Abstract:
Studies of animal movement are rapidly increasing as tracking technologies make it possible to collect more data on a greater variety of species. Comparisons of animal movement across sites, times, or species are key to asking questions about animal adaptation and responses to climate and land-use change. Thus, great gains can be made by sharing and exchanging animal tracking data. Here we present an animal movement data model that we use within the Movebank web application to describe tracked animals. The model facilitates data comparisons across a broad range of taxa, study designs, and technologies, and is based on the scientific questions that could be addressed with the data.
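A minimal sketch of what such a data model can look like in code follows (Python; the entity and field names are illustrative, not Movebank's actual schema): a tag (sensor) is deployed on an individual animal for some period, and each location fix is an event attached to that deployment.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class Individual:
        individual_id: str
        taxon: str                     # species name

    @dataclass
    class Tag:
        tag_id: str
        sensor_type: str               # e.g. "gps", "argos-doppler", "geolocator"

    @dataclass
    class Deployment:
        individual: Individual
        tag: Tag
        start: datetime
        end: Optional[datetime]        # None while the tag is still deployed

    @dataclass
    class LocationEvent:
        deployment: Deployment
        timestamp: datetime
        latitude: float
        longitude: float

    # Usage: a single GPS fix for a tagged white stork (all values are made up).
    stork = Individual("HL462", "Ciconia ciconia")
    tag = Tag("gps-0047", "gps")
    deployment = Deployment(stork, tag, datetime(2013, 6, 1), None)
    fix = LocationEvent(deployment, datetime(2013, 6, 2, 8, 30), 48.27, 8.85)
    print(fix.deployment.individual.taxon, fix.latitude, fix.longitude)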
Abstract:
The application of Eye Tracking (ET) to the study of social functioning in Asperger Syndrome (AS) provides a unique perspective into social attention and cognition in this atypical neurodevelopmental group. Research in this area has shown how ET can capture social attention atypicalities within this group, such as diminished fixations to the eye region when viewing still images and movie clips, increased fixation to the mouth region, and reduced face gaze. Issues exist, however, within the literature, where the type (static/dynamic) and the content (ecological validity) of stimuli used appear to affect the nature of the gaze patterns reported. Objectives: Our research aims were, using the same group of adolescents with AS, to compare their viewing patterns to those of age- and IQ-matched typically developing (TD) adolescents using stimuli considered to represent a hierarchy of ecological validity, building from static facial images, through a non-verbal movie clip and verbal footage from real-life conversation, to eye tracking during real-life conversation. Methods: Eleven participants with AS were compared to 11 TD adolescents, matched for age and IQ. In Study 1, participants were shown 2 sets of static facial images (emotion faces, and still images taken from the dynamic clips). In Study 2, three dynamic clips were presented (1 non-verbal movie clip and 2 clips of verbal footage from real-life conversation). Study 3 was an exploratory study of eye tracking during a real-life conversation. Eye movements were recorded via a Hi-Speed (240 Hz) SMI eye tracker fitted with chin and forehead rests. Various methods of analysis were used, including a paradigm for temporal analysis of the eye movement data. Results: Results from these studies confirmed that the atypical nature of social attention in AS was successfully captured by this paradigm. While results differed across stimulus sets,
collectively they demonstrated how individuals with AS failed to focus on the most socially relevant aspects of the various stimuli presented. There was also evidence that the eye movements of the AS group were atypically affected by the presence of motion and verbal information. Discriminant Function Analysis demonstrated that the ecological validity of stimuli was an important factor in identifying atypicalities associated with AS, with more accurate classifications of AS and TD groups occurring for more naturalistic stimuli (dynamic rather than static). Graphical analysis of temporal sequences of eye movements revealed the atypical manner in which AS participants followed interactions within the dynamic stimuli. Taken together with data on the order of gaze patterns, more subtle atypicalities were detected in the gaze behaviour of AS individuals towards more socially pertinent regions of the dynamic stimuli. Conclusions: These results have potentially important implications for our understanding of deficits in Asperger Syndrome, as they show that, with more naturalistic stimuli, subtle differences in social attention can be detected that
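One common way to quantify the fixation patterns described above is the proportion of gaze samples falling in predefined areas of interest (AOIs) such as the eye and mouth regions. The Python sketch below is a generic illustration of that measure, not the authors' analysis paradigm; the AOI rectangles and the gaze samples are invented.

    import numpy as np

    # AOI rectangles in screen pixels: (x_min, y_min, x_max, y_max); invented values.
    AOIS = {
        "eyes":  (300, 200, 500, 260),
        "mouth": (340, 330, 460, 390),
    }

    def aoi_proportions(gaze_xy, aois):
        # Fraction of gaze samples inside each AOI (constant sampling rate assumed,
        # so sample counts are proportional to viewing time).
        x, y = gaze_xy[:, 0], gaze_xy[:, 1]
        props = {}
        for name, (x0, y0, x1, y1) in aois.items():
            inside = (x >= x0) & (x <= x1) & (y >= y0) & (y <= y1)
            props[name] = float(inside.mean())
        return props

    # Roughly 10 s of fake gaze data at 240 Hz, scattered around the centre of a face.
    rng = np.random.default_rng(1)
    gaze = rng.normal(loc=[400, 280], scale=[60, 60], size=(240 * 10, 2))
    print(aoi_proportions(gaze, AOIS))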
Abstract:
Context. Comet 67P/Churyumov-Gerasimenko is the target of the European Space Agency Rosetta spacecraft rendezvous mission. Detailed physical characterisation of the comet before arrival is important for mission planning as well as providing a test bed for ground-based observing and data-analysis methods. Aims: To conduct a long-term observational programme to characterize the physical properties of the nucleus of the comet, via ground-based optical photometry, and to combine our new data with all available nucleus data from the literature. Methods: We applied aperture photometry techniques on our imaging data and combined the extracted rotational lightcurves with data from the literature. Optical lightcurve inversion techniques were applied to constrain the spin state of the nucleus and its broad shape. We performed a detailed surface thermal analysis with the shape model and optical photometry by incorporating both into the new Advanced Thermophysical Model (ATPM), along with all available Spitzer 8-24 μm thermal-IR flux measurements from the literature. Results: A convex triangular-facet shape model was determined with axial ratios b/a = 1.239 and c/a = 0.819. These values can vary by as much as 7% in each axis and still result in a statistically significant fit to the observational data. Our best spin state solution has Psid = 12.76137 ± 0.00006 h, and a rotational pole orientated at ecliptic coordinates λ = 78° (±10°), β = +58° (±10°). The nucleus phase-darkening behaviour was measured and best characterized using the IAU HG system. Best-fit parameters are G = 0.11 ± 0.12 and HR(1,1,0) = 15.31 ± 0.07. Our shape model combined with the ATPM can satisfactorily reconcile all optical and thermal-IR data, with the fit to the Spitzer 24 μm data taken in February 2004 being exceptionally good. We derive a range of mutually consistent physical parameters for each thermal-IR data set, including effective radius, geometric albedo, surface thermal inertia and roughness fraction. Conclusions: The overall nucleus dimensions are well constrained and strongly imply a broad nucleus shape more akin to that of comet 9P/Tempel 1, rather than the highly elongated or "bi-lobed" nuclei seen for comets 103P/Hartley 2 or 8P/Tuttle. The derived low thermal inertia of
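For reference, the IAU H,G phase-darkening system mentioned above can be evaluated with the standard two-parameter approximation; the Python sketch below plugs in the best-fit G = 0.11 and HR(1,1,0) = 15.31 quoted in the abstract to give the predicted reduced magnitude at a few arbitrary phase angles. This is a generic illustration of the HG system, not the paper's fitting procedure.

    import numpy as np

    # IAU H,G phase-darkening system (standard two-parameter approximation):
    # m_reduced(alpha) = H - 2.5 * log10((1 - G) * Phi1(alpha) + G * Phi2(alpha)),
    # with Phi_i(alpha) = exp(-A_i * tan(alpha / 2) ** B_i).
    def hg_reduced_magnitude(alpha_deg, H, G):
        a = np.radians(alpha_deg)
        phi1 = np.exp(-3.33 * np.tan(a / 2.0) ** 0.63)
        phi2 = np.exp(-1.87 * np.tan(a / 2.0) ** 1.22)
        return H - 2.5 * np.log10((1.0 - G) * phi1 + G * phi2)

    # Best-fit values quoted above for the nucleus (R band, unit heliocentric and
    # geocentric distances); the phase angles are arbitrary examples.
    H_R, G = 15.31, 0.11
    for alpha in (2.0, 5.0, 10.0):
        print(f"alpha = {alpha:4.1f} deg -> predicted reduced magnitude {hg_reduced_magnitude(alpha, H_R, G):.2f}")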
Abstract:
Research over the past two decades on the Holocene sediments from the tide-dominated west side of the lower Ganges delta has focussed on constraining the sedimentary environment through grain size distributions (GSD). GSD has traditionally been assessed through the use of probability density function (PDF) methods (e.g. log-normal and log skew-Laplace functions), but these approaches do not acknowledge the compositional nature of the data, which may compromise outcomes in lithofacies interpretations. The use of PDF approaches in GSD analysis poses a series of challenges for the development of lithofacies models, such as equifinal distribution coefficients and the obscuring of empirical data variability. In this study, a methodological framework for characterising GSD is presented, based on compositional data analysis (CODA) combined with a multivariate statistical framework. This provides a statistically robust analysis of the fine tidal estuary sediments from the West Bengal Sundarbans, relative to alternative PDF approaches.
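As an illustration of the CODA approach, the Python sketch below applies the centred log-ratio (clr) transform, a standard first step in compositional data analysis, to a few made-up sand/silt/clay compositions so that ordinary multivariate statistics can then be applied. The sample values are invented, not Sundarbans data.

    import numpy as np

    def clr(parts):
        # Centred log-ratio transform: express each part relative to the sample's
        # geometric mean, freeing the data from the unit-sum constraint.
        parts = np.asarray(parts, dtype=float)
        parts = parts / parts.sum(axis=1, keepdims=True)
        gmean = np.exp(np.mean(np.log(parts), axis=1, keepdims=True))
        return np.log(parts / gmean)

    # Made-up three-part (sand, silt, clay) compositions for three sediment samples.
    samples = np.array([
        [0.10, 0.55, 0.35],    # fine tidal-flat mud
        [0.05, 0.60, 0.35],
        [0.40, 0.45, 0.15],    # sandier channel deposit
    ])

    clr_scores = clr(samples)
    print(clr_scores.round(3))
    print("each row sums to ~0:", clr_scores.sum(axis=1).round(6))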
Abstract:
Context. The Public European Southern Observatory Spectroscopic Survey of Transient Objects (PESSTO) began as a public spectroscopic survey in April 2012. PESSTO classifies transients from publicly available sources and wide-field surveys, and selects science targets for detailed spectroscopic and photometric follow-up. PESSTO runs for nine months of the year, January - April and August - December inclusive, and typically has allocations of 10 nights per month.
Aims. We describe the data reduction strategy and data products that are publicly available through the ESO archive as the Spectroscopic Survey data release 1 (SSDR1).
Methods. PESSTO uses the New Technology Telescope with the instruments EFOSC2 and SOFI to provide optical and NIR spectroscopy and imaging. We target supernovae and optical transients brighter than 20.5 mag for classification. Science targets are selected for follow-up based on the PESSTO science goal of extending knowledge of the extremes of the supernova population. We use standard EFOSC2 set-ups providing spectra with resolutions of 13-18 Å between 3345-9995 Å. A subset of the brighter science targets are selected for SOFI spectroscopy with the blue and red grisms (0.935-2.53 μm and resolutions 23-33 Å) and imaging with broadband JHKs filters.
Results. This first data release (SSDR1) contains flux-calibrated spectra from the first year (April 2012-2013). A total of 221 confirmed supernovae were classified, and we released calibrated optical spectra and classifications publicly within 24 h of the data being taken (via WISeREP). The data in SSDR1 replace those previously released spectra: they have more reliable and quantifiable flux calibrations, are corrected for telluric absorption, and are made available in standard ESO Phase 3 formats. We estimate the absolute accuracy of the flux calibrations for EFOSC2 across the whole survey in SSDR1 to be typically ∼15%, although a number of spectra will have less reliable absolute flux calibration because of weather and slit losses. Acquisition images for each spectrum are available which, in principle, can allow the user to refine the absolute flux calibration. The standard NIR reduction process does not produce high-accuracy absolute spectrophotometry, but synthetic photometry with accompanying JHKs imaging can improve this (a generic sketch of this step follows the abstract). Whenever possible, reduced SOFI images are provided to allow this.
Conclusions. Future data releases will focus on improving the automated flux calibration of the data products. The rapid turnaround between discovery and classification, together with access to reliable pipeline-processed data products, has enabled early science papers within the first few months of the survey.
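The synthetic-photometry step mentioned in the Results can be sketched generically as follows (Python; this is not the PESSTO pipeline): the flux-calibrated spectrum is averaged through a filter transmission curve, and the result is compared with the broadband imaging flux to rescale the spectrum. The spectrum, the boxcar filter curve and the flux values below are all invented.

    import numpy as np

    def synthetic_flux(wave, flux, filt_wave, filt_trans):
        # Filter-weighted mean flux density of a spectrum, for comparison with a
        # broadband measurement (detailed photon-counting weighting omitted).
        trans = np.interp(wave, filt_wave, filt_trans, left=0.0, right=0.0)
        return np.average(flux, weights=trans * wave)

    # Invented spectrum and a boxcar "J-like" filter (1.17-1.33 micron); a real
    # analysis would use the instrument's filter curves and zero points.
    wave = np.linspace(0.9, 2.5, 2000)                  # wavelength in micron
    flux = 1e-16 * (wave / 1.25) ** -2.0                # arbitrary flux units
    filt_wave = np.array([1.17, 1.18, 1.32, 1.33])
    filt_trans = np.array([0.0, 1.0, 1.0, 0.0])

    f_syn = synthetic_flux(wave, flux, filt_wave, filt_trans)

    # If imaging gives the true in-band flux, the spectrum is rescaled by the ratio.
    f_imaging = 1.2e-16
    print(f"synthetic flux = {f_syn:.3e}, rescale spectrum by {f_imaging / f_syn:.2f}")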
Abstract:
This paper outlines a forensic method for analysing the energy, environmental and comfort performance of a building. The method has been applied to a recently developed event space in an Irish public building, which was evaluated using on-site field studies, data analysis, building simulation and occupant surveying. The method allows for consideration of both the technological and anthropological aspects of the building in use and for the identification of unsustainable operational practice and emerging problems. The forensic analysis identified energy savings of up to 50%, enabling a more sustainable, lower-energy operational future for the building. The building forensic analysis method presented in this paper is now planned for use in other public and commercial buildings.
Abstract:
Perfect information is seldom available to man or machines due to uncertainties inherent in real-world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information, and it is necessary for GIS to develop tools and techniques to manage these uncertainties. There is widespread agreement in the GIS community that although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by a lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.
Abstract:
A substantial proportion of aetiological risks for many cancers and chronic diseases remain unexplained. Using geochemical soil and stream water samples collected as part of the Tellus Project studies, current research is investigating naturally occurring background levels of potentially toxic elements (PTEs) in soils and stream sediments and their possible relationship with progressive chronic kidney disease (CKD). The Tellus geological mapping project, Geological Survey Northern Ireland, collected soil, stream sediment and stream water samples on a grid of one sample site every 2 km² across the rural areas of Northern Ireland, resulting in more than 6800 soil sampling locations and more than 5800 locations for stream water sampling. Accumulation of several PTEs, including arsenic, cadmium, chromium, lead and mercury, has been linked with human health and implicated in renal function decline. The hypothesis is that long-term exposure will result in cumulative exposure to PTEs and act as a risk factor for cancer- and diabetes-related CKD and its progression. The ‘bioavailable’ fraction of total PTE soil concentration depends on the ‘bioaccessible’ proportion through an exposure pathway. Recent work has explored this bioaccessible fraction for a range of PTEs across Northern Ireland. In this study, the compositional nature of the multivariate geochemical PTE variables and bioaccessibility data is explored to augment the investigation into the potential relationship between PTEs, bioaccessibility and disease data.
Abstract:
A first stage collision database is assembled which contains electron-impact excitation, ionization, and recombination rate coefficients for B, B+, B2+, B3+, and B4+. The first stage database is constructed using the R-matrix with pseudostates, time-dependent close-coupling, and perturbative distorted-wave methods. A second stage collision database is then assembled which contains generalized collisional-radiative ionization, recombination, and power loss rate coefficients as a function of both temperature and density. The second stage database is constructed by solution of the collisional-radiative equations in the quasi-static equilibrium approximation using the first stage database. Both collision database stages reside in electronic form at the IAEA Labeled Atomic Data Interface (ALADDIN) database and the Atomic Data Analysis Structure (ADAS) open database.
Abstract:
Emerging web applications like cloud computing, Big Data and social networks have created the need for powerful data centres hosting hundreds of thousands of servers. Currently, data centres are based on general-purpose processors that provide high flexibility but lack the energy efficiency of customized accelerators. VINEYARD aims to develop an integrated platform for energy-efficient data centres based on new servers with novel, coarse-grain and fine-grain, programmable hardware accelerators. It will also build a high-level programming framework allowing end-users to seamlessly utilize these accelerators in heterogeneous computing systems by employing typical data-centre programming frameworks (e.g. MapReduce, Storm, Spark, etc.). This programming framework will further allow the hardware accelerators to be swapped in and out of the heterogeneous infrastructure so as to offer high flexibility and energy efficiency. VINEYARD will foster the expansion of the soft-IP core industry, currently limited to embedded systems, to the data-centre market. VINEYARD plans to demonstrate the advantages of its approach in three real use cases: (a) a bio-informatics application for high-accuracy brain modelling, (b) two critical financial applications, and (c) a big-data analysis application.
Abstract:
Background There is increasing interest in how culture may affect the quality of healthcare services, and previous research has shown that ‘treatment culture’—of which there are three categories (resident centred, ambiguous and traditional)—in a nursing home may influence prescribing of psychoactive medications. Objective The objective of this study was to explore and understand treatment culture in prescribing of psychoactive medications for older people with dementia in nursing homes. Method Six nursing homes—two from each treatment culture category—participated in this study. Qualitative data were collected through semi-structured interviews with nursing home staff and general practitioners (GPs), which sought to determine participants’ views on prescribing and administration of psychoactive medication, and their understanding of treatment culture and its potential influence on prescribing of psychoactive drugs. Following verbatim transcription, the data were analysed and themes were identified, facilitated by NVivo and discussion within the research team. Results Interviews took place with five managers, seven nurses, 13 care assistants and two GPs. Four themes emerged: the characteristics of the setting, the characteristics of the individual, relationships and decision making. The characteristics of the setting were exemplified by views of the setting, daily routines and staff training. The characteristics of the individual were demonstrated by views on the personhood of residents and staff attitudes. Relationships varied between staff within and outside the home. These relationships appeared to influence decision making about prescribing of medications. The data analysis found that each home exhibited traits that were indicative of its respective assigned treatment culture. Conclusion Nursing home treatment culture appeared to be influenced by four main themes. Modification of these factors may lead to a shift in culture towards a more flexible, resident-centred culture and a reduction in prescribing and use of psychoactive medication.