850 results for High-dimensional data visualization


Relevance: 100.00%

Abstract:

Observational studies suggest that people with a high serum 25-hydroxyvitamin D (25(OH)D) concentration may have reduced risk of chronic diseases such as osteoporosis, multiple sclerosis, type 1 diabetes, cardiovascular disease, and some cancers. The AusD Study (A Quantitative Assessment of Solar UV Exposure for Vitamin D Synthesis in Australian Adults) was conducted to clarify the relationships between ultraviolet (UV) radiation exposure, dietary intake of vitamin D, and serum 25(OH)D concentration among Australian adults residing in Townsville (19.3°S), Brisbane (27.5°S), Canberra (35.3°S), and Hobart (42.8°S). Participants aged 18-75 years were recruited from the Australian Electoral Roll between 2009 and 2010. Measurements were made of height, weight, waist:hip ratio, skin, hair, and eye color, blood pressure, and grip strength. Participants completed a questionnaire on sun exposure and vitamin D intake, together with 10 days of personal UV dosimetry and an associated sun-exposure and physical-activity diary that was temporally linked to a blood test for measurement of 25(OH)D concentration. Ambient solar UV radiation was also monitored at all study sites. We collected comprehensive, high-quality data from 1,002 participants (459 males, 543 females) assessed simultaneously across a range of latitudes and through all seasons. Here we describe the scientific and methodological issues considered in designing the AusD Study.

Relevance: 100.00%

Abstract:

Creative Statement: “There are those who see Planet Earth as a gigantic living being, one that feeds and nurtures humanity and myriad other species – an entity that must be cared for. Then there are those who see it as a rock full of riches to be pilfered heedlessly in a short-term quest for over-abundance. This ‘cradle to grave’ mentality, it would seem, is taking its toll (unless you’re a virulent disbeliever in climate change). Why not, ask artists Priscilla Bracks and Gavin Sade, take a different approach? To this end they have set out on a near-impossible task: to visualise the staggering quantity of carbon produced by Australia every year. Their eerie, glowing plastic cube resembles something straight out of Dr Who or The X Files. And, like the best science fiction, it has technical realities at its heart. Every One, Every Day tangibly illustrates our greenhouse gas output – its 27 m³ volume is approximately the amount of greenhouse gas emitted per capita, daily. Every One, Every Day is lit by an array of LEDs displaying light patterns representing energy use, generated from data from the Australian Energy Market. Every One, Every Day was formed from recycled polyethylene – used milk bottles – ‘lent’ to the artists by a Visy recycling facility. At the end of the Vivid Festival this plastic will be returned to Visy, where it will re-enter the stream of ‘technical nutrients.’ Could we make another world? One that emulates the continuing cycles of nature? One that uses our ‘technical nutrients’ such as plastic and steel in continual cycles, just like a deciduous tree dropping leaves to compost itself and keep its roots warm and moist?” (Ashleigh Crawford, Melbourne, April 2013)

Artistic Research Statement: The research focus of this work is on exploring how to represent complex statistics and data at a human scale, and how to produce a work where a large percentage of the materials could be recycled. The surface of Every One, Every Day is clad in tiles made from polyethylene, primarily from recycled milk bottles, ‘lent’ to the artists by the Visy recycling facility in Sydney. The tiles will be returned to Visy for recycling. As such, the work can be viewed as an intervention in the industrial ecology of polyethylene, and in the process it demonstrates how to sustain cycles of technical materials – by taking the output of a recycling facility back to a manufacturer to produce usable materials. In terms of data visualisation, Every One, Every Day takes the form of a cube with a volume of 27 cubic meters. The annual per capita emissions figures for Australia are cited as ranging between 18 and 25 tons. Assuming the lower figure of 18 tons per capita annually, the 27 cubic meters represents approximately one day per capita of CO2 emissions, with CO2 treated as a gas at 15 °C and 1 atmosphere of pressure. The work also explores real-time data visualisation by using an array of 600 controllable LEDs inside the cube. Illumination patterns are derived from real-time data from the Australian Energy Market, using the dispatch interval price and demand graph for New South Wales. The two variables of demand and price are mapped to properties of the illumination: hue, brightness, movement, frequency, etc. The research underpinning the project spanned industrial ecology, data visualisation and public art practices. The result is that Every One, Every Day is one of the first public artworks to successfully bring together materials, physical form and real-time data representation in a unified whole.
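
As a quick plausibility check of the cube's scaling (not part of the artists' documentation), the ideal gas law can be used to convert the assumed 18 tons of CO2 per person per year into a daily volume at the stated conditions; the short sketch below, with all constants written out, lands close to the 27 cubic meter figure.

# Back-of-the-envelope check (not from the artists' documentation) that
# ~18 t of CO2 per person per year corresponds to roughly 27 m^3 per day
# when treated as a gas at 15 degrees C and 1 atm.

ANNUAL_CO2_KG = 18_000          # assumed lower-bound per-capita emissions (kg/year)
MOLAR_MASS_CO2 = 0.04401        # kg/mol
R = 8.314                       # J/(mol K)
T = 288.15                      # 15 degrees C in kelvin
P = 101_325                     # 1 atm in pascals

daily_mass = ANNUAL_CO2_KG / 365                 # kg of CO2 per day
moles = daily_mass / MOLAR_MASS_CO2              # mol of CO2 per day
volume_m3 = moles * R * T / P                    # ideal gas law: V = nRT/P

print(f"{volume_m3:.1f} m^3 per person per day")  # ~26.5 m^3, close to the 27 m^3 cube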

Relevance: 100.00%

Abstract:

Large communities built around social media on the Internet offer an opportunity to augment analytical customer relationship management (CRM) strategies. The purpose of this paper is to provide direction to advance the conceptual design of business intelligence (BI) systems for implementing CRM strategies. After introducing social CRM and social BI as emerging fields of research, the authors match CRM strategies with a re-engineered conceptual data model of Facebook in order to illustrate the strategic value of these data. Subsequently, the authors design a multi-dimensional data model for social BI and demonstrate its applicability by designing management reports in a retail scenario. Building on the service blueprinting framework, the authors propose a structured research agenda for the emerging field of social BI.
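
The abstract does not reproduce the multi-dimensional data model itself. As a loose illustration of the kind of structure such a model implies, the sketch below builds a toy fact table of post interactions and one dimension table, then aggregates them into a simple management report; all table and column names are invented for illustration and are not taken from the paper.

# Toy star-schema-style aggregation for social BI.
# Table and column names are invented; the paper's actual model is not reproduced here.
import pandas as pd

# "Fact" table: one row per user interaction with a brand post.
facts = pd.DataFrame({
    "post_id":  [1, 1, 2, 2, 3],
    "user_id":  [10, 11, 10, 12, 11],
    "date":     pd.to_datetime(["2013-05-01", "2013-05-01", "2013-05-02",
                                "2013-05-02", "2013-05-03"]),
    "likes":    [1, 0, 1, 1, 0],
    "comments": [0, 1, 0, 1, 1],
})

# "Dimension" table: product category that each post advertises.
posts = pd.DataFrame({
    "post_id":  [1, 2, 3],
    "category": ["shoes", "shoes", "apparel"],
})

# Typical management-report query: engagement per product category per day.
report = (facts.merge(posts, on="post_id")
               .groupby(["category", "date"])[["likes", "comments"]]
               .sum())
print(report)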

Relevance: 100.00%

Abstract:

Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates energy consumption and generation data of the building into an interactive simulation which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and a negative impact on ecosystems, with both local and global consequences. The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualization with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that have shared interests in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Similarly, Froehlich (2010) and his colleagues use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved.
The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modeling. From a mathematical perspective, the simulation can be divided into two models which interact and compete for balance: the comfort of ECOS’ virtual denizens, and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically those aspects relating to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements are ascertained, they are balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover them. The relative amount of energy produced by wind and solar can be calculated by considering, in the case of solar for example, the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API. Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The variables that can be changed are the budget allocated to green energy sources such as the solar panels and wind generator, and the air conditioning setting that controls the internal building temperature. These variables influence the energy input and output variables, which are modeled on the real energy usage statistics drawn from the SEC data provided by the building managers.
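
A minimal sketch of the kind of energy balance described above is given here: heating/cooling demand grows with the indoor-outdoor temperature difference, and solar generation is estimated from panel area and irradiance. The coefficients, panel size and weather values are placeholders, not the SEC building data or the ECOS implementation.

# Simplified sketch of the kind of energy balance ECOS simulates.
# Coefficients, panel sizes and weather values are placeholders,
# not the actual SEC building data or the project's implementation.

def hvac_demand_kw(outside_c: float, thermostat_c: float,
                   ua_kw_per_c: float = 2.0) -> float:
    """Heating/cooling power needed, proportional to the indoor-outdoor
    temperature difference (a crude lumped heat-transfer model)."""
    return ua_kw_per_c * abs(thermostat_c - outside_c)

def solar_generation_kw(panel_area_m2: float, irradiance_w_m2: float,
                        efficiency: float = 0.18) -> float:
    """Instantaneous PV output estimated from panel area and solar irradiance."""
    return panel_area_m2 * irradiance_w_m2 * efficiency / 1000.0

# Example: a hot day, thermostat set to 23 C, live weather values assumed.
outside_temp = 32.0       # would come from a live weather API in ECOS
irradiance = 850.0        # W/m^2, likewise estimated from live conditions
demand = hvac_demand_kw(outside_temp, thermostat_c=23.0)
supply = solar_generation_kw(panel_area_m2=120.0, irradiance_w_m2=irradiance)

balance = supply - demand
print(f"demand {demand:.1f} kW, solar {supply:.1f} kW, balance {balance:+.1f} kW")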

Relevance: 100.00%

Abstract:

My practice-led research explores and maps workflows for generating experimental creative work involving inertia-based motion capture technology. Motion capture has often been used as a way to bridge animation and dance, resulting in abstracted visual outcomes. In early works this process was largely done by rotoscoping, reference footage and mechanical forms of motion capture. With the evolution of technology, optical and inertial forms of motion capture are now more accessible and able to accurately capture a larger range of complex movements. The creative work titled “Contours in Motion” was the first in a series of studies on captured motion data used to generate experimental visual forms that reverberate in space and time. The source or ‘seed’ comes from using an Xsens MVN inertial motion capture system to capture spontaneous dance movements, with the visual generation conducted through a customised dynamics simulation. The aim of the creative work was to diverge from the standard practice of using particle systems and/or a simple re-targeting of the motion data to drive a 3D character as a means of producing abstracted visual forms. To facilitate this divergence, a virtual dynamic object was tethered to a selection of data points from a captured performance. The properties of the dynamic object were then adjusted to balance the influence of the human movement data against the influence of computer-based randomization. The resulting outcome was a visual form that surpassed simple data visualization to project the intent of the performer’s movements into a visual shape itself. The reported outcomes from this investigation have contributed to a larger study on the use of motion capture in the generative arts, furthering the understanding of, and generating theories on, practice.
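
The following is a minimal sketch, not the Contours in Motion implementation, of how a virtual dynamic object can be tethered to a captured data point: a spring-damper pulls the object toward the recorded position while random perturbation keeps the generated form from simply re-tracing the data. The trajectory and parameters are invented for illustration.

# Minimal sketch of tethering a virtual dynamic object to one motion-capture
# data point: a spring-damper pulls the object toward the captured position
# while random perturbation keeps the form from simply re-tracing the data.
# The toy trajectory and all parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
dt, k, c, noise = 1 / 60, 40.0, 4.0, 0.3   # time step, stiffness, damping, noise scale

# Stand-in for one captured marker trajectory (e.g. a hand), 5 seconds at 60 fps.
t = np.arange(0, 5, dt)
target = np.stack([np.sin(t), np.cos(0.5 * t), 0.2 * t], axis=1)

pos = target[0].copy()
vel = np.zeros(3)
trace = []
for target_pos in target:
    force = k * (target_pos - pos) - c * vel + noise * rng.standard_normal(3)
    vel += force * dt                       # semi-implicit Euler integration
    pos += vel * dt
    trace.append(pos.copy())

trace = np.array(trace)                     # the generated "contour" in space
print(trace.shape, trace[-1])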

Relevance: 100.00%

Abstract:

This contribution outlines synchrotron-based X-ray micro-tomography and its potential use in structural geology and rock mechanics. The paper complements several recent reviews of X-ray micro-tomography. We summarize the general approach to data acquisition, post-processing and analysis, and thereby aim to provide an entry point for the interested reader. The paper includes tables listing relevant beamlines, available imaging techniques, and free and commercial software packages for data visualization and quantification. We highlight potential applications in a review of relevant literature, including time-resolved experiments and digital rock physics. The paper concludes with a report on ongoing developments and upgrades at synchrotron facilities, to frame the future possibilities for imaging sub-second processes in centimetre-sized samples.

Relevance: 100.00%

Abstract:

In the current era of global economic instability, business and industry have already identified a widening gap between graduate skills and employability. An important element of this is the lack of entrepreneurial skills in graduates. This Teaching Fellowship investigated two sides of a story about entrepreneurial skills and their teaching. Senior players in the innovation commercialisation industry, a high-profile entrepreneurial sector, were surveyed to gauge their needs and their experiences of the graduates they employ. International contexts of entrepreneurship education were investigated to explore how their teaching programs impart the skills of entrepreneurship. Such knowledge is essential for the design of education programs that can deliver the entrepreneurial skills deemed important by industry for future sustainability. Two programs of entrepreneurship education are being implemented at QUT that draw on the best-practice exemplars investigated during this Fellowship. The QUT Innovation Space (QIS) focuses on capturing the innovation and creativity of students, staff and others. The QIS is a physical and virtual meeting and networking space: a connected community enhancing the engagement of participants. The Q_Hatchery is still embryonic, but it is intended to be an innovation community that brings together nascent entrepreneurial businesses to collaborate, train and support each other. There is a niche between concept product and business incubator where an experiential learning environment for otherwise isolated ‘garage-at-home’ businesses could improve success rates. The QIS and the Q_Hatchery serve as living research laboratories to trial the concepts emerging from the skills survey. The survey of the skills requirements of the innovation commercialisation industry has produced a large, high-quality data set that is still being explored. Work experience as an employability factor has already emerged as an industry requirement that provides employee maturity. Exploratory factor analysis of the skills topics surveyed has led to a process-based conceptual model for teaching and learning higher-order entrepreneurial skills. Two foundational skills domains (Knowledge, Awareness) are proposed as prerequisites which allow individuals with a suite of early-stage entrepreneurial and behavioural skills (Pre-leadership) to further leverage their careers into a leadership role in industry, with development of skills around higher-order elements of entrepreneurship, management in new business ventures and progressing winning technologies to market. The next stage of the analysis is to test the proposed model through structural equation modelling. Another factor that emerged quickly from the survey analysis broadens the generic concept of team skills currently voiced in Australian policy documents discussing the employability agenda. While there was recognition of the role of sharing, creating and using knowledge in a team-based interdisciplinary context, the adoption and adaptation of the behaviours and attitudes of team members from different disciplinary backgrounds (interprofessionalism) featured as an issue. Most undergraduates are taught and undertake teamwork in silos and thus seldom experience a true real-world interdisciplinary environment. Enhancing the entrepreneurial capacity of Australian industry is essential for the economic health of the country and can only be achieved by addressing the lack of entrepreneurial skills in graduates from the higher education system. This Fellowship has attempted to address this deficiency by identifying the skills requirements and providing frameworks for their teaching.

Relevance: 100.00%

Abstract:

Functional MRI studies commonly refer to activation patterns as being localized in specific Brodmann areas, referring to Brodmann’s divisions of the human cortex based on cytoarchitectonic boundaries [3]. Typically, the Brodmann areas that match regions in the group-averaged functional maps are estimated by eye, leading to inaccurate parcellations and significant error. To avoid this limitation, we developed a method using high-dimensional nonlinear registration to project the Brodmann areas onto individual 3D co-registered structural and functional MRI datasets, using an elastic deformation vector field in the cortical parameter space. Based on a sulcal pattern matching approach [11], an N=27-scan single-subject atlas (the Colin Holmes atlas [15]), with its associated Brodmann areas labeled on its surface, was deformed to match 3D cortical surface models generated from individual subjects’ structural MRIs (sMRIs). The deformed Brodmann areas were used to quantify and localize functional MRI (fMRI) BOLD activation during performance of the Tower of London task [7].
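
The study's method is surface-based, matching sulcal patterns and warping labels in the cortical parameter space. As a simplified volumetric analogue only, the sketch below resamples a synthetic atlas label volume through a displacement field, using nearest-neighbour interpolation so that label values are never blended.

# Simplified volumetric analogue of projecting atlas labels through a
# deformation field (the study itself uses surface-based elastic warping).
# The label volume and displacement field here are synthetic placeholders.
import numpy as np
from scipy.ndimage import map_coordinates

shape = (32, 32, 32)
atlas_labels = np.zeros(shape, dtype=np.int16)
atlas_labels[8:16, 8:16, 8:16] = 44            # pretend "Brodmann area 44" block

# Displacement field (3, x, y, z): a small constant shift as a stand-in for
# the vector field produced by nonlinear registration.
disp = np.zeros((3,) + shape)
disp[0] += 2.5                                  # shift 2.5 voxels along x

grid = np.indices(shape).astype(float)
sample_coords = grid + disp                     # where each output voxel looks up its label

# order=0 -> nearest-neighbour, so labels are not blended into meaningless averages.
warped_labels = map_coordinates(atlas_labels, sample_coords, order=0, mode="nearest")
print(np.unique(warped_labels), int((warped_labels == 44).sum()), "voxels labelled 44")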

Relevance: 100.00%

Abstract:

There is an increasing need in biology and clinical medicine to robustly and reliably measure tens-to-hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma, and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and 7 control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high-quality data we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to sub-nanogram/mL sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and inter-laboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy isotope labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an inter-laboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible and quantitative measurements of proteins and peptides in complex biological matrices such as plasma.
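
Two of the routine calculations behind such assays can be written in a few lines: single-point isotope-dilution quantification from the light-to-heavy peak-area ratio against a spiked heavy-labeled internal standard, and the coefficient of variation used to summarise inter-laboratory reproducibility. The numbers below are made up for illustration and are not values from the study.

# Generic calculations used with stable-isotope-dilution MRM assays:
# (1) quantify an analyte from the light/heavy peak-area ratio against a
#     spiked heavy-labeled internal standard, and
# (2) summarise inter-laboratory reproducibility as a coefficient of variation.
# All numbers below are made up for illustration.
import statistics

def concentration(light_area: float, heavy_area: float,
                  heavy_spike_ng_ml: float) -> float:
    """Single-point isotope-dilution estimate: analyte conc. = ratio x spike."""
    return (light_area / heavy_area) * heavy_spike_ng_ml

# Peak areas for one peptide measured in the same sample at several labs.
light = [8.2e5, 7.9e5, 8.8e5, 7.5e5]
heavy = [4.1e5, 4.0e5, 4.2e5, 3.9e5]
conc = [concentration(l, h, heavy_spike_ng_ml=25.0) for l, h in zip(light, heavy)]

cv_percent = 100 * statistics.stdev(conc) / statistics.mean(conc)
print([round(c, 1) for c in conc], f"inter-lab CV = {cv_percent:.1f}%")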

Relevance: 100.00%

Abstract:

In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, due to inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero freshwater discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADVs) sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS)-tracked drifters were used to obtain lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to the individual data sets to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and ‘true’ turbulence to the flow field. The time series of mean flow measurements for both the ADCP and the drifters were consistent with those of the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to the tidal inertial current. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The results of the statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies with a significant proportion related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell between -0.5 and +2. The TD technique described herein allowed the characterisation of a broader range of temporal scales of fluctuation in the high-frequency data sampled over a few tidal cycles. The study provides a characterisation of the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
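
One simple way to realise a triple decomposition, sketched below with a synthetic record, is to low-pass the signal for the tidal component, low-pass the remainder on a shorter window for the slow fluctuations, and keep the residual as 'true' turbulence; here moving averages act as the filters. The averaging windows are placeholders rather than the cutoff scales used in the study.

# Minimal sketch of a triple decomposition of a velocity record into
# tidal, slow-fluctuation and turbulent components using moving averages.
# The synthetic signal and averaging windows are placeholders; the study's
# actual cutoff scales are not reproduced here.
import numpy as np
from scipy.ndimage import uniform_filter1d

fs = 50.0                                   # ADV sampling rate (Hz)
t = np.arange(0, 3 * 3600, 1 / fs)          # three hours of record
rng = np.random.default_rng(1)

# Synthetic signal: tidal oscillation + slow fluctuation + turbulence.
u = (0.40 * np.sin(2 * np.pi * t / (12.42 * 3600))
     + 0.10 * np.sin(2 * np.pi * t / 900)
     + 0.05 * rng.standard_normal(t.size))

def moving_average(x, window_s):
    return uniform_filter1d(x, size=int(window_s * fs), mode="nearest")

u_tide = moving_average(u, window_s=3600)            # tidal-scale variation only
u_slow = moving_average(u - u_tide, window_s=60)     # minute-scale slow fluctuations
u_turb = u - u_tide - u_slow                          # residual "true" turbulence

for name, comp in [("tide", u_tide), ("slow", u_slow), ("turb", u_turb)]:
    print(name, f"std = {comp.std():.3f} m/s")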

Relevance: 100.00%

Abstract:

Studies of cerebral asymmetry can open doors to understanding the functional specialization of each brain hemisphere, and how this is altered in disease. Here we examined hemispheric asymmetries in fiber architecture using diffusion tensor imaging (DTI) in 100 subjects, using high-dimensional fluid warping to disentangle shape differences from measures sensitive to myelination. Confounding effects of purely structural asymmetries were reduced by using co-registered structural images to fluidly warp 3D maps of fiber characteristics (fractional and geodesic anisotropy) to a structurally symmetric minimal deformation template (MDT). We performed a quantitative genetic analysis on 100 subjects to determine whether the sources of the remaining signal asymmetries were primarily genetic or environmental. A twin design was used to identify the heritable features of fiber asymmetry in various regions of interest, to further assist in the discovery of genes influencing brain micro-architecture and brain lateralization. Genetic influences and left/right asymmetries were detected in the fiber architecture of the frontal lobes, with minor differences depending on the choice of registration template.
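
Both anisotropy measures have standard closed forms in terms of the tensor eigenvalues: FA compares eigenvalues in the Euclidean sense, while GA measures the distance from the nearest isotropic tensor in the log domain of the symmetric positive-definite manifold. The sketch below computes both for one arbitrary example tensor.

# Standard per-voxel anisotropy measures from a diffusion tensor's eigenvalues:
# fractional anisotropy (FA) in the Euclidean sense, and geodesic anisotropy (GA)
# from the log-eigenvalues. The example tensor is arbitrary, for illustration only.
import numpy as np

D = np.array([[1.7, 0.1, 0.0],
              [0.1, 0.4, 0.0],
              [0.0, 0.0, 0.3]]) * 1e-3        # example tensor, units mm^2/s

evals = np.linalg.eigvalsh(D)

md = evals.mean()                              # mean diffusivity
fa = np.sqrt(1.5 * np.sum((evals - md) ** 2) / np.sum(evals ** 2))

log_evals = np.log(evals)                      # GA: deviation of log-eigenvalues
ga = np.sqrt(np.sum((log_evals - log_evals.mean()) ** 2))

print(f"FA = {fa:.3f}, GA = {ga:.3f}")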

Relevance: 100.00%

Abstract:

We used diffusion tensor magnetic resonance imaging (DTI) to reveal the extent of genetic effects on brain fiber microstructure, based on tensor-derived measures, in 22 pairs of monozygotic (MZ) twins and 23 pairs of dizygotic (DZ) twins (90 scans). After Log-Euclidean denoising to remove rank-deficient tensors, DTI volumes were fluidly registered by high-dimensional mapping of co-registered MP-RAGE scans to a geometrically-centered mean neuroanatomical template. After tensor reorientation using the strain of the 3D fluid transformation, we computed two widely used scalar measures of fiber integrity: fractional anisotropy (FA), and geodesic anisotropy (GA), which measures the geodesic distance between tensors in the symmetric positive-definite tensor manifold. Spatial maps of intraclass correlations (r) for MZ and DZ twin pairs were compared to compute maps of Falconer's heritability statistics, i.e. the proportion of population variance explainable by genetic differences among individuals. Cumulative distribution plots (CDFs) of effect sizes showed that the manifold measure, GA, performed comparably to the Euclidean measure, FA, in detecting genetic correlations. While the maps were relatively noisy, the CDFs showed promise for detecting genetic influences on brain fiber integrity as the current sample expands.
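
Falconer's estimate used in such voxel-wise twin maps is h^2 = 2 * (r_MZ - r_DZ), the excess resemblance of MZ over DZ pairs doubled; the sketch below applies it to two tiny synthetic correlation maps.

# Falconer's heritability estimate as used in voxel-wise twin maps:
# h^2 = 2 * (r_MZ - r_DZ), clipped to the valid [0, 1] range.
# The two "correlation maps" here are tiny synthetic arrays for illustration.
import numpy as np

r_mz = np.array([[0.80, 0.55], [0.62, 0.30]])   # intraclass correlations, MZ pairs
r_dz = np.array([[0.45, 0.40], [0.33, 0.25]])   # intraclass correlations, DZ pairs

h2 = np.clip(2.0 * (r_mz - r_dz), 0.0, 1.0)     # proportion of variance attributed to genes
print(h2)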

Relevance: 100.00%

Abstract:

IODP Expedition 340 successfully drilled a series of sites offshore Montserrat, Martinique and Dominica in the Lesser Antilles from March to April 2012. These are among the few drill sites gathered around volcanic islands, and the first scientific drilling of large and likely tsunamigenic volcanic island-arc landslide deposits. These cores provide evidence for, and tests of, previous hypotheses about the composition and origin of those deposits. Sites U1394, U1399, and U1400, which penetrated landslide deposits, recovered exclusively seafloor sediment, comprising mainly turbidites and hemipelagic deposits, and lacked debris avalanche deposits. This supports the concepts that (i) volcanic debris avalanches tend to stop at the slope break, and (ii) widespread and voluminous failures of preexisting low-gradient seafloor sediment can be triggered by the initial emplacement of material from the volcano. Offshore Martinique (U1399 and U1400), the landslide deposits comprised blocks of parallel strata that were tilted or microfaulted, sometimes separated by intervals of homogenized sediment (intense shearing), while Site U1394 offshore Montserrat penetrated a flat-lying block of intact strata. The most likely mechanism for generating these large-scale seafloor sediment failures appears to be propagation of a decollement from proximal areas loaded and incised by a volcanic debris avalanche. These results have implications for the magnitude of tsunami generation. Under some conditions, volcanic island landslide deposits composed mainly of seafloor sediment will tend to form smaller-magnitude tsunamis than equivalent volumes of subaerial block-rich mass flows rapidly entering water. Expedition 340 also successfully drilled sites to access the undisturbed record of eruption fallout layers intercalated with marine sediment, which provides an outstanding high-resolution data set for analyzing eruption and landslide cycles, improving understanding of magmatic evolution, and studying offshore sedimentation processes.

Relevance: 100.00%

Abstract:

As financial markets have become increasingly integrated internationally, the topic of volatility transmission across these markets has become more important. This thesis investigates how the volatility patterns of the world's main financial centres differ across foreign exchange, equity, and bond markets.

Relevance: 100.00%

Abstract:

High-resolution data from the TRMM satellite shows that sea surface temperature (SST) cools by 3 degrees C under the tracks of pre-monsoon tropical cyclones in the north Indian Ocean. However, even the strongest post-monsoon cyclones do not cool the open north Bay of Bengal. In this region, a shallow layer of freshwater from river runoff and monsoon rain caps a deep warm layer. Therefore, storm-induced mixing is not deep, and it entrains warm subsurface water. It is possible that the hydrography of the post-monsoon north Bay favours intense cyclones.