956 results for dynamic analysis


Relevance: 30.00%

Abstract:

This paper describes an experiment developed to study the performance of virtual agent animated cues within digital interfaces. Increasingly, agents are used in virtual environments as part of the branding process and to guide user interaction. However, the level of agent detail required to establish and enhance efficient allocation of attention remains unclear. Although complex agent motion is now possible, it is costly to implement and so should only be routinely implemented if a clear benefit can be shown. Previous methods of assessing the effect of gaze-cueing as a solution to scene complexity have relied principally on two-dimensional static scenes and manual peripheral inputs. Two experiments were run to address the question of agent cues on human-computer interfaces. Both experiments measured the efficiency of agent cues by analyzing participant responses, by gaze and by touch respectively. In the first experiment, an eye-movement recorder was used to directly assess the immediate overt allocation of attention by capturing the participant's eye fixations following presentation of a cueing stimulus. We found that a fully animated agent could speed up user interaction with the interface. When user attention was directed using a fully animated agent cue, users responded 35% faster than with stepped 2-image agent cues, and 42% faster than with a static 1-image cue. The second experiment recorded participant responses on a touch screen using the same agent cues. Analysis of touch inputs confirmed the results of the gaze experiment: the fully animated agent again produced the shortest response times, though with slightly smaller differences between conditions. Responses to the fully animated agent were 17% and 20% faster than to the 2-image and 1-image cues respectively.
These results inform techniques aimed at engaging users’ attention in complex scenes such as computer games and digital transactions within public or social interaction contexts by demonstrating the benefits of dynamic gaze and head cueing directly on the users’ eye movements and touch responses.

Relevance: 30.00%

Abstract:

Reflective modulators based on the combination of an electroabsorption modulator (EAM) and semiconductor optical amplifier (SOA) are attractive devices for applications in long reach carrier distributed passive optical networks (PONs) due to the gain provided by the SOA and the high speed, low chirp modulation of the EAM. Integrated R-EAM-SOAs have experimentally shown two unexpected and unintuitive characteristics which are not observed in a single pass transmission SOA: the clamping of the output power of the device around a maximum value, and low patterning distortion despite the SOA being in a regime of gain saturation. In this thesis a detailed analysis is carried out using both experimental measurements and modelling in order to understand these phenomena. For the first time it is shown that both the internal loss between the SOA and R-EAM and the SOA gain play an integral role in the behaviour of gain saturated R-EAM-SOAs. Internal loss and SOA gain are also optimised for use in carrier distributed PONs in order to access both the positive effect of output power clamping, and hence upstream dynamic range reduction, and low patterning operation of the SOA. Reflective concepts are also gaining interest for metro transport networks and short reach, high bit rate, inter-datacentre links. Moving the optical carrier generation away from the transmitter also has potential advantages for these applications as it avoids the need for cooled photonics being placed directly on hot router line-cards. A detailed analysis is carried out in this thesis on a novel colourless reflective duobinary modulator, which would enable wavelength flexibility in a power-efficient reflective metro node.

Relevance: 30.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes) and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of techniques introduces another layer of heterogeneity, namely heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. This raises the question: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques on the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and to realise this approach a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real world need to provide a solution for this domain.
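The production–interpretation–consumption workflow with a maintainable provenance record can be sketched in a few lines (an illustrative toy, not the platform described above; all class and field names are invented):

```python
import hashlib
import json

class ProvenanceStore:
    """Run analyses over the same raw data while keeping a record that
    ties each result to the exact input, so a third party can re-derive
    and verify any conclusion from the log alone. (Illustrative sketch.)"""

    def __init__(self):
        self.records = []

    def run(self, name, func, raw_data):
        result = func(raw_data)
        # Hash the serialised input so the record pins down what was analysed.
        digest = hashlib.sha256(json.dumps(raw_data).encode()).hexdigest()
        self.records.append({
            "analysis": name,
            "input_sha256": digest,
            "result": result,
        })
        return result

store = ProvenanceStore()
mean = store.run("mean", lambda xs: sum(xs) / len(xs), [1, 2, 3, 4])
```

Two different techniques applied to the same raw data would then produce records with identical `input_sha256` values, making the shared provenance explicit.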

Relevance: 30.00%

Abstract:

Twitter has changed the dynamic of the academic conference. Before Twitter, delegate participation was primarily dependent on attendance and feedback was limited to post-event surveys. With Twitter, delegates have become active participants. They pass comment, share reactions and critique presentations, all the while generating a running commentary. This study examines this phenomenon using the Academic & Special Libraries (A&SL) conference 2015 (hashtag #asl2015) as a case study. A post-conference survey asked delegates how and why they used Twitter at #asl2015. A content and conceptual analysis of tweets was conducted using Topsy and Storify. This analysis examined how delegates interacted with presentations, which sessions generated the most activity on the timeline and the type of content shared. Actual tweet activity and volume per presentation were compared to survey responses. Finally, recommendations on Twitter engagement for conference organisers and presenters are provided.

Relevance: 30.00%

Abstract:

We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.
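The core Metropolis-Hastings step is standard; a minimal sketch follows, targeting a two-component normal mixture as a stand-in for a smoothing density (Python rather than the paper's MATLAB, and a simple random-walk proposal rather than the authors' propagated mixture proposals):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy non-Gaussian posterior: equal-weight mixture of N(-2, 1) and
    # N(2, 1), standing in for a smoothing density over a latent state.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def metropolis_hastings(n_iter=20000, step=2.5):
    x = 0.0
    samples = np.empty(n_iter)
    for i in range(n_iter):
        prop = x + step * rng.standard_normal()  # symmetric random-walk proposal
        # Accept with probability min(1, target(prop) / target(x)).
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples[i] = x
    return samples

draws = metropolis_hastings()
```

The paper's contribution is precisely to replace the naive random-walk proposal here with a global proposal built from sequentially approximated filtering/smoothing mixtures, which is what makes the sampler efficient for long state sequences.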

Relevance: 30.00%

Abstract:

BACKGROUND: Vertebrate skin appendages are constructed of keratins produced by multigene families. Alpha (α) keratins are found in all vertebrates, while beta (β) keratins are found exclusively in reptiles and birds. We have studied the molecular evolution of these gene families in the genomes of 48 phylogenetically diverse birds and their expression in the scales and feathers of the chicken. RESULTS: We found that the total number of α-keratins is lower in birds than mammals and non-avian reptiles, yet two α-keratin genes (KRT42 and KRT75) have expanded in birds. The β-keratins, however, demonstrate a dynamic evolution associated with avian lifestyle. The avian specific feather β-keratins comprise a large majority of the total number of β-keratins, but independently derived lineages of aquatic and predatory birds have smaller proportions of feather β-keratin genes and larger proportions of keratinocyte β-keratin genes. Additionally, birds of prey have a larger proportion of claw β-keratins. Analysis of α- and β-keratin expression during development of chicken scales and feathers demonstrates that while α-keratins are expressed in these tissues, the number and magnitude of expressed β-keratin genes far exceeds that of α-keratins. CONCLUSIONS: These results support the view that the number of α- and β-keratin genes expressed, the proportion of the β-keratin subfamily genes expressed and the diversification of the β-keratin genes have been important for the evolution of the feather and the adaptation of birds into multiple ecological niches.

Relevance: 30.00%

Abstract:

How do separate neural networks interact to support complex cognitive processes such as remembrance of the personal past? Autobiographical memory (AM) retrieval recruits a consistent pattern of activation that potentially comprises multiple neural networks. However, it is unclear how such large-scale neural networks interact and are modulated by properties of the memory retrieval process. In the present functional MRI (fMRI) study, we combined independent component analysis (ICA) and dynamic causal modeling (DCM) to understand the neural networks supporting AM retrieval. ICA revealed four task-related components consistent with the previous literature: 1) medial prefrontal cortex (PFC) network, associated with self-referential processes, 2) medial temporal lobe (MTL) network, associated with memory, 3) frontoparietal network, associated with strategic search, and 4) cingulooperculum network, associated with goal maintenance. DCM analysis revealed that the medial PFC network drove activation within the system, consistent with the importance of this network to AM retrieval. Additionally, memory accessibility and recollection uniquely altered connectivity between these neural networks. Recollection modulated the influence of the medial PFC on the MTL network during elaboration, suggesting that greater connectivity among subsystems of the default network supports greater re-experience. In contrast, memory accessibility modulated the influence of frontoparietal and MTL networks on the medial PFC network, suggesting that ease of retrieval involves greater fluency among the multiple networks contributing to AM. These results show the integration between neural networks supporting AM retrieval and the modulation of network connectivity by behavior.

Relevance: 30.00%

Abstract:

The frequency and severity of extreme events are tightly associated with the variance of precipitation. As climate warms, the acceleration in the hydrological cycle is likely to enhance the variance of precipitation across the globe. However, due to the lack of an effective analysis method, the mechanisms responsible for changes in precipitation variance are poorly understood, especially on regional scales. Our study fills this gap by formulating a variance partition algorithm, which explicitly quantifies the contributions of atmospheric thermodynamics (specific humidity) and dynamics (wind) to the changes in regional-scale precipitation variance. Taking Southeastern (SE) United States (US) summer precipitation as an example, the algorithm is applied to simulations of current and future climate by phase 5 of the Coupled Model Intercomparison Project (CMIP5) models. The analysis suggests that, compared to observations, most CMIP5 models (~60%) tend to underestimate the summer precipitation variance over the SE US during 1950–1999, primarily due to errors in the modeled dynamic processes (i.e. large-scale circulation). Among the 18 CMIP5 models analyzed in this study, six reasonably simulate SE US summer precipitation variance in the twentieth century and the underlying physical processes; these models are thus applied for a mechanistic study of future changes in SE US summer precipitation variance. In the future, the six models collectively project an intensification of SE US summer precipitation variance, resulting from the combined effects of atmospheric thermodynamics and dynamics. Of the two, the latter plays the more important role. Specifically, thermodynamics results in more frequent and intensified wet summers, but does not contribute to the projected increase in the frequency and intensity of dry summers.
In contrast, atmospheric dynamics explains the projected enhancement in both wet and dry summers, indicating its importance in understanding future climate change over the SE US. The results suggest that the intensified SE US summer precipitation variance is not a purely thermodynamic response to greenhouse gas forcing, and cannot be explained without the contribution of atmospheric dynamics. Our analysis provides important insights into the mechanisms of SE US summer precipitation variance change. The algorithm formulated in this study can be easily applied to other regions and seasons to systematically explore the mechanisms responsible for changes in precipitation extremes in a warming climate.
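A first-order variance partition of this kind can be illustrated with synthetic series (the scaling P ~ q·w and every number below are assumptions for illustration, not the paper's exact algorithm):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical seasonal series: specific humidity q (thermodynamic factor)
# and a circulation proxy w (dynamic factor); precipitation scales as P ~ q*w.
q = 10.0 + 1.0 * rng.standard_normal(500)
w = 5.0 + 2.0 * rng.standard_normal(500)
P = q * w

# Linearise about climatological means:  P' ≈ w̄ q' + q̄ w', so
#   Var(P) ≈ w̄² Var(q)  +  q̄² Var(w)  +  2 q̄ w̄ Cov(q, w)
#             (thermo)      (dynamic)        (covariance)
qb, wb = q.mean(), w.mean()
thermo = wb ** 2 * np.var(q)
dynamic = qb ** 2 * np.var(w)
covar = 2.0 * qb * wb * np.cov(q, w)[0, 1]

approx = thermo + dynamic + covar  # compare with np.var(P)
```

With these invented numbers the dynamic term dominates, mirroring the qualitative conclusion of the abstract; the point of the sketch is only the partition itself.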

Relevance: 30.00%

Abstract:

A procedure for evaluating the dynamic structural response of elastic solid domains is presented. A prerequisite for the analysis of dynamic fluid–structure interaction is the use of a consistent set of finite volume (FV) methods on a single unstructured mesh. This paper describes a three-dimensional (3D) FV, vertex-based method for dynamic solid mechanics. A novel Newmark predictor–corrector implicit scheme was developed to provide time accurate solutions and the scheme was evaluated on a 3D cantilever problem. By employing a small amount of viscous damping, very accurate predictions of the fundamental natural frequency were obtained with respect to both the amplitude and period of oscillation. This scheme has been implemented into the multi-physics modelling software framework, PHYSICA, for later application to full dynamic fluid structure interaction.
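The Newmark scheme itself is classical; a minimal sketch for a single degree of freedom shows the average-acceleration variant that a predictor-corrector arrangement like the one above builds on (the FV, vertex-based spatial discretisation is omitted entirely):

```python
import numpy as np

def newmark_sdof(m, c, k, u0, v0, dt, n, beta=0.25, gamma=0.5):
    """Implicit Newmark time integration for m*u'' + c*u' + k*u = 0.

    beta=1/4, gamma=1/2 is the unconditionally stable average-acceleration
    variant. (Single-DOF sketch only, not the paper's 3D FV scheme.)"""
    u, v = u0, v0
    a = -(c * v + k * u) / m                      # consistent initial acceleration
    hist = [u]
    keff = m / (beta * dt ** 2) + gamma * c / (beta * dt) + k
    for _ in range(n):
        # Effective load assembled from the previous step's state.
        f_eff = (m * (u / (beta * dt ** 2) + v / (beta * dt)
                      + (0.5 / beta - 1) * a)
                 + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                        + dt * (0.5 * gamma / beta - 1) * a))
        u_new = f_eff / keff
        a_new = (u_new - u) / (beta * dt ** 2) - v / (beta * dt) \
                - (0.5 / beta - 1) * a
        v_new = v + dt * ((1 - gamma) * a + gamma * a_new)
        u, v, a = u_new, v_new, a_new
        hist.append(u)
    return np.array(hist)

# Undamped oscillator with period T = 2*pi*sqrt(m/k) = 1; integrate one period.
resp = newmark_sdof(m=1.0, c=0.0, k=4.0 * np.pi ** 2,
                    u0=1.0, v0=0.0, dt=0.01, n=100)
```

For the undamped case the scheme is non-dissipative, so after one full period the displacement returns very close to its initial value, which is the kind of amplitude/period accuracy the cantilever evaluation above checks for.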

Relevance: 30.00%

Abstract:

Computational modelling of dynamic fluid–structure interaction (DFSI) is a considerable challenge. Our approach to this class of problems involves the use of a single software framework for all the phenomena involved, employing finite volume methods on unstructured meshes in three dimensions. This method enables time and space accurate calculations in a consistent manner. One key application of DFSI simulation is the analysis of the onset of flutter in aircraft wings, where the work of Yates et al. [Measured and Calculated Subsonic and Transonic Flutter Characteristics of a 45° Sweptback Wing Planform in Air and Freon-12 in the Langley Transonic Dynamic Tunnel. NASA Technical Note D-1616, 1963] on the AGARD 445.6 wing planform still provides the most comprehensive benchmark data available. This paper presents the results of a significant effort to model the onset of flutter for the AGARD 445.6 wing planform geometry. A series of key issues needs to be addressed for this computational approach.
• The advantage of using a single mesh, in order to eliminate numerical problems when applying boundary conditions at the fluid–structure interface, is counteracted by the challenge of generating a suitably high quality mesh in both the fluid and structural domains.
• The computational effort for this DFSI procedure, in terms of run time and memory requirements, is very significant. Practical simulations require even finer meshes and shorter time steps, requiring parallel implementation for operation on large, high performance parallel systems.
• The consistency and completeness of the AGARD data in the public domain is inadequate for use in the validation of DFSI codes when predicting the onset of flutter.

Relevance: 30.00%

Abstract:

This paper presents innovative work in the development of policy-based autonomic computing. The core of the work is a powerful and flexible policy-expression language AGILE, which facilitates run-time adaptable policy configuration of autonomic systems. AGILE also serves as an integrating platform for other self-management technologies including signal processing, automated trend analysis and utility functions. Each of these technologies has specific advantages and applicability to different types of dynamic adaptation. The AGILE platform enables seamless interoperability of the different technologies to each perform various aspects of self-management within a single application. The various technologies are implemented as object components. Self-management behaviour is specified using the policy language semantics to bind the various components together as required. Since the policy semantics support run-time re-configuration, the self-management architecture is dynamically composable. Additional benefits include the standardisation of the application programmer interface, terminology and semantics, and only a single point of embedding is required.
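The idea of binding self-management components through run-time replaceable policy rules can be sketched as follows (the rule format and class names are invented for illustration; this is not AGILE's actual syntax or API):

```python
# Hypothetical self-management components, each wrapping one technology
# (trend analysis, utility functions, ...) behind a common interface.
class TrendAnalyser:
    def act(self, metric):
        return f"smoothing {metric}"

class UtilityOptimiser:
    def act(self, metric):
        return f"optimising {metric}"

class PolicyEngine:
    """Binds components via an ordered list of (predicate, component)
    rules; the rule list can be swapped out at run time."""

    def __init__(self):
        self.rules = []

    def add_rule(self, predicate, component):
        self.rules.append((predicate, component))

    def replace_rules(self, rules):
        self.rules = list(rules)   # run-time re-configuration point

    def manage(self, metric, value):
        for predicate, component in self.rules:
            if predicate(value):
                return component.act(metric)
        return "no-op"

engine = PolicyEngine()
engine.add_rule(lambda v: v > 0.8, UtilityOptimiser())
engine.add_rule(lambda v: v > 0.5, TrendAnalyser())
```

Because the rules are plain data held by the engine, `replace_rules` captures the dynamically composable aspect: the same components can be rebound to different conditions without restarting the application.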

Relevance: 30.00%

Abstract:

Purpose – A small size cold crucible offers possibilities for melting various electrically conducting materials with minimal wall contact. Such small samples can be used for express contamination analysis, preparing limited amounts of reactive alloys or experimental material analyses. The aim is to present a model to follow the melting process. Design/methodology/approach – The paper presents a numerical model in which different types of axisymmetric coil configurations are analysed. Findings – The presented numerical model makes it possible to dynamically follow the melting process, the change in the high-frequency magnetic field distribution, the evolution of the free surface and the melting front, and the associated turbulent fluid dynamics. The partially solidified skin in contact with the cold crucible walls and bottom is dynamically predicted. The segmented crucible shape is either cylindrical, hemispherical or arbitrarily shaped. Originality/value – The model presented within the paper permits the analysis of melting times, melt shapes, electrical efficiency and particle tracks.

Relevance: 30.00%

Abstract:

In the context of transdermal drug delivery it is very important to have mechanistic insight into the barrier function of the skin's stratum corneum and the diffusion mechanisms of topically applied drugs. Spectroscopic imaging techniques are now evolving which enable a spatial examination of various types of samples in a dynamic way. ATR-FTIR imaging opens up the possibility of monitoring spatial diffusion profiles across the stratum corneum of a skin sample. Multivariate data analysis methods based on factor analysis can provide insight into the large number of spectroscopically complex and highly overlapping signals generated. Multivariate target factor analysis was used for spectral resolution and to extract local diffusion profiles over time through the stratum corneum. A model drug, 4-cyanophenol in polyethylene glycol 600 and water, was studied. Results indicate that the average diffusion profiles at spatially different locations are similar despite the heterogeneous nature of the biological sample and the challenging experimental set-up.
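The target-projection step at the heart of target factor analysis can be illustrated on synthetic spectra (the band shapes, mixing profiles and noise level below are invented; real ATR-FTIR data are of course far richer):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic ATR-FTIR-like data: two overlapping component "spectra"
# mixed over time, standing in for drug and vehicle absorption bands.
wn = np.linspace(0.0, 1.0, 200)                     # pseudo-wavenumber axis
s_drug = np.exp(-((wn - 0.35) / 0.05) ** 2)
s_vehicle = np.exp(-((wn - 0.55) / 0.08) ** 2)
S = np.vstack([s_drug, s_vehicle])                  # (2, n_wavenumbers)

t = np.linspace(0.0, 1.0, 50)
C_true = np.vstack([1 - np.exp(-3 * t),             # drug diffusing in
                    0.5 + 0.2 * t]).T               # (n_times, 2)
D = C_true @ S + 0.01 * rng.standard_normal((50, 200))   # data matrix

# Project the data onto the known target spectra by least squares to
# recover concentration-vs-time (diffusion) profiles.
C_hat, *_ = np.linalg.lstsq(S.T, D.T, rcond=None)
C_hat = C_hat.T                                     # (n_times, 2)
```

Even with overlapping bands, the least-squares projection separates the two concentration profiles cleanly, which is the essential mechanism by which highly overlapping spectral signals yield interpretable diffusion curves.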

Relevance: 30.00%

Abstract:

This study examines the roll-out of a collaborative information repository or 'knowledge-base' in a medium-sized UK professional services firm over a six year period. Data from usage logs provides the basis for analysis of the dynamic evolution of social networks around the repository during this time. The adoption pattern follows an 's-curve' and usage exhibits something of a power law distribution, both attributable to network effects; network position is associated with organisational performance on a number of indicators. However, periodicity in usage is evident and the usage distribution displays an exponential cut-off. Fourier analysis provides some evidence of mathematical complexity in the periodicity. Some implications of complex patterns in social network data for research and management are discussed.
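The Fourier step can be illustrated on a synthetic usage series with a logistic ("s-curve") adoption trend and a weekly cycle (all numbers invented; here the trend is subtracted exactly, whereas real logs would need it fitted first):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two years of daily access counts: logistic adoption trend plus a
# 7-day cycle, standing in for knowledge-base usage logs.
days = np.arange(730)
trend = 100.0 / (1.0 + np.exp(-(days - 365) / 60.0))
weekly = 10.0 * np.sin(2.0 * np.pi * days / 7.0)
usage = trend + weekly + rng.standard_normal(days.size)

# Detrend, then locate the dominant period in the amplitude spectrum.
resid = usage - trend
spectrum = np.abs(np.fft.rfft(resid))
freqs = np.fft.rfftfreq(days.size, d=1.0)        # cycles per day
dominant_period = 1.0 / freqs[1:][np.argmax(spectrum[1:])]
```

The spectral peak lands at a period of about 7 days, recovering the weekly cycle; applied to real logs, the same procedure is what exposes periodicity in the usage data.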

Relevance: 30.00%

Abstract:

The Scotia Sea has been a focus of biological- and physical oceanographic study since the Discovery expeditions in the early 1900s. It is a physically energetic region with some of the highest levels of productivity in the Southern Ocean. It is also a region within which there have been greater than average levels of change in upper water column temperature. We describe the results of three cruises transecting the central Scotia Sea from south to north in consecutive years and covering spring, summer and autumn periods. We also report on some community level syntheses using both current-day and historical data from this region. A wide range of parameters were measured during the field campaigns, covering the physical oceanography of the region, air–sea CO2 fluxes, macro- and micronutrient concentrations, the composition and biomass of the nano-, micro- and mesoplankton communities, and the distribution and biomass of Antarctic krill and mesopelagic fish. Process studies examined the effect of iron-stress on the physiology of primary producers, reproduction and egestion in Antarctic krill and the transfer of stable isotopes between trophic layers, from primary consumers up to birds and seals. Community level syntheses included an examination of the biomass-spectra, food-web modelling, spatial analysis of multiple trophic layers and historical species distributions. The spatial analyses in particular identified two distinct community types: a northern warmer water community and a southern cold community, their boundary being broadly consistent with the position of the Southern Antarctic Circumpolar Current Front (SACCF). Temperature and ice cover appeared to be the dominant, over-riding factors in driving this pattern. Extensive phytoplankton blooms were a major feature of the surveys, and were persistent in areas such as South Georgia. In situ and bioassay measurements emphasised the important role of iron inputs as facilitators of these blooms. 
Based on seasonal DIC deficits, the South Georgia bloom was found to contain the strongest seasonal carbon uptake in the ice-free zone of the Southern Ocean. The surveys also encountered low-production, iron-limited regions, a situation more typical of the wider Southern Ocean. The response of primary and secondary consumers to spatial and temporal heterogeneity in production was complex. Many of the life-cycles of small pelagic organisms showed a close coupling to the seasonal cycle of food availability. For instance, Antarctic krill showed a dependence on early, non-ice-associated blooms to facilitate early reproduction. Strategies to buffer against environmental variability were also examined, such as the prevalence of multiyear life-cycles and variability in energy storage levels. Such traits were seen to influence the way in which Scotia Sea communities were structured, with biomass levels in the larger size classes being higher than in other ocean regions. Seasonal development also altered trophic function, with the trophic level of higher predators increasing through the course of the year as additional predator-prey interactions emerged in the lower trophic levels. Finally, our studies re-emphasised the role that the simple phytoplankton-krill-higher predator food chain plays in this Southern Ocean region, particularly south of the SACCF. To the north, alternative food chains, such as those involving copepods, macrozooplankton and mesopelagic fish, were increasingly important. Continued ocean warming in this region is likely to increase the prevalence of such alternative food chains, with Antarctic krill predicted to move southwards.