28 results for Tilted-time window model

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

Ultra-high power (exceeding the self-focusing threshold by more than three orders of magnitude) light beams from ground-based laser systems may find applications in space-debris cleaning. The propagation of such powerful laser beams through the atmosphere reveals many novel and interesting features compared to traditional light self-focusing. It is demonstrated here that for the relevant laser parameters, when the thickness of the atmosphere is much shorter than the focusing length (which is of the orbit scale), the beam's transit through the atmosphere produces, to lowest order, only phase distortion. This means that by using adaptive optics it may be possible to eliminate the impact of atmospheric self-focusing on the laser beam. The area of applicability of the proposed "thin window" model is broader than the specific physical problem considered here. For instance, it might find applications in femtosecond laser material processing.
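The key quantitative statement is that a window much thinner than the self-focusing length imprints, to lowest order, only an intensity-dependent phase (the B-integral) on the beam. Below is a minimal sketch of that estimate for a Gaussian beam; all parameter values (wavelength, power, beam radius, Kerr index, atmosphere thickness) are illustrative assumptions, not figures from the paper.

```python
import numpy as np

k0 = 2 * np.pi / 1.06e-6   # vacuum wavenumber for a 1.06 um laser [1/m]
n2 = 3e-23                 # Kerr index of air [m^2/W], order of magnitude
L_atm = 1e4                # effective atmosphere thickness [m] (assumed)
w0 = 0.5                   # beam radius [m] (assumed)
P = 1e12                   # beam power [W], far above the critical power

r = np.linspace(0, 2 * w0, 256)
I = (2 * P / (np.pi * w0**2)) * np.exp(-2 * (r / w0) ** 2)  # Gaussian profile

# Accumulated nonlinear (B-integral) phase across the beam profile:
#   phi_NL(r) = k0 * n2 * I(r) * L_atm
phi_nl = k0 * n2 * I * L_atm
print(f"peak nonlinear phase: {phi_nl[0]:.2f} rad")

# An adaptive-optics element applying -phi_nl would cancel this
# lowest-order distortion, which is the point made in the abstract.
```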

Relevance:

100.00%

Publisher:

Abstract:

Signal integration determines cell fate on the cellular level, affects cognitive processes and affective responses on the behavioural level, and is likely to be involved in psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subject to time effects. Time-dependencies of interactions between stimuli typically lead to complex cell responses and complex responses on the behavioural level. We show that both three-factor models and time series models can be used to uncover such time-dependencies. However, we argue that for short longitudinal data the three-factor modelling approach is more suitable. In order to illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney 293 (HEK293) cells the interaction effect in the regulation of extracellular signal-regulated kinase (ERK) 1 signalling activation by insulin and epidermal growth factor is subject to a time effect and decays dramatically at peak values of ERK activation. In contrast, we found that the interaction effect induced by hypoxia and tumour necrosis factor-alpha on the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time-invariant, at least in the first 12-h time window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies. In these studies, memory storage was found to be subject to an interaction effect of the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor / glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered within a critical time window.
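As an illustration of the three-factor idea (stimulus A, stimulus B, and time as a third factor, with a significant A:B:time term indicating that the stimulus interaction changes over the observation window), here is a minimal sketch on synthetic data. The variable names, effect sizes, and decay constant are invented, and the ordinary least-squares model is a stand-in for the authors' formulation, not their actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for t in (5, 15, 30, 60):                 # minutes after stimulation
    for a in (0, 1):                      # stimulus A absent/present
        for b in (0, 1):                  # stimulus B absent/present
            for _ in range(4):            # replicates
                # In this toy model the A*B interaction decays with time.
                y = (1.0 + 0.5 * a + 0.5 * b
                     + 1.5 * a * b * np.exp(-t / 20)
                     + rng.normal(scale=0.2))
                rows.append(dict(time=t, A=a, B=b, response=y))
df = pd.DataFrame(rows)

# Three-factor model: main effects plus all two- and three-way interactions.
fit = smf.ols("response ~ A * B * C(time)", data=df).fit()
print(fit.summary().tables[1])   # inspect the A:B:C(time) coefficients
```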

Relevance:

100.00%

Publisher:

Abstract:

When people monitor a visual stream of rapidly presented stimuli for two targets (T1 and T2), they often miss T2 if it falls into a time window of about half a second after T1 onset—the attentional blink (AB). We provide an overview of recent neuroscientific studies devoted to analysing the neural processes underlying the AB and their temporal dynamics. The available evidence points to an attentional network involving temporal, right-parietal and frontal cortex, and suggests that the components of this neural network interact by means of synchronization and stimulus-induced desynchronization in the beta frequency range. We set up a neurocognitive scenario describing how the AB might emerge and why it depends on the presence of masks and the other events the targets are embedded in. The scenario supports the idea that the AB arises from "biased competition", with the top-down bias being generated by parietal-frontal interactions and the competition taking place between stimulus codes in temporal cortex.

Relevance:

100.00%

Publisher:

Abstract:

This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3-dimensional model). The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well-known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system states and model parameters) and full weak-constraint 4D-Var. An empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is also provided.
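The Ornstein-Uhlenbeck benchmark is special because its transition density is Gaussian and known in closed form, which is what makes the exact likelihood computable for comparison against approximate (e.g. variational) schemes. A minimal sketch of that closed form (our own illustration, not the thesis code; parameter values are arbitrary):

```python
import numpy as np

def simulate_ou(theta, sigma, x0, dt, n, rng):
    """Exact discretisation of dx = -theta*x dt + sigma dW."""
    x = np.empty(n)
    x[0] = x0
    a = np.exp(-theta * dt)
    s = sigma * np.sqrt((1 - a**2) / (2 * theta))
    for i in range(1, n):
        x[i] = a * x[i - 1] + s * rng.normal()
    return x

def ou_loglik(theta, sigma, x, dt):
    """Exact log-likelihood of an OU path observed at spacing dt,
    using the Gaussian transition x[k+1] ~ N(a*x[k], var)."""
    a = np.exp(-theta * dt)
    var = sigma**2 * (1 - a**2) / (2 * theta)
    resid = x[1:] - a * x[:-1]
    return -0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

rng = np.random.default_rng(1)
path = simulate_ou(theta=2.0, sigma=1.0, x0=0.0, dt=0.01, n=5000, rng=rng)
print(ou_loglik(2.0, 1.0, path, 0.01))   # likelihood at the true parameters
```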

Relevance:

100.00%

Publisher:

Abstract:

This work is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variation of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here a new extended framework is derived that is based on a local polynomial approximation of a recently proposed variational Bayesian algorithm. The paper begins by showing that the new extension of this variational algorithm can be used for state estimation (smoothing) and converges to the original algorithm. However, the main focus is on estimating the (hyper-) parameters of these systems (i.e. drift parameters and diffusion coefficients). The new approach is validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein–Uhlenbeck process, the exact likelihood of which can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3D model). As a special case the algorithm is also applied to the 40-dimensional stochastic Lorenz '96 system. In our investigation we compare this new approach with a variety of other well-known methods, such as the hybrid Monte Carlo sampler, the dual unscented Kalman filter and the full weak-constraint 4D-Var algorithm, and analyse empirically their asymptotic behaviour as the observation density or the length of the time window increases. In particular we show that we are able to estimate parameters in both the drift (deterministic) and the diffusion (stochastic) parts of the model evolution equations using our new methods.
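To make concrete what estimating both drift and diffusion parameters means, here is a minimal sketch for the one benchmark with a closed-form answer: the Ornstein-Uhlenbeck process has an exact AR(1) representation, x[k+1] = a·x[k] + eps with a = exp(-theta·dt) and Var(eps) = sigma²(1 - a²)/(2·theta), from which both parameters can be recovered directly. For the nonlinear systems in the paper no such formula exists, which is where the variational approximation comes in. This is our own illustration, not the authors' method.

```python
import numpy as np

def estimate_ou(x, dt):
    """Recover (theta, sigma) of an OU process from discrete samples
    via its exact AR(1) representation."""
    a_hat = np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])  # lag-1 regression
    theta_hat = -np.log(a_hat) / dt                         # drift
    resid_var = np.var(x[1:] - a_hat * x[:-1])
    sigma_hat = np.sqrt(2 * theta_hat * resid_var / (1 - a_hat**2))  # diffusion
    return theta_hat, sigma_hat

# Generate a test path with the exact discretisation, then estimate.
rng = np.random.default_rng(2)
dt, n, theta, sigma = 0.01, 20000, 2.0, 1.0
a = np.exp(-theta * dt)
s = sigma * np.sqrt((1 - a**2) / (2 * theta))
x = np.zeros(n)
for i in range(1, n):
    x[i] = a * x[i - 1] + s * rng.normal()
print(estimate_ou(x, dt))   # should be close to (2.0, 1.0)
```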

Relevance:

100.00%

Publisher:

Abstract:

The generation of very short range forecasts of precipitation in the 0-6 h time window is traditionally referred to as nowcasting. Most existing nowcasting systems essentially extrapolate radar observations in some manner; however, very few systems account for the uncertainties involved. Thus deterministic forecasts are produced, which have limited use when decisions must be made, since they carry no measure of confidence or spread. This paper develops a Bayesian state space modelling framework for quantitative precipitation nowcasting which is probabilistic from conception. The model treats the observations (radar) as noisy realisations of the underlying true precipitation process, recognising that this process can never be completely known and must therefore be represented probabilistically. In the model presented here the dynamics of the precipitation are dominated by advection, so this is a probabilistic extrapolation forecast. The model is designed in such a way as to minimise the computational burden, while maintaining a full, joint representation of the probability density function of the precipitation process. The update and evolution equations avoid the need to sample, so only one model need be run, as opposed to the more traditional ensemble route. It is shown that the model works well on both simulated and real data, but that further work is required before the model can be used operationally. © 2004 Elsevier B.V. All rights reserved.
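A minimal one-dimensional sketch of that construction: a latent precipitation field evolved by pure advection, observed with noise, with the full Gaussian mean-and-covariance description updated in closed form so that no ensemble or sampling is needed. The grid size, advection operator, and noise levels below are invented for illustration and are far simpler than the paper's model.

```python
import numpy as np

n = 64
M = np.roll(np.eye(n), 1, axis=0)    # advection: shift field one cell (periodic)
Q = 0.05 * np.eye(n)                 # process noise (model error), assumed
R = 0.20 * np.eye(n)                 # radar observation noise, assumed

mean, cov = np.zeros(n), np.eye(n)   # Gaussian description of the field

def step(mean, cov, obs):
    # Forecast: advect the mean, propagate the covariance.
    mean_f = M @ mean
    cov_f = M @ cov @ M.T + Q
    # Update: standard Kalman correction with the radar image `obs`.
    K = cov_f @ np.linalg.inv(cov_f + R)
    mean_a = mean_f + K @ (obs - mean_f)
    cov_a = (np.eye(n) - K) @ cov_f
    return mean_a, cov_a

rng = np.random.default_rng(3)
truth = np.exp(-0.5 * ((np.arange(n) - 20) / 4.0) ** 2)  # a rain "blob"
for _ in range(10):
    truth = M @ truth                          # blob drifts downstream
    obs = truth + rng.normal(scale=0.2, size=n)
    mean, cov = step(mean, cov, obs)
# The diagonal of cov is the forecast spread: the probabilistic output
# that a deterministic extrapolation scheme cannot provide.
print("max posterior std:", np.sqrt(np.diag(cov)).max().round(3))
```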

Relevance:

100.00%

Publisher:

Abstract:

In construction projects, the aim of project control is to ensure that projects finish on time, within budget, and achieve other project objectives. During the last few decades, numerous project control methods have been developed and adopted by project managers in practice. However, many existing methods focus on describing what the processes and tasks of project control are, not on how these tasks should be conducted. There is also a potential gap between the principles that underlie these methods and project control practice. As a result, time and cost overruns are still common in construction projects, partly attributable to deficiencies of existing project control methods and difficulties in implementing them. This paper describes a new project cost and time control model, the project control and inhibiting factors management (PCIM) model, developed through a study involving extensive interaction with construction practitioners in the UK, which better reflects the real needs of project managers. A good practice checklist is also developed to facilitate implementation of the model. © 2013 American Society of Civil Engineers.

Relevance:

100.00%

Publisher:

Abstract:

Self-adaptation enables software systems to respond to changing environmental contexts that may not be fully understood at design time. Designing a dynamically adaptive system (DAS) to cope with this uncertainty is challenging, as it is impractical during requirements analysis and design time to anticipate every environmental condition that the DAS may encounter. Previously, the RELAX language was proposed to make requirements more tolerant to environmental uncertainty, and Claims were applied as markers of uncertainty that document how design assumptions affect goals. This paper integrates these two techniques in order to assess the validity of Claims at run time while tolerating minor and unanticipated environmental conditions that can trigger adaptations. We apply the proposed approach to the dynamic reconfiguration of a remote data mirroring network that must diffuse data while minimizing costs and exposure to data loss. Results show that RELAXing Claims enables a DAS to reduce adaptation costs. © 2012 Springer-Verlag.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To examine the optimum time at which fluorescein patterns of gas permeable lenses (GPs) should be evaluated. METHODS: Aligned, 0.2 mm steep and 0.2 mm flat GPs were fitted to 17 patients (aged 20.6±1.1 years, 10 male). Fluorescein was applied to the upper temporal bulbar conjunctiva with a moistened fluorescein strip. Digital slit lamp images (CSO, Italy) at 10× magnification of the fluorescein pattern, viewed with blue light through a yellow filter, were captured every 15 s. Fluorescein intensity in the central, mid-peripheral and edge regions of the superior, inferior, temporal and nasal quadrants of the lens was graded subjectively using a +2 to -2 scale, and objectively using ImageJ software on the simultaneously captured images. RESULTS: Subjectively graded and objectively image-analysed fluorescein intensity changed with time (p<0.001) and lens region (centre, mid-periphery and edge: p<0.05), and there was an interaction between lens region and lens fit (p<0.001). For edge band width, there was a significant effect of time (F=118.503, p<0.001) and lens fit (F=5.1249, p=0.012). The expected alignment, flat and steep fitting patterns could be seen from approximately 30 to 180 s subjectively and 15 to 105 s in captured images. CONCLUSION: Although the stability of fluorescein intensity can start to decline in as little as 45 s post fluorescein instillation, the diagnostic pattern of alignment, steep or flat fit is seen in each meridian by subjective observation from about 30 s to 3 min, indicating this is the most appropriate time window to evaluate GP lenses in clinical practice.

Relevance:

100.00%

Publisher:

Abstract:

We examine how the most prevalent stochastic properties of key financial time series have been affected during the recent financial crises. In particular we focus on changes associated with the remarkable economic events of the last two decades in the volatility dynamics, including the underlying volatility persistence and volatility spillover structure. Using daily data from several key stock market indices, the results of our bivariate GARCH models show the existence of time-varying correlations as well as time-varying shock and volatility spillovers between the returns of FTSE and DAX, and those of NIKKEI and Hang Seng, which became more prominent during the recent financial crisis. Our theoretical considerations on the time-varying model, which provides the platform upon which we integrate our multifaceted empirical approaches, are also of independent interest. In particular, we provide the general solution for time-varying asymmetric GARCH specifications, which is a long-standing research topic. This enables us to characterize these models by deriving, first, their multistep-ahead predictors, second, the first two time-varying unconditional moments, and third, their covariance structure.
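For the constant-parameter GARCH(1,1) special case (a simplification of the paper's time-varying, asymmetric, bivariate setting), the multistep-ahead variance predictor has a well-known closed form, sketched below with illustrative parameter values: with sigma2[t] = omega + alpha*eps2[t-1] + beta*sigma2[t-1], the k-step forecast relaxes geometrically, at rate (alpha + beta), toward the unconditional variance omega/(1 - alpha - beta).

```python
import numpy as np

omega, alpha, beta = 0.05, 0.08, 0.90   # illustrative GARCH(1,1) parameters

def multistep_variance(sigma2_next, k):
    """k-step-ahead conditional variance given the 1-step forecast."""
    phi = alpha + beta
    uncond = omega / (1 - phi)           # unconditional variance (= 2.5 here)
    return uncond + phi**(k - 1) * (sigma2_next - uncond)

sigma2_next = 0.5    # hypothetical 1-step-ahead variance after a calm spell
for k in (1, 5, 20, 100):
    print(k, round(multistep_variance(sigma2_next, k), 4))
# As k grows the forecast converges to omega / (1 - alpha - beta).
```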

Relevance:

100.00%

Publisher:

Abstract:

The present study investigates the feasibility of using two types of carbomer (971 and 974) to prepare inhalable dry powders that exhibit modified drug release properties. Powders were prepared by spray-drying formulations containing salbutamol sulphate, 20-50% w/w carbomer as a drug release modifier and leucine as an aerosolization enhancer. Following physical characterization of the powders, the aerosolization and dissolution properties of the powders were investigated using a Multi-Stage Liquid Impinger and a modified USP II dissolution apparatus, respectively. All carbomer 974-modified powders and the 20% carbomer 971 powder demonstrated high dispersibility, with emitted doses of at least 80% and fine particle fractions of approximately 40%. The release data indicated that all carbomer-modified powders displayed a sustained release profile, with carbomer 971-modified powders obeying first-order kinetics, whereas carbomer 974-modified powders obeyed the Higuchi root time kinetic model; increasing the amount of carbomer 971 in the formulation did not extend the duration of drug release, whereas this was observed for the carbomer 974-modified powders. These powders would be anticipated to deposit predominantly in the lower regions of the lung following inhalation and then undergo delayed rather than instantaneous drug release, offering the potential to reduce dosing frequency and improve patient compliance.
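The two release laws named above have simple closed forms: first-order kinetics, Q(t) = Qmax(1 - exp(-k1*t)), and the Higuchi root-time model, Q(t) = kH*sqrt(t). A minimal sketch of discriminating between them on a dissolution profile; the data points and parameter guesses below are invented, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)   # minutes (toy data)
q = np.array([12, 22, 38, 58, 70, 82, 88], dtype=float)   # % drug released

first_order = lambda t, qmax, k1: qmax * (1 - np.exp(-k1 * t))
higuchi = lambda t, kh: kh * np.sqrt(t)

p_fo, _ = curve_fit(first_order, t, q, p0=(100.0, 0.02))
p_h, _ = curve_fit(higuchi, t, q, p0=(8.0,))

# Compare residual sums of squares to see which law fits better.
sse = lambda f, p: np.sum((q - f(t, *p)) ** 2)
print("first-order SSE:", round(sse(first_order, p_fo), 1))
print("Higuchi SSE:    ", round(sse(higuchi, p_h), 1))
```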

Relevance:

100.00%

Publisher:

Abstract:

An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50-100 s of data is required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow band signals as frequency decreases. For instance, approximately 3 times as much data is necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
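The dependence of the data requirement on autocorrelation can be made concrete with Bartlett's classical approximation for the variance of a sample cross-correlation between two independent autocorrelated series, Var(r) ≈ (1/N) Σ_τ ρx(τ)ρy(τ). The sketch below is our illustrative stand-in; the paper's own analytical expressions may differ in detail.

```python
import numpy as np

def n_required(rho, r_true=0.05, z=1.96):
    """Samples needed for a true cross-correlation r_true to exceed
    the z*SE significance threshold, when both signals have AR(1)
    autocorrelation rho**|tau| (Bartlett approximation)."""
    taus = np.arange(-200, 201)
    inflation = np.sum(rho ** (2 * np.abs(taus)))  # variance inflation factor
    return int(np.ceil(inflation * (z / r_true) ** 2))

for rho in (0.0, 0.5, 0.9):       # increasingly autocorrelated signals
    print(f"rho={rho}: ~{n_required(rho)} samples")
# rho=0 recovers the white-noise answer (~1537 samples); stronger
# autocorrelation multiplies the requirement, mirroring the finding
# that narrow-band, low-frequency signals need longer recordings.
```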

Relevance:

100.00%

Publisher:

Abstract:

This work reports the development of a mathematical model and distributed, multivariable computer control for a pilot-plant double-effect climbing-film evaporator. A distributed-parameter model of the plant has been developed and the time-domain model transformed into the Laplace domain. The model has been further transformed into an integral domain conforming to an algebraic ring of polynomials, to eliminate the transcendental terms which arise in the Laplace domain due to the distributed nature of the plant model. This has made possible the application of linear control theories to a set of linear partial differential equations. The models obtained track the experimental results of the plant well. A distributed computer network has been interfaced with the plant to implement digital controllers in a hierarchical structure. A modern multivariable Wiener-Hopf controller has been applied to the plant model. The application has revealed a limiting condition: the plant matrix should be positive-definite along the infinite frequency axis. A new multivariable control theory has emerged from this study which avoids the above limitation. The controller has the structure of the modern Wiener-Hopf controller, but with a unique feature enabling a designer to specify the closed-loop poles in advance and to shape the sensitivity matrix as required. In this way, the method treats directly the interaction problems found in chemical processes, with good tracking and regulation performance. The ability of analytical design methods to determine once and for all whether a given set of specifications can be met is one of their chief advantages over conventional trial-and-error design procedures. However, one disadvantage that offsets these advantages to some degree is the relatively complicated algebra that must be employed in working out all but the simplest problems. Mathematical algorithms and computer software have been developed to treat some of the mathematical operations defined over the integral domain, such as matrix fraction description, spectral factorization, the Bezout identity, and the general manipulation of polynomial matrices. Hence, the design problems of Wiener-Hopf-type controllers and other similar algebraic design methods can be solved with relative ease.
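Among the operations listed, spectral factorization is perhaps the easiest to illustrate. Below is a minimal scalar sketch (the thesis works with polynomial matrices over the integral domain; the scalar case only shows the idea): given the symmetric coefficients of a Laurent polynomial Phi(z) with Phi > 0 on the unit circle, split the roots into stable and unstable halves to obtain c(z) with Phi(z) = c(z)c(1/z).

```python
import numpy as np

def spectral_factor(full):
    """full: coefficients of z^m * Phi(z), highest power first,
    symmetric about the middle entry phi_0. Returns c (all roots
    inside |z| = 1) such that np.convolve(c, c[::-1]) ~ full."""
    roots = np.roots(full)
    c = np.real(np.poly(roots[np.abs(roots) < 1.0]))      # stable half, monic
    c *= np.sqrt(full[len(full) // 2] / np.sum(c ** 2))   # rescale to match phi_0
    return c

c_true = np.array([1.0, -0.6, 0.25])        # roots 0.3 +/- 0.4i, inside |z|=1
full = np.convolve(c_true, c_true[::-1])    # build Phi from a known factor
print(spectral_factor(full))                # recovers c_true (up to sign)
```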

Relevance:

100.00%

Publisher:

Abstract:

The recognition of faces and of facial expressions is an important evolutionary skill, and an integral part of social communication. It has been argued that the processing of faces is distinct from the processing of non-face stimuli, and functional neuroimaging investigations have even found evidence of a distinction between the perception of faces and that of emotional expressions. Structural and temporal correlates of face perception and facial affect have so far only been identified separately. Investigating the neural dynamics of face perception per se, as well as of facial affect, would allow these processes to be mapped in space-, time- and frequency-specific domains. Participants were asked to perform face categorisation and emotional discrimination tasks, and magnetoencephalography (MEG) was used to measure the neurophysiology of face and facial emotion processing. SAM analysis techniques enable the investigation of spectral changes within specific time windows and frequency bands, thus allowing the identification of stimulus-specific regions of cortical power changes. Furthermore, MEG's excellent temporal resolution allows for the detection of subtle changes associated with the processing of face and non-face stimuli and different emotional expressions. The data presented reveal that face perception is associated with spectral power changes within a distributed cortical network comprising occipito-temporal as well as parietal and frontal areas. For the perception of facial affect, spectral power changes were also observed within frontal and limbic areas, including the parahippocampal gyrus and the amygdala. Analyses of temporal correlates also reveal a distinction between the processing of faces and facial affect. Face perception per se occurred at earlier latencies, whereas the discrimination of facial expression occurred within a longer time window. In addition, the processing of faces and facial affect was differentially associated with changes in cortical oscillatory power for alpha, beta and gamma frequencies. The perception of faces and facial affect is thus associated with distinct changes in cortical oscillatory activity that can be mapped to specific neural structures, time windows and latencies, and frequency bands. The work presented in this thesis therefore provides further insight into the sequential processing of faces and facial affect.

Relevance:

100.00%

Publisher:

Abstract:

The thesis examines and explains the development of occupational exposure limits (OELs) as a means of preventing work-related disease and ill health. The research focuses on the USA and UK and sets the work within a certain historical and social context. A subsidiary aim of the thesis is to identify any shortcomings in OELs and the methods by which they are set, and to suggest alternatives. The research framework uses Thomas Kuhn's idea of science progressing by means of paradigms, which he describes at one point as "... universally recognised scientific achievements that for a time provide model problems and solutions to a community of practitioners" (Kuhn, 1970). Once these are learned, individuals in the community "... are committed to the same rules and standards for scientific practice" (ibid.). Kuhn's ideas are adapted by combining them with a view of industrial hygiene as an applied science-based profession having many of the qualities of non-scientific professions. The great advantage of this approach to OELs is that it keeps the analysis grounded in the behaviour and priorities of the groups which have forged, propounded, used, benefited from, and defended them. The development and use of OELs on a larger scale is shown to be connected to the growth of a new profession in the USA, industrial hygiene, with the assistance of another new profession, industrial toxicology. The origins of these professions, particularly industrial hygiene, are traced. By examining the growth of the professions and the writings of key individuals it is possible to show how technical, economic and social factors became embedded in the OEL paradigm which industrial hygienists and toxicologists forged. The origin, mission and needs of these professions and their clients made such influences almost inevitable. The use of the OEL paradigm in practice is examined through an analysis of the process of the American Conference of Governmental Industrial Hygienists' Threshold Limit Value (ACGIH TLV) Committee via its minutes from 1962-1984. A similar approach is taken with the development of OELs in the UK. Although the form and definition of TLVs have encouraged the belief that they are health-based OELs, the conclusion is that they, and most other OELs, are, and always have been, reasonably practicable limits: the degree of risk posed by a substance is weighed against the feasibility and cost of controlling exposure to that substance. The confusion over the status of TLVs and other OELs is seen to be a confusion at the heart of the OEL paradigm, and the historical perspective explains why this should be. The paradigm has prevented the creation of truly health-based and, conversely, truly reasonably practicable OELs. In the final part of the thesis the analysis of the development of OELs is set in a contemporary context, and a proposal for a two-stage, two-committee procedure for producing sets of OELs is put forward. This approach is set within an alternative OEL paradigm. The advantages, benefits and likely obstacles of these proposals are discussed.