12 results for digital time with memory
in Aston University Research Archive
Abstract:
We introduce a discrete-time fibre channel model that provides an accurate analytical description of signal-signal and signal-noise interference, with memory defined by the interplay of nonlinearity and dispersion. We also derive the conditional pdf of signal distortion, which captures non-circular complex multivariate symbol interactions, providing the necessary platform for the analysis of channel statistics and for capacity estimation in fibre-optic links.
Abstract:
This paper describes how dimensional variation management could be integrated throughout design, manufacture and verification to improve quality while reducing cycle times and manufacturing cost in the Digital Factory environment. Initially, variation analysis is used to optimize tolerances during product and tooling design; it also results in the creation of a simplified representation of product key characteristics. This simplified representation can then be used to carry out measurability analysis and process simulation. The link established between the variation analysis model and the measurement processes can subsequently be used throughout the production process to automatically update the variation analysis model in real time with measurement data. This ‘live’ simulation of variation during manufacture will allow early detection of quality issues and facilitate autonomous measurement-assisted processes such as predictive shimming. A study is described showing how these principles can be demonstrated using commercially available software combined with a number of prototype applications operating as discrete modules. The commercially available modules include Catia/Delmia for product and process design, 3DCS for variation analysis and Spatial Analyzer for measurement simulation. Prototype modules are used to carry out measurability analysis and instrument selection. Realizing the full potential of metrology in the Digital Factory will require that these modules are integrated, and a software architecture to facilitate this is described. Crucially, this integration must facilitate the use of real-time metrology data describing the emerging assembly to update the digital model.
Abstract:
Exploratory analysis of data in all sciences seeks to find common patterns to gain insights into the structure and distribution of the data. Typically, visualisation methods such as principal component analysis are used, but these methods cannot easily deal with missing data, nor can they capture non-linear structure in the data. One approach to discovering complex, non-linear structure is through the use of linked plots, or brushing, while ignoring the missing data. In this technical report we discuss a complementary approach based on a non-linear probabilistic model. The generative topographic mapping enables the visualisation of the effects of very many variables on a single plot, which can incorporate far more structure than a two-dimensional principal component plot and, at the same time, deal with missing data. We show that the generative topographic mapping provides an optimal method to explore the data while being able to replace missing values in a dataset, particularly where a large proportion of the data is missing.
Abstract:
A three-node optical time-division multiplexing (OTDM) network is demonstrated that utilizes electroabsorption (EA) modulators as the core elements. Each node is self-contained and performs its own clock recovery and synchronization. “Drop and insert” functionality is demonstrated for the first time with an EA modulator by completely removing a 10-Gb/s channel from a 40-Gb/s OTDM data stream. A different 10-Gb/s channel was subsequently inserted into the vacant time slot. Clock recovery is achieved by using an EA modulator in a novel bidirectional configuration. Bit-error-rate (BER) measurements are presented for each of the 10-Gb/s OTDM channels.
Abstract:
Underwater sensor networks (UWSNs) have recently attracted substantial research interest. Medium access control (MAC) is one of the major challenges faced by UWSNs due to the large propagation delay and narrow channel bandwidth of the acoustic communications they use. The widely used slotted Aloha (S-Aloha) protocol suffers a large performance loss in UWSNs, achieving performance only close to that of pure Aloha (P-Aloha). In this paper we theoretically model the performance of the S-Aloha and P-Aloha protocols and analyze the adverse impact of propagation delay. Based on observations of S-Aloha's performance, we propose two enhanced S-Aloha protocols to minimize the adverse impact of propagation delay. The first enhancement is a synchronized-arrival S-Aloha (SA-Aloha) protocol, in which frames are transmitted at carefully calculated times so that frame arrivals align with the start of the time slots; propagation delay is taken into account when calculating the transmit time. As errors in the propagation-delay estimate may exist and can affect network performance, an improved SA-Aloha (denoted ISA-Aloha) is proposed, which adjusts the slot size according to the range of delay-estimation errors. Simulation results show that both SA-Aloha and ISA-Aloha perform remarkably better than S-Aloha and P-Aloha for UWSNs, and that ISA-Aloha is more robust even when the propagation-delay estimation error is large. © 2011 IEEE.
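The core idea of the two enhancements described in this abstract can be sketched in a few lines; the function and parameter names below are illustrative assumptions, not the authors' implementation:

```python
# Illustrative sketch of SA-Aloha / ISA-Aloha timing (names are assumptions).

def sa_aloha_transmit_time(slot_start, est_propagation_delay):
    """SA-Aloha: transmit early by the estimated propagation delay so the
    frame arrives at the receiver aligned with the start of the slot."""
    return slot_start - est_propagation_delay

def isa_aloha_slot_size(frame_duration, max_delay_error):
    """ISA-Aloha: widen each slot by a guard interval that covers the
    worst-case delay-estimation error on both sides of the frame."""
    return frame_duration + 2 * max_delay_error

# Example: an acoustic link with ~0.3 s estimated propagation delay and a
# possible +/-0.05 s error in that estimate.
t_tx = sa_aloha_transmit_time(slot_start=1.0, est_propagation_delay=0.3)
slot = isa_aloha_slot_size(frame_duration=0.5, max_delay_error=0.05)
```

The guard interval is what trades channel utilisation for robustness: the larger the delay-estimation error, the wider the slot ISA-Aloha must use.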
Abstract:
Judith Hermann's works have attracted considerable criticism for their supposedly slight portrayal of passively drifting characters and for their alleged failure to engage with the socio-political realities of contemporary life in the Berlin Republic. Only very recently have scholars paid attention to the hidden concern with memory expressed in her books and set out to examine their intertextual depth. This paper explores these previously neglected historical references in Summerhouse, later and analyses the book's intricate intertextual allusions with specific reference to Theodor Fontane's works. It examines how the tentative existence Hermann's characters experience is the product of a hesitant and fruitless confrontation with questions of German history and nationhood. Using pervasive water imagery, Hermann shows present-day Germany as a continually contested territory with a fluid identity shaped by an abundance of conflicting narratives. In this context, the allusions to Fontane as a representative of the Wilhelminian period serve as references to a continuing German tradition of repression and marginalisation. At the same time, Hermann recognises Fontane's ambivalent political stance, combining elements of social criticism with a general endorsement of social order. Ultimately, the seemingly indifferent attitude of Hermann's characters and the elegiac style used to portray them emerge as a distancing mechanism that functions as a postmodern variant of Fontane's irony and is shaped by a similar sense of skepticism towards developments in German society and national history. © 2012 Springer Science+Business Media B.V.
Abstract:
Uncertainty can be defined as the difference between the information represented in an executing system and the information that is both measurable and available about the system at a certain point in its lifetime. A software system can be exposed to multiple sources of uncertainty produced by, for example, ambiguous requirements and unpredictable execution environments. A runtime model is a dynamic knowledge base that abstracts useful information about the system, its operational context and the extent to which the system meets its stakeholders' needs. A software system can successfully operate in multiple dynamic contexts by using runtime models that augment information available at design time with information monitored at runtime. This chapter explores the role of runtime models as a means to cope with uncertainty. To this end, we introduce well-suited terminology for models, runtime models and uncertainty, and present a state-of-the-art summary of model-based techniques for addressing uncertainty at both development time and runtime. Using a case study about robot systems, we discuss how current techniques and the MAPE-K loop can be used together to tackle uncertainty. Furthermore, we propose possible extensions of the MAPE-K loop architecture with runtime models to further handle uncertainty at runtime. The chapter concludes by identifying key challenges and enabling technologies for using runtime models to address uncertainty, and also identifies closely related research communities that can foster ideas for resolving the challenges raised. © 2014 Springer International Publishing.
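The MAPE-K loop mentioned in this abstract can be sketched minimally as four phases sharing a runtime model as the Knowledge base. The names and the battery scenario below are illustrative assumptions, not the chapter's actual case study:

```python
# Minimal MAPE-K sketch: Monitor, Analyze, Plan and Execute phases share a
# runtime model (the Knowledge base). All names here are illustrative.

class RuntimeModel:
    """Knowledge: abstracts monitored facts about the system and its context."""
    def __init__(self):
        self.facts = {"battery": 100}      # design-time default

def monitor(model, sensed):
    model.facts.update(sensed)             # augment with runtime observations

def analyze(model):
    return model.facts["battery"] < 20     # symptom: battery critically low

def plan(symptom):
    return ["return_to_dock"] if symptom else []

def execute(actions, actuator):
    for action in actions:
        actuator(action)

# One loop iteration for a hypothetical robot:
model = RuntimeModel()
monitor(model, {"battery": 15})
executed = []
execute(plan(analyze(model)), executed.append)
```

The point of the sketch is the data flow: uncertainty enters through what is (or is not) monitored, and the runtime model is the single place where design-time assumptions are reconciled with runtime observations.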
Abstract:
Purpose. The purpose of this study was to evaluate the longitudinal changes in ocular physiology, tear film characteristics, and symptomatology experienced by neophyte silicone hydrogel (SiH) contact lens wearers in a daily-wear compared with a continuous-wear modality and with the different commercially available lenses over an 18-month period. Methods. Forty-five neophyte subjects were enrolled in the study and randomly assigned to wear one of two SiH materials: lotrafilcon A or balafilcon A lenses on either a daily- (LDW; BDW) or continuous-wear (LCW; BCW) basis. Additionally, a group of noncontact lens-wearing subjects (control group) was also recruited and followed over the same study period. Objective and subjective grading of ocular physiology were carried out together with tear meniscus height (TMH) and noninvasive tear breakup time (NITBUT). Subjects also subjectively rated symptoms and judgments with lens wear. After initial screening, subsequent measurements were taken after 1, 3, 6, 12, and 18 months. Results. Subjective and objective grading of ocular physiology revealed a small increase in bulbar, limbal, and palpebral hyperemia as well as corneal staining over time with both lens materials and regimes of wear (p < 0.05). No significant changes in NITBUT or TMH were found (p > 0.05). Subjective symptoms and judgment were not material- or modality-specific. Conclusions. Daily and continuous wear of SiH contact lenses induced small but statistically significant changes in ocular physiology and symptomatology. Clinical measures of tear film characteristics were unaffected by lens wear. Both materials and regimes of wear showed similar clinical performance. Long-term SiH contact lens wear is shown to be a successful option for patients. Copyright © 2006 American Academy of Optometry.
Abstract:
A time-dependent electromagnetic pulse generated by a current running laterally to the direction of pulse propagation is considered in the paraxial approximation. It is shown that the pulse envelope moves in space-time coordinates on the surface of a parabolic cylinder for the Airy pulse and of a hyperbolic cylinder for the Gaussian. These pulses propagate in time with deceleration along the dominant propagation direction and drift uniformly in the lateral direction. The Airy pulse stops at infinity, while the asymptotic velocity of the Gaussian is nonzero. © 2013 Optical Society of America.
Abstract:
In this paper we evaluate and compare two representative and popular distributed processing engines for large-scale big data analytics: Spark and the graph-based engine GraphLab. We design a benchmark suite including representative algorithms and datasets to compare the performance of the computing engines in terms of running time, memory and CPU usage, and network and I/O overhead. The benchmark suite is tested on both a local computer cluster and virtual machines on the cloud. By varying the number of computers and the memory, we examine the scalability of the computing engines with increasing computing resources (such as CPU and memory). We also run cross-evaluation of generic and graph-based analytic algorithms over graph-processing and generic platforms to identify the potential performance degradation if only one processing engine is available. It is observed that both computing engines show good scalability with increasing computing resources. While GraphLab largely outperforms Spark for graph algorithms, it has running-time performance close to that of Spark for non-graph algorithms. Additionally, the running time of Spark for graph algorithms over cloud virtual machines is observed to increase by almost 100% compared to local computer clusters.