950 results for Refined nonlinear non-conforming triangular plate element


Relevance:

30.00%

Publisher:

Abstract:

The effects of channel inequality on nonlinear signal switching in a nonlinear optical fiber loop mirror (NOLM) were investigated. It was found that channel-to-channel amplitude differences in optical time division multiplexing (OTDM) have a strong impact on the switching behavior of individual channels in a 2R regenerator. The optical pulses in different channels experience either suppression of amplitude noise or an increase in noise, depending on the inter-channel amplitude difference. It was concluded that appropriate control of channel uniformity in OTDM transmitters is required to support stable long-haul transmission in 2R regenerated systems.
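As a rough illustration of the switching mechanism discussed above, the sketch below evaluates the standard NOLM power transfer function, T(P) = 1 − 2ρ(1−ρ)[1 + cos((1−2ρ)γLP)], for a few channel peak powers. The coupling ratio, nonlinear coefficient, loop length and power values are illustrative assumptions, not parameters from the study.

```python
import numpy as np

# Illustrative NOLM parameters (assumed values, not from the study)
rho = 0.45        # power coupling ratio of the loop coupler
gamma = 2.0       # nonlinear coefficient, 1/(W km)
L = 5.0           # loop length, km

def nolm_transmission(P_in):
    """Standard NOLM power transfer: T = 1 - 2*rho*(1-rho)*(1 + cos(dphi)),
    where dphi = (1 - 2*rho)*gamma*L*P_in is the differential nonlinear phase."""
    dphi = (1.0 - 2.0 * rho) * gamma * L * P_in
    return 1.0 - 2.0 * rho * (1.0 - rho) * (1.0 + np.cos(dphi))

# Channels with slightly different peak powers: the local slope of T(P)*P
# determines whether amplitude noise is suppressed or amplified.
for P in (0.9, 1.0, 1.1):   # W, illustrative channel peak powers
    print(f"P_in = {P:.2f} W -> P_out = {P * nolm_transmission(P):.3f} W")
```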

Relevance:

30.00%

Publisher:

Abstract:

Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means of analysing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional, linear elastic, finite element fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack-tip mixed-mode problems involving partial crack closure. The crack tip core element was refined and special local crack tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. In addition, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations stored in segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems involving fine element detail to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.
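The abstract gives no implementation detail; as a minimal, purely illustrative sketch of the kind of automatic mesh generation it mentions, the following snippet builds a structured triangular mesh over a rectangular plate from a handful of input parameters. All names and dimensions here are assumptions, not the thesis software.

```python
import numpy as np

def structured_triangle_mesh(width, height, nx, ny):
    """Generate nodes and triangles for a rectangular domain.

    Returns
    -------
    nodes : (N, 2) array of node coordinates
    tris  : (M, 3) array of node indices, two triangles per grid cell
    """
    xs = np.linspace(0.0, width, nx + 1)
    ys = np.linspace(0.0, height, ny + 1)
    nodes = np.array([(x, y) for y in ys for x in xs])

    def idx(i, j):                      # node index of grid point (i, j)
        return j * (nx + 1) + i

    tris = []
    for j in range(ny):
        for i in range(nx):
            # split each grid cell into two triangles
            tris.append((idx(i, j), idx(i + 1, j), idx(i + 1, j + 1)))
            tris.append((idx(i, j), idx(i + 1, j + 1), idx(i, j + 1)))
    return nodes, np.array(tris)

# Example: a 2 x 1 plate meshed with a 20 x 10 grid (400 triangles)
nodes, tris = structured_triangle_mesh(2.0, 1.0, 20, 10)
print(nodes.shape, tris.shape)   # (231, 2) (400, 3)
```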

Relevance:

30.00%

Publisher:

Abstract:

The aim of this research was to investigate the integration of computer-aided drafting and finite-element analysis in a linked computer-aided design procedure and to develop the necessary software. The Bézier surface patch was used for surface representation to bridge the gap between the rather separate fields of drafting and finite-element analysis, because the surfaces are defined by analytical functions which allow systematic and controlled variation of the shape and provide continuous derivatives up to any required degree. The objectives of this research were achieved by establishing: (i) a package which interprets the engineering drawings of plate and shell structures and prepares the Bézier net necessary for surface representation; (ii) a general-purpose stand-alone meshed-surface modelling package for surface representation of plates and shells using the Bézier surface patch technique; (iii) a translator which adapts the geometric description of plate and shell structures as given by the meshed-surface modeller to the form needed by the finite-element analysis package. The translator was extended to suit fan impellers by taking advantage of their sectorial symmetry. The linking processes were carried out for simple test structures and for simplified and actual fan impellers to verify the flexibility and usefulness of the linking technique adopted. Finite-element results for thin plate and shell structures showed excellent agreement with those obtained by other investigators, while results for the simplified and actual fan impellers also showed good agreement with those obtained in an earlier investigation where finite-element analysis input data were manually prepared. Some extensions of this work have also been discussed.
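As a brief illustration of the Bézier surface patch representation underpinning this linkage (a generic sketch, not the thesis software), the snippet below evaluates a bicubic Bézier patch from a 4 × 4 net of control points using the Bernstein basis; the example control net is an arbitrary assumption.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1.0 - t)**(n - i)

def bezier_patch_point(control_net, u, v):
    """Evaluate a Bézier surface patch S(u, v) = sum_i sum_j B_i(u) B_j(v) P_ij.

    control_net : (n+1, m+1, 3) array of control points (the Bézier net)
    u, v        : surface parameters in [0, 1]
    """
    n = control_net.shape[0] - 1
    m = control_net.shape[1] - 1
    point = np.zeros(3)
    for i in range(n + 1):
        for j in range(m + 1):
            point += bernstein(n, i, u) * bernstein(m, j, v) * control_net[i, j]
    return point

# Example: a bicubic (4 x 4) net describing a gently curved plate-like surface
net = np.array([[[i, j, 0.1 * i * j] for j in range(4)] for i in range(4)], dtype=float)
print(bezier_patch_point(net, 0.5, 0.5))
```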

Relevance:

30.00%

Publisher:

Abstract:

This study expands the current knowledge base on the nature, causes and fate of unused medicines in primary care. Three methodologies were used, and participants for each element were sampled from the population of Eastern Birmingham PCT. A detailed assessment was made of medicines returned to pharmacies and GP surgeries for destruction, and a postal questionnaire covering medicines use and disposal was sent to patients randomly selected from the electoral roll. The content of this questionnaire was informed by qualitative data from a group interview on the subject. By using these three methods it was possible to triangulate the data, providing a comprehensive assessment of unused medicines. Unused medicines were found to be ubiquitous in primary care, and cardiovascular, diabetic and respiratory medicines were unused in substantial quantities, accounting for a considerable proportion of the total financial value of all unused medicines. Additionally, analgesic and psychoactive medicines were highlighted as being unused in sufficient quantities for concern. Anti-infective medicines also appear to be present and unused in a substantial proportion of patients’ homes. Changes to prescribed therapy and non-compliance were identified as important factors leading to the generation of unused medicines. However, a wide array of other elements influence the quantities and types of medicines that are unused, including the concordance of GP consultations and medication reviews, and patient factors such as age, sex or ethnicity. Medicines were appropriately discarded by 1 in 3 patients through return to a medical or pharmaceutical establishment. Inappropriate disposal occurred via household refuse or through grey and black water, with the possibility of hoarding or diversion also being identified. No correlations were found between the weight of unused medicines and any clinical or financial factor. The study has highlighted unused medicines to be an issue of some concern and one that requires further study.

Relevance:

30.00%

Publisher:

Abstract:

For analysing financial time series, two main opposing viewpoints exist: either capital markets are completely stochastic and therefore prices follow a random walk, or they are deterministic and consequently predictable. For each of these views a great variety of tools exists with which one can attempt to confirm the corresponding hypothesis. Unfortunately, these methods are not well suited to data characterised in part by both paradigms. This thesis investigates these two approaches in order to model the behaviour of financial time series. In the deterministic framework, methods are used to characterise the dimensionality of embedded financial data. The stochastic approach includes an estimation of the unconditional and conditional return distributions using parametric, non-parametric and semi-parametric density estimation techniques. Finally, it is shown how elements from these two approaches could be combined to achieve a more realistic model for financial time series.
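As a small illustration of the stochastic side of this programme, the sketch below estimates an unconditional return density non-parametrically with a Gaussian kernel and compares it with a parametric Gaussian fit. The data are synthetic heavy-tailed returns standing in for real market data; SciPy's `gaussian_kde` is one common choice, not necessarily the estimator used in the thesis.

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

# Synthetic daily log-returns (illustrative stand-in for real market data)
rng = np.random.default_rng(0)
returns = rng.standard_t(df=4, size=2000) * 0.01   # heavy-tailed, roughly market-like

# Non-parametric (kernel) estimate of the unconditional return density
kde = gaussian_kde(returns)

# Parametric (Gaussian) fit for comparison at a few evaluation points
mu, sigma = returns.mean(), returns.std(ddof=1)
for x in (-0.03, 0.0, 0.03):
    print(f"x={x:+.2f}  KDE: {kde(x)[0]:8.3f}   Gaussian fit: {norm.pdf(x, mu, sigma):8.3f}")
```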

Relevance:

30.00%

Publisher:

Abstract:

This thesis applies a hierarchical latent trait model system to a large quantity of data. The motivation for it was the lack of viable approaches to analysing High Throughput Screening datasets, which may include thousands of data points with high dimensions. High Throughput Screening (HTS) is an important tool in the pharmaceutical industry for discovering leads which can be optimised and further developed into candidate drugs. Since the development of new robotic technologies, the ability to test the activities of compounds has increased considerably in recent years. Traditional methods, looking at tables and graphical plots to analyse relationships between measured activities and the structure of compounds, have not been feasible when facing a large HTS dataset. Instead, data visualisation provides a method for analysing such large datasets, especially those with high dimensions. So far, a few visualisation techniques for drug design have been developed, but most of them cope with only a few properties of compounds at a time. We believe that a latent trait model (LTM), a latent variable model with a non-linear mapping from the latent space to the data space, is a preferred choice for visualising a complex high-dimensional data set. As a type of latent variable model, the latent trait model can deal with either continuous or discrete data, which makes it particularly useful in this domain. In addition, with the aid of differential geometry, we can visualise the distribution of the data through magnification factor and curvature plots. Rather than obtaining the useful information from just a single plot, a hierarchical LTM arranges a set of LTMs and their corresponding plots in a tree structure. We model the whole data set with an LTM at the top level, which is broken down into clusters at deeper levels of the hierarchy. In this manner, refined visualisation plots can be displayed at deeper levels and sub-clusters may be found. The hierarchy of LTMs is trained using the expectation-maximisation (EM) algorithm to maximise its likelihood with respect to the data sample. Training proceeds interactively in a recursive fashion (top-down): the user subjectively identifies interesting regions on the visualisation plot that they would like to model in greater detail. At each stage of hierarchical LTM construction, the EM algorithm alternates between the E- and M-steps. Another problem that can occur when visualising a large data set is that there may be significant overlaps of data clusters, making it very difficult for the user to judge where the centres of regions of interest should be placed. We address this problem by employing the minimum message length technique, which can help the user to decide the optimal structure of the model. In this thesis we also demonstrate the applicability of the hierarchy of latent trait models in the field of document data mining.
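The recursive, top-down construction described above can be illustrated with a toy sketch. Here a Gaussian mixture (fitted by EM via scikit-learn) is used purely as a stand-in for the latent trait model at each node, and an automatic split replaces the interactive user selection of regions; the data are random stand-ins for HTS compound descriptors.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def build_hierarchy(data, depth, max_depth=2, min_points=50):
    """Recursively model `data` and split it into sub-clusters (top-down).

    A GaussianMixture is used here only as a stand-in for the latent trait
    model that would be fitted at each node of the hierarchy.
    """
    node = {"size": len(data), "depth": depth, "children": []}
    if depth >= max_depth or len(data) < min_points:
        return node
    gmm = GaussianMixture(n_components=2, random_state=0).fit(data)  # EM inside
    labels = gmm.predict(data)
    for k in range(2):
        subset = data[labels == k]
        if len(subset) > 0:
            node["children"].append(build_hierarchy(subset, depth + 1, max_depth, min_points))
    return node

# Toy high-dimensional data set standing in for HTS compound descriptors
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(300, 10)) for c in (-2.0, 0.0, 2.0)])
tree = build_hierarchy(X, depth=0)
print(tree["size"], [child["size"] for child in tree["children"]])
```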

Relevance:

30.00%

Publisher:

Abstract:

The present thesis tested the hypothesis of Stanovich, Siegel, & Gottardo (1997) that surface dyslexia is the result of a milder phonological deficit than that seen in phonological dyslexia coupled with reduced reading experience. We found that a group of adults with surface dyslexia showed a phonological deficit that was commensurate with that shown by a group of adults with phonological dyslexia (matched for chronological age and verbal and non-verbal IQ) and normal reading experience. We also showed that surface dyslexia cannot be accounted for by a semantic impairment or a deficit in the verbal learning and recall of lexical-semantic information (such as meaningful words), as both dyslexic subgroups performed the same. This study has replicated the results of our published study that surface dyslexia is not the consequence of a mild retardation or reduced learning opportunities but a separate impairment linked to a deficit in written lexical learning, an ability needed to create novel lexical representations from a series of unrelated visual units, which is independent from the phonological deficit (Romani, Di Betta, Tsouknida & Olson, 2008). This thesis also provided evidence that a selective nonword reading deficit in developmental dyslexia persists beyond poor phonology. This was shown by finding a nonword reading deficit even in the presence of normal regularity effects in the dyslexics (when compared to both reading and spelling-age matched controls). A nonword reading deficit was also found in the surface dyslexics. Crucially, this deficit was as strong as in the phonological dyslexics despite better functioning of the sublexical route for the former. These results suggest that a nonword reading deficit cannot be solely explained by a phonological impairment. We, thus, suggested that nonword reading should also involve another ability relating to the processing of novel visual orthographic strings, which we called 'orthographic coding'. We then investigated the ability to process series of independent units within multi-element visual arrays and its relationship with reading and spelling problems. We identified a deficit in encoding the order of visual sequences (involving both linguistic and nonlinguistic information) which was significantly associated with word and nonword processing. More importantly, we revealed significant contributions to orthographic skills in both dyslexic and control individuals, even after age, performance IQ and phonological skills were controlled. These results suggest that spelling and reading do not only tap phonological skills but also order encoding skills.

Relevance:

30.00%

Publisher:

Abstract:

We determine through numerical modelling the conditions for the generation of triangular-shaped optical pulses in a nonlinear, normally dispersive (ND) fibre and experimentally demonstrate triangular pulse formation in conventional ND fibre.

Relevance:

30.00%

Publisher:

Abstract:

The development of new all-optical technologies for data processing and signal manipulation is a field of growing importance with a strong potential for numerous applications in diverse areas of modern science. Nonlinear phenomena occurring in optical fibres have many attractive features and great, but not yet fully explored, potential in signal processing. Here, we review recent progress on the use of fibre nonlinearities for the generation and shaping of optical pulses and on the applications of advanced pulse shapes in all-optical signal processing. Amongst other topics, we will discuss ultrahigh repetition rate pulse sources, the generation of parabolic shaped pulses in active and passive fibres, the generation of pulses with triangular temporal profiles, and coherent supercontinuum sources. The signal processing applications will span optical regeneration, linear distortion compensation, optical decision at the receiver in optical communication systems, spectral and temporal signal doubling, and frequency conversion. © Copyright 2012 Sonia Boscolo and Christophe Finot.

Relevance:

30.00%

Publisher:

Abstract:

Using a fiber laser system as a specific illustrative example, we introduce the concept of intermediate asymptotic states in finite nonlinear optical systems. We show that intermediate asymptotics of nonlinear equations (e.g., coherent structures with a finite lifetime or distance) can be used in applications similar to those of truly stable asymptotic solutions, such as, e.g., solitons and dissipative nonlinear waves. Applying this general idea to a particular, albeit practically important, physical system, we demonstrate a novel type of nonlinear pulse-shaping regime in a mode-locked fiber laser leading to the generation of linearly chirped pulses with a triangular distribution of the intensity.

Relevance:

30.00%

Publisher:

Abstract:

In this scheme, nonlinearity and dispersion in a normally dispersive fibre (NDF) lead to various reshaping processes of an initial, conventional pulse, according to the chirp value and power level at the input of the fibre. In particular, we have observed that triangular-shaped pulses can be generated for sufficiently high energies and a positive initial chirp parameter. In our experiments, 2.8-ps FWHM, transform-limited pulses generated from a mode-locked fibre laser source at a repetition rate of 1.25 GHz were pre-chirped by propagating the pulses through different lengths of standard mono-mode fibre. The chirped pulses were then amplified to different power levels before being launched into a 2.3 km section of True Wave fibre (TWF). The corresponding numerically calculated temporal intensity profiles, together with numerical and experimental second-harmonic generation frequency-resolved optical gating (SHG FROG) spectrograms, were also obtained. In conclusion, we have presented numerical modelling results which show the system design parameters required for the generation of triangular-shaped pulses in a nonlinear NDF, and experimentally demonstrated triangular pulse shaping in conventional NDF.
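For readers who want to reproduce the qualitative behaviour, a minimal split-step Fourier simulation of a pre-chirped Gaussian pulse propagating in a normally dispersive fibre is sketched below. All parameter values are illustrative assumptions, not the experimental values quoted above.

```python
import numpy as np

# --- Illustrative parameters only (not the values used in the experiments) ---
T0    = 1.7e-12      # Gaussian width (s), roughly 2.8 ps FWHM / 1.665
C     = 2.0          # positive initial chirp parameter
P0    = 1.0          # peak power (W)
beta2 = 5.0e-27      # normal GVD (s^2/m), i.e. beta2 > 0
gamma = 2.0e-3       # nonlinearity, 1/(W m)
L     = 2.3e3        # fibre length (m)
nz    = 2000         # number of split steps
N     = 2**12        # number of time samples

# Time and angular-frequency grids
Twin = 100e-12
t  = np.linspace(-Twin / 2, Twin / 2, N, endpoint=False)
dt = t[1] - t[0]
w  = 2 * np.pi * np.fft.fftfreq(N, d=dt)

# Initial chirped Gaussian pulse
A = np.sqrt(P0) * np.exp(-(1 + 1j * C) * t**2 / (2 * T0**2))

# Symmetrised split-step Fourier propagation of the NLSE
dz = L / nz
half_disp = np.exp(1j * (beta2 / 2) * w**2 * dz / 2)
for _ in range(nz):
    A = np.fft.ifft(half_disp * np.fft.fft(A))      # half dispersion step
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)  # full nonlinear step
    A = np.fft.ifft(half_disp * np.fft.fft(A))      # half dispersion step

# For suitable C and P0 the output intensity |A|^2 flattens towards a triangular shape
width = dt * np.sum(np.abs(A)**2 > 0.5 * np.max(np.abs(A)**2))
print(f"output width at half maximum: {width * 1e12:.1f} ps")
```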

Relevance:

30.00%

Publisher:

Abstract:

All-optical technologies for data processing and signal manipulation are expected to play a major role in future optical communications. Nonlinear phenomena occurring in optical fibre have many attractive features and great, but not yet fully exploited, potential in optical signal processing. Here, we overview our recent results and advances in developing novel photonic techniques and approaches to all-optical processing based on fibre nonlinearities. Amongst other topics, we will discuss phase-preserving optical 2R regeneration, the possibility of using parabolic/flat-top pulses for optical signal processing and regeneration, and nonlinear optical pulse shaping. A method for passive nonlinear pulse shaping based on pulse pre-chirping and propagation in a normally dispersive fibre will be presented. The approach provides a simple way of generating various temporal waveforms of fundamental and practical interest. Particular emphasis will be given to the formation and characterization of pulses with a triangular intensity profile. A new technique for doubling/copying optical pulses in both the frequency and time domains using triangular-shaped pulses will also be introduced.

Relevance:

30.00%

Publisher:

Abstract:

The slope of the two-interval, forced-choice psychometric function (e.g. the Weibull parameter, β) provides valuable information about the relationship between contrast sensitivity and signal strength. However, little is known about how or whether β varies with stimulus parameters such as spatiotemporal frequency and stimulus size and shape. A second unresolved issue concerns the best way to estimate the slope of the psychometric function. For example, if an observer is non-stationary (e.g. their threshold drifts between experimental sessions), β will be underestimated if curve fitting is performed after collapsing the data across experimental sessions. We measured psychometric functions for 2 experienced observers for 14 different spatiotemporal configurations of pulsed or flickering grating patches and bars on each of 8 days. We found β ≈ 3 to be fairly constant across almost all conditions, consistent with a fixed nonlinear contrast transducer and/or a constant level of intrinsic stimulus uncertainty (e.g. a square law transducer and a low level of intrinsic uncertainty). Our analysis showed that estimating a single β from results averaged over several experimental sessions was slightly more accurate than averaging multiple estimates from several experimental sessions. However, the small levels of non-stationarity (SD ≈ 0.8 dB) meant that the difference between the estimates was, in practice, negligible.
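As a toy illustration of how a slope parameter β of the kind discussed above can be estimated, the sketch below fits a two-interval forced-choice Weibull function to synthetic trial counts by maximum likelihood. The data, starting values and the neglected lapse rate are all assumptions made for illustration, not the study's fitting procedure.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_2afc(contrast, alpha, beta):
    """Two-interval forced-choice Weibull psychometric function
    (guess rate 0.5, lapse rate neglected)."""
    return 1.0 - 0.5 * np.exp(-(contrast / alpha) ** beta)

def neg_log_likelihood(params, contrast, n_correct, n_trials):
    alpha, beta = params
    p = np.clip(weibull_2afc(contrast, alpha, beta), 1e-6, 1 - 1e-6)
    return -np.sum(n_correct * np.log(p) + (n_trials - n_correct) * np.log(1 - p))

# Illustrative (synthetic) data: contrast levels, trial counts and correct responses
contrast  = np.array([0.005, 0.01, 0.02, 0.04, 0.08])
n_trials  = np.full(5, 50)
rng       = np.random.default_rng(0)
n_correct = rng.binomial(n_trials, weibull_2afc(contrast, alpha=0.02, beta=3.0))

fit = minimize(neg_log_likelihood, x0=[0.03, 2.0],
               args=(contrast, n_correct, n_trials),
               bounds=[(1e-4, 1.0), (0.5, 10.0)])
alpha_hat, beta_hat = fit.x
print(f"threshold alpha = {alpha_hat:.4f}, slope beta = {beta_hat:.2f}")
```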

Relevance:

30.00%

Publisher:

Abstract:

We investigate electronic mitigation of linear and non-linear fibre impairments and compare various digital signal processing techniques, including electronic dispersion compensation (EDC), single-channel back-propagation (SC-BP) and back-propagation with multiple channel processing (MC-BP) in a nine-channel 112 Gb/s PM-mQAM (m=4,16) WDM system, for reaches up to 6,320 km. We show that, for a sufficiently high local dispersion, SC-BP is sufficient to provide a significant performance enhancement when compared to EDC, and is adequate to achieve BER below FEC threshold. For these conditions we report that a sampling rate of two samples per symbol is sufficient for practical SC-BP, without significant penalties.
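The core idea behind single-channel back-propagation is to pass the received field through a virtual fibre with the signs of dispersion and nonlinearity inverted. The minimal single-polarisation sketch below illustrates this; the fibre parameters and the dummy waveform are assumptions, and a real PM-QAM receiver would back-propagate both polarisations via the Manakov equation.

```python
import numpy as np

def back_propagate(A_rx, dt, length, beta2, gamma, n_steps=200):
    """Single-channel digital back-propagation (illustrative sketch).

    The received complex field A_rx is propagated through a 'virtual' fibre
    with inverted dispersion and nonlinearity, ideally undoing the
    deterministic distortion accumulated over `length` metres.
    """
    N = len(A_rx)
    w = 2 * np.pi * np.fft.fftfreq(N, d=dt)
    dz = length / n_steps
    # inverted-sign linear operator (minus signs relative to forward propagation)
    half_disp = np.exp(-1j * (beta2 / 2) * w**2 * dz / 2)
    A = A_rx.copy()
    for _ in range(n_steps):
        A = np.fft.ifft(half_disp * np.fft.fft(A))
        A = A * np.exp(-1j * gamma * np.abs(A)**2 * dz)
        A = np.fft.ifft(half_disp * np.fft.fft(A))
    return A

# Illustrative use on a dummy received field (assumed fibre parameters)
dt = 1.0 / (2 * 28e9)                        # two samples per symbol at 28 GBd
A_rx = np.exp(1j * 0.1 * np.arange(4096))    # placeholder waveform
A_eq = back_propagate(A_rx, dt, length=80e3, beta2=-21e-27, gamma=1.3e-3)
print(A_eq.shape)
```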

Relevance:

30.00%

Publisher:

Abstract:

The standard reference clinical score quantifying average Parkinson's disease (PD) symptom severity is the Unified Parkinson's Disease Rating Scale (UPDRS). At present, UPDRS is determined by the subjective clinical evaluation of the patient's ability to adequately cope with a range of tasks. In this study, we extend recent findings that UPDRS can be objectively assessed to clinically useful accuracy using simple, self-administered speech tests, without requiring the patient's physical presence in the clinic. We apply a wide range of known speech signal processing algorithms to a large database (approx. 6000 recordings from 42 PD patients, recruited to a six-month, multi-centre trial) and propose a number of novel, nonlinear signal processing algorithms which reveal pathological characteristics in PD more accurately than existing approaches. Robust feature selection algorithms select the optimal subset of these algorithms, which is fed into non-parametric regression and classification algorithms, mapping the signal processing algorithm outputs to UPDRS. We demonstrate rapid, accurate replication of the UPDRS assessment with clinically useful accuracy (about 2 UPDRS points difference from the clinicians' estimates, p < 0.001). This study supports the viability of frequent, remote, cost-effective, objective, accurate UPDRS telemonitoring based on self-administered speech tests. This technology could facilitate large-scale clinical trials into novel PD treatments.
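The pipeline described above, speech features passed through feature selection and into non-parametric regression against UPDRS, can be mimicked with a small, entirely synthetic sketch. The selector and regressor below are illustrative choices rather than the study's specific algorithms, and the data are random stand-ins for the real recordings.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for speech features (e.g. jitter/shimmer-like measures)
# and UPDRS scores; the real study uses ~6000 recordings from 42 patients.
rng = np.random.default_rng(0)
n_recordings, n_features = 600, 30
X = rng.normal(size=(n_recordings, n_features))
updrs = 30 + 5 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=2.0, size=n_recordings)

# Feature selection followed by non-parametric regression mapping features to UPDRS
model = make_pipeline(
    SelectKBest(f_regression, k=10),
    RandomForestRegressor(n_estimators=200, random_state=0),
)
pred = cross_val_predict(model, X, updrs, cv=5)
print("mean absolute error (UPDRS points):", np.mean(np.abs(pred - updrs)).round(2))
```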