940 results for local-to-zero analysis
Abstract:
We consider return-to-zero (RZ) pulses with random phase modulation propagating in a nonlinear channel (modelled by the integrable nonlinear Schrödinger equation, NLSE). We suggest two different models for the phase fluctuations of the optical field: (i) Gaussian short-correlated fluctuations and (ii) a generalized telegraph process. Using a rectangular pulse shape, we demonstrate that phase fluctuations of both types strongly influence the number of solitons generated in the channel. It is also shown that increasing the correlation time of the random phase fluctuations affects the coherent content of a pulse in a non-trivial way. The results obtained have potential consequences for all-optical processing and the design of optical decision elements.
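For reference, the integrable NLSE referred to above, written here in standard dimensionless soliton units (a common normalization; the paper's exact coefficients and scaling are an assumption on our part):

```latex
i\,\frac{\partial q}{\partial z} + \frac{1}{2}\,\frac{\partial^2 q}{\partial t^2} + |q|^2\,q = 0
```

For this equation, the number of solitons emerging from an input pulse is determined by the discrete eigenvalues of the associated Zakharov-Shabat scattering problem, which is why random phase fluctuations on the input pulse can change its soliton content.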
Abstract:
The thesis investigates the value of quantitative analyses for historical studies of science through an examination of research trends in insect pest control, or economic entomology. Reviews are made of quantitative studies of science and of historical studies of pest control. The methodological strengths and weaknesses of bibliometric techniques are examined in a dedicated chapter; the techniques examined include productivity studies, such as paper counts, and relational techniques, such as co-citation and co-word analysis. Insect pest control is then described. This includes a discussion of the socio-economic basis of the concept of 'pest'; a series of classifications of pest control techniques is provided and analysed with respect to their utility for scientometric studies. The chemical and biological approaches to control are discussed as scientific and technological paradigms. Three case studies of research trends in economic entomology are provided. First, a scientometric analysis of samples of chemical control and biological control papers, providing quantitative data on the institutional, financial, national, and journal structures associated with pest control research fields. Second, a content analysis of a core journal, the Journal of Economic Entomology, over the period 1910-1985; this identifies the main research innovations and trends, in particular the changing balance between chemical and biological control. Third, an analysis of historical research trends in insecticide research; this shows the rise, maturity and decline of research on many groups of compounds. These case studies are supplemented by a collection of seven papers on scientometric studies of pest control and quantitative techniques for analysing science.
Abstract:
The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computing power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency-domain 'curve-fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.
Abstract:
A method of all-optical passive quasi-regeneration in transoceanic 40 Gbit/s return-to-zero transmission systems with strong dispersion management was described. The use of in-line nonlinear optical loop mirrors (NOLMs) by the method was demonstrated. The quasi-regeneration of signals performed by the NOLMs was found to improve the system's performance.
Abstract:
In the present state of the art of authorship attribution there seems to be an opposition between two approaches: cognitive and stylistic methodologies. It is proposed in this article that these two approaches are complementary and that the apparent gap between them can be bridged using Systemic Functional Linguistics (SFL) and, in particular, some of its theoretical constructs, such as codal variation. The article gives a theoretical explanation of why such a theory would resolve the debate between the two approaches and shows how these two views of authorship attribution are indeed complementary. Although the article is fundamentally theoretical, two example experimental trials are reported to show how this theory can be developed into a workable methodology for authorship attribution. In Trial 1, an SFL analysis was carried out on a small dataset consisting of three 300-word texts collected from three different authors whose socio-demographic backgrounds matched across a number of parameters. This trial led to some conclusions about developing a methodology based on SFL and suggested the development of another trial, which might point towards a more accurate and useful methodology. In Trial 2, Biber's (1988) multidimensional framework is employed, and a final methodology of authorship analysis based on this kind of analysis is proposed for future research. © 2013, EQUINOX PUBLISHING.
Abstract:
The paper presents an approach to the extraction of facts from the texts of documents. The approach is based on using knowledge about the subject domain, a specialized dictionary, and schemes of facts that describe fact structures, taking into consideration both the semantic and syntactic compatibility of the elements of facts. The extracted facts combine into one structure the dictionary lexical objects found in the text and match them against concepts of the subject domain ontology.
Abstract:
The authors present the impact of asymmetric filtering with strong (e.g. 35 GHz) optical filters on the performance of 42.7 Gb/s 67% carrier-suppressed return-to-zero differential phase-shift keying systems. The performance is examined, in an amplified spontaneous emission (ASE) noise-limited regime and in the presence of chromatic dispersion, when offsetting the filter at the receiver by substantial amounts, using balanced, constructive and destructive single-ended detection. It is found that with a slight offset (vestigial sideband) or an offset of almost half the modulation frequency (single sideband), there is a significant improvement in the calculated Q-factor. © The Institution of Engineering and Technology 2013.
Abstract:
The aim of this study is to evaluate the application of ensemble averaging to the analysis of electromyography recordings made under whole-body vibratory stimulation. Recordings from the Rectus Femoris, collected during vibratory stimulation at different frequencies, are used. Each signal is subdivided into intervals whose duration is related to the vibration frequency, and the segmented intervals are then averaged. With this method, periodic components emerge in the majority of the recordings. The autocorrelation of a few seconds of signal confirms the presence of a pseudo-sinusoidal component strictly related to the soft-tissue oscillations caused by the mechanical waves. © 2014 IEEE.
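The segmentation-and-averaging step described above can be illustrated in a few lines. This is a minimal sketch with hypothetical names and synthetic data, not the study's actual pipeline: segments one vibration period long are cut from the signal and averaged, so the vibration-locked component emerges while uncorrelated activity cancels.

```python
import math
import random

def ensemble_average(signal, fs, vib_freq):
    """Average fixed-length segments whose duration equals one vibration
    period; the period-locked component survives while uncorrelated
    activity is attenuated by roughly sqrt(number of segments)."""
    period = int(round(fs / vib_freq))       # samples per vibration cycle
    n_cycles = len(signal) // period
    if n_cycles == 0:
        raise ValueError("signal shorter than one vibration period")
    avg = [0.0] * period
    for k in range(n_cycles):
        seg = signal[k * period:(k + 1) * period]
        for i, x in enumerate(seg):
            avg[i] += x / n_cycles
    return avg

# Synthetic check: a 25 Hz sinusoid buried in Gaussian noise of equal power
random.seed(0)
fs, f = 1000, 25.0
sig = [math.sin(2 * math.pi * f * i / fs) + random.gauss(0.0, 1.0)
       for i in range(fs * 10)]             # 10 s of data, 250 cycles
avg = ensemble_average(sig, fs, f)          # one clean-looking period
```

With 250 averaged cycles the noise standard deviation drops by a factor of about 16, which is why the pseudo-sinusoidal component becomes visible in the averaged waveform.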
Abstract:
A novel multichannel carrier-suppressed return-to-zero (CSRZ) to non-return-to-zero (NRZ) format conversion scheme based on a single custom-designed fiber Bragg grating (FBG) with comb spectra is proposed. The spectral response of each channel is designed according to the algebraic difference between the CSRZ and NRZ spectral outlines. Tailored group delays are introduced to minimize the maximum refractive index modulation. Numerical results show that four-channel 200-GHz-spaced CSRZ signals at 40 Gbit/s can be converted into NRZ signals with a high Q-factor and wide-range robustness. It is shown that the proposed FBG is robust to deviations of bandwidth and central wavelength detuning. Another important merit of this scheme is that pattern effects are efficiently reduced owing to the well-designed spectral response.
Abstract:
With the latest developments in computer science, multivariate data analysis methods have become increasingly popular among economists. Pattern recognition in complex economic data and empirical model construction can be more straightforward with the proper application of modern software. However, despite the appealing simplicity of some popular software packages, the interpretation of data analysis results requires strong theoretical knowledge. This book aims at combining the development of both theoretical and application-related data analysis knowledge. The text is designed for advanced-level studies and assumes acquaintance with elementary statistical terms. After a brief introduction to selected mathematical concepts, the highlighting of selected model features is followed by a practice-oriented introduction to the interpretation of SPSS outputs for the described data analysis methods. Learning data analysis is usually time-consuming and requires effort, but with tenacity the learning process can bring about a significant improvement in individual data analysis skills.
Abstract:
The objectives of this research are to analyze and develop a modified Principal Component Analysis (PCA) and to develop a two-dimensional PCA with applications in image processing. PCA is a classical multivariate technique whose mathematical treatment is based on the eigensystem of positive-definite symmetric matrices. Its main function is to statistically transform a set of correlated variables into a new set of uncorrelated variables over $\mathbb{R}^n$ while retaining most of the variation present in the original variables. The variances of the Principal Components (PCs) obtained from the modified PCA form a correlation matrix of the original variables. The decomposition of this correlation matrix into a diagonal matrix produces an orthonormal basis that can be used to linearly transform the given PCs; it is this linear transformation that reproduces the original variables. The two-dimensional PCA can be devised as two successive applications of one-dimensional PCA. It can be shown that, for an $m\times n$ matrix, the PCs obtained from the two-dimensional PCA are the singular values of that matrix. In this research, several applications of PCA to image analysis are developed, i.e., edge detection, feature extraction, and multi-resolution PCA decomposition and reconstruction.
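To make the eigensystem view concrete, here is a minimal sketch of one-dimensional PCA for two variables, using the closed-form eigendecomposition of a 2x2 covariance matrix (illustrative only; the thesis's modified and two-dimensional variants are not reproduced here, and all names are hypothetical):

```python
import math

def pca_2d(xs, ys):
    """Return the eigenvalues (variances of the two PCs) and the first
    principal axis of the 2x2 sample covariance matrix of (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Closed-form eigenvalues of the symmetric matrix [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(tr * tr / 4 - det)
    l1, l2 = tr / 2 + disc, tr / 2 - disc
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)   # angle of the first PC
    return (l1, l2), (math.cos(theta), math.sin(theta))

# Strongly correlated data: the first PC carries almost all the variance
xs = [float(i) for i in range(10)]
ys = [2.0 * x + 0.1 * ((i % 3) - 1) for i, x in enumerate(xs)]
(l1, l2), axis = pca_2d(xs, ys)   # axis points roughly along (1, 2)
```

The same decorrelation carries over to the $m\times n$ image case, where the two-dimensional PCA amounts to applying this procedure along rows and then columns.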
Abstract:
Objectives: The primary aim of this study was to investigate partially dentate elders' willingness-to-pay (WTP) for two different tooth replacement strategies: Removable Partial Dentures (RPDs) and functionally orientated treatment according to the principles of the Shortened Dental Arch (SDA). The secondary aim was to measure the same patient groups' WTP for dental implant treatment. Methods: 55 patients who had completed a previous RCT comparing the two tooth replacement strategies (RPDs, n=27; SDA, n=28) were recruited (Trial Registration no. ISRCTN26302774). Patients were asked to indicate their WTP for treatment to replace missing teeth in a number of hypothetical scenarios, using the payment card method of contingent valuation coupled to different costs. Data were collected on patients' social class, income levels and other social circumstances. A Mann-Whitney U test was used to compare differences in WTP between the two treatment groups. To investigate predictive factors for WTP, multiple linear regression analyses were conducted. Results: The median age of the patient sample was 72.0 years (IQR: 71-75 years). Patients who had been provided with RPDs indicated a WTP for this treatment strategy (€550; IQR: 500-650) significantly higher than that of patients who had received SDA treatment (€500; IQR: 450-550) (p=0.003). However, patients provided with RPDs also indicated a WTP for SDA treatment (€650; IQR: 600-650) significantly higher than that of patients who had actually received functionally orientated treatment (€550; IQR: 500-600) (p<0.001). The results indicated that both current income levels and previous treatment allocation were significantly correlated with WTP in both the RPD and the SDA groups.
Patients in both treatment groups exhibited little WTP for dental implant treatment, with a recorded median value of half the market value of this treatment (€1000; IQR: 500-1000). Conclusions: Amongst this patient cohort, previous treatment experience had a strong influence on WTP, as did current income levels. Both treatment groups indicated a very strong WTP for simpler, functionally orientated care using adhesive fixed prostheses (SDA) over conventional RPDs. Clinical significance: Partially dentate older patients expressed a strong preference for functionally orientated tooth replacement as an alternative to conventional RPDs.
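The group comparison above rests on the Mann-Whitney U test. The sketch below computes the U statistic from scratch, with midranks for tied values; the WTP figures are purely illustrative and are not the study's data.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic (the smaller of U1 and U2),
    using midranks for tied observations."""
    combined = sorted((v, grp) for grp, vals in ((0, a), (1, b)) for v in vals)
    n = len(combined)
    ranks = [0.0] * n
    i = 0
    while i < n:                        # assign midranks to runs of ties
        j = i
        while j < n and combined[j][0] == combined[i][0]:
            j += 1
        mid = (i + 1 + j) / 2.0         # mean of 1-based ranks i+1 .. j
        for k in range(i, j):
            ranks[k] = mid
        i = j
    rank_sum_a = sum(ranks[k] for k in range(n) if combined[k][1] == 0)
    n1, n2 = len(a), len(b)
    u1 = rank_sum_a - n1 * (n1 + 1) / 2.0
    return min(u1, n1 * n2 - u1)

# Hypothetical WTP responses (EUR) for the two treatment groups
rpd = [550, 600, 500, 650, 550, 700]
sda = [500, 450, 500, 550, 450, 500]
u = mann_whitney_u(rpd, sda)   # a small U suggests the groups differ
```

In practice the statistic would be referred to exact tables or a normal approximation to obtain the p-values reported in the abstract.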
Abstract:
Virtual topology operations have been utilized to generate an analysis topology definition suitable for downstream mesh generation. Detailed descriptions are provided for virtual topology merge and split operations for all topological entities. Current virtual topology technology is extended to allow the virtual partitioning of volume cells, and the topological queries required to carry out each operation are provided. Virtual representations are robustly linked to the underlying geometric definition through an analysis topology. The analysis topology and all associated virtual and topological dependencies are automatically updated after each virtual operation, providing the link to the underlying CAD geometry. A valid description of the analysis topology, including relative orientations, is therefore maintained. This enables downstream operations, such as the merging or partitioning of virtual entities, and interrogations, such as determining whether a specific meshing strategy can be applied to the virtual volume cells, to be performed on the analysis topology description. As the virtual representation is a non-manifold description of the sub-divided domain, the interfaces between cells are recorded automatically. This enables the advantages of non-manifold modelling to be exploited within the manifold modelling environment of a major commercial CAD system, without any adaptation of the underlying CAD model. A hierarchical virtual structure is maintained when virtual entities are merged or partitioned. This has a major benefit over existing solutions, as the virtual dependencies are stored in an open and accessible manner, providing the analyst with the freedom to create, modify and edit the analysis topology in any preferred sequence, whilst the original CAD geometry is not disturbed.
Robust definitions of the topological and virtual dependencies enable the same virtual topology definitions to be accessed, interrogated and manipulated within multiple different CAD packages and linked to the underlying geometry.
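The hierarchical storage of virtual merges described above can be caricatured in a few lines. This is a hypothetical toy structure of our own devising; the thesis's CAD-linked, non-manifold implementation is far richer. The point it illustrates is that merged entities keep their originals as children, so the link back to the CAD geometry is never lost.

```python
class VirtualEntity:
    """A virtual face that may merge several underlying CAD faces.
    The originals are kept as children, making the merge reversible."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []   # underlying (CAD or virtual) entities

    def leaves(self):
        """CAD-level faces this virtual entity ultimately covers."""
        if not self.children:
            return [self.name]
        out = []
        for c in self.children:
            out.extend(c.leaves())
        return out

def merge(name, entities):
    """Merge entities into one virtual face; the originals stay accessible."""
    return VirtualEntity(name, list(entities))

f1, f2, f3 = (VirtualEntity(n) for n in ("F1", "F2", "F3"))
m1 = merge("VF1", [f1, f2])        # first merge of two CAD faces
m2 = merge("VF2", [m1, f3])        # hierarchical merge of a virtual face
```

Because every level of the hierarchy remains addressable, downstream queries can operate on the virtual faces while still reaching the underlying CAD entities, in the spirit of the open, accessible dependency storage the abstract describes.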