973 results for PROCESSING CHARACTERISTICS
Abstract:
The present investigation is based on a linguistic analysis of the 'Housing Act 1980' and attempts to examine the role of qualifications in the structuring of the legislative statement. The introductory chapter isolates legislative writing as a "sub-variety" of legal language and provides an overview of the controversies surrounding the way it is written and the problems it poses to its readers. Chapter two emphasizes the limitations of the available work on the description of language-varieties for the analysis of legislative writing and outlines the approach adopted for the present analysis. This chapter also gives some idea of the information-structuring of legislative provisions and establishes qualification as a key element in their textualisation. The next three chapters offer a detailed account of the ten major qualification-types identified in the corpus, concentrating on the surface forms they take, the features of legislative statements they textualise and the syntactic positions to which they are generally assigned in the statement of legislative provisions. The emerging hypotheses in these chapters have often been verified through specialist reaction from a Parliamentary Counsel largely responsible for the writing of the 'Housing Act 1980'. The findings suggest useful correlations between a number of qualificational initiators and the various aspects of the legislative statement. They also reveal that many of these qualifications typically occur in those clause-medial syntactic positions which are sparingly used in other specialist discourse, thus creating syntactic discontinuity in the legislative sentence. Such syntactic discontinuities, on the evidence from psycholinguistic experiments reported in chapter six, create special problems in the processing and comprehension of legislative statements. The final chapter converts the main linguistic findings into a series of pedagogical generalizations, offers indications of how these may be applied in EALP situations and concludes with other considerations of possible applications.
Abstract:
Two reactive comonomers, divinylbenzene (DVB) and trimethylolpropane triacrylate (TRIS), were evaluated for their role in effecting the melt free radical grafting reaction of the monomer glycidyl methacrylate (GMA) onto polypropylene (PP). The characteristics of the GMA-grafting systems in the presence and absence of DVB or TRIS were examined and compared in terms of the yield of the grafting reaction and the extent of the main side reactions, namely homopolymerisation of GMA (poly-GMA) and polymer degradation, using different chemical compositions of the reactive systems and different processing conditions. In the absence of the comonomers, i.e. in a conventional system, high initiator concentrations of peroxides were typically required to achieve the highest possible GMA grafting levels, which were found to be generally low. Concomitantly, both homopolymerisation of GMA and degradation of the polymer by chain scission take place to an increasing extent with increasing initiator amounts. On the other hand, the presence of a small amount of the comonomers, DVB or TRIS, in the GMA-grafting system was shown to bring about a significant increase in the grafting level, paralleled by a large reduction in poly-GMA formation and PP degradation. In the presence of these highly reactive comonomers, the optimum grafting system requires a much lower concentration of the peroxide initiator and, consequently, leads to the much lower degree of polymer degradation observed in these systems. The differing effects of DVB and TRIS on the rates of the GMA-grafting and homopolymerisation reactions, and on the extent of PP degradation (followed through melt flow changes), were compared and contrasted with a conventional GMA-grafting system.
Abstract:
The number of interoperable research infrastructures has increased significantly with the growing awareness of the efforts made by the Global Earth Observation System of Systems (GEOSS). One of the Societal Benefit Areas (SBA) that is benefiting most from GEOSS is biodiversity, given the costs of monitoring the environment and managing complex information, from space observations to species records including their genetic characteristics. But GEOSS goes beyond simple data sharing to encourage the publishing and combination of models, an approach which can ease the handling of complex multi-disciplinary questions. The purpose of this paper is to illustrate these concepts by presenting eHabitat, a basic Web Processing Service (WPS) for computing the likelihood of finding ecosystems with properties matching those specified by a user. When chained with other services providing data on climate change, eHabitat can be used for ecological forecasting and becomes a useful tool for decision-makers assessing different strategies when selecting new areas to protect. eHabitat can use virtually any kind of thematic data considered useful for defining ecosystems and their future persistence under different climatic or development scenarios. The paper presents the architecture and illustrates the concepts through case studies which forecast the impact of climate change on protected areas or on the ecological niche of an African bird.
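Although the abstract stays at the architecture level, the service chaining it describes maps onto standard OGC client practice. The sketch below uses OWSLib, a widely used Python client for OGC services, to show how a WPS such as eHabitat might be invoked; the endpoint URL, process identifier and input/output names are hypothetical placeholders, since the real ones would come from the service's GetCapabilities and DescribeProcess responses.

```python
# Minimal sketch of invoking a WPS like eHabitat from a client script.
# Endpoint, process identifier and parameter names are assumptions.
from owslib.wps import WebProcessingService, monitorExecution

# Connect to the service; GetCapabilities is fetched on construction.
wps = WebProcessingService('http://example.org/wps')  # hypothetical URL

# Launch an ecological-similarity run with a placeholder input.
execution = wps.execute(
    'ehabitat',                             # process identifier (assumed)
    inputs=[('ProtectedAreaID', '12345')],  # hypothetical parameter
)

# Poll the asynchronous job, then save the resulting similarity map.
monitorExecution(execution)
execution.getOutput('similarity_map.tif')
```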
Abstract:
Background - When a moving stimulus and a briefly flashed static stimulus are physically aligned in space, the static stimulus is perceived as lagging behind the moving stimulus. This widely replicated phenomenon is known as the Flash-Lag Effect (FLE). For the first time we employed biological motion as the moving stimulus, which is important for two reasons. Firstly, biological motion is processed by visual as well as somatosensory brain areas, which makes it a prime candidate for elucidating the interplay between the two systems with respect to the FLE. Secondly, discussions about the mechanisms of the FLE tend to resort to evolutionary arguments, while most studies employ highly artificial stimuli with constant velocities. Methodology/Principal Findings - Because biological motion is ecologically valid, it follows complex patterns with changing velocity. We therefore compared biological to symbolic motion with the same acceleration profile. Our results with 16 observers revealed a qualitatively different pattern for biological compared to symbolic motion, and this pattern was predicted by the characteristics of motor resonance: the amount of anticipatory processing of perceived actions, based on the induced perspective and agency, modulated the FLE. Conclusions/Significance - Our study provides the first evidence for an FLE with non-linear motion in general and with biological motion in particular. Our results suggest that predictive coding within the sensorimotor system alone cannot explain the FLE. Our findings are compatible with visual prediction (Nijhawan, 2008), which assumes that extrapolated motion representations within the visual system generate the FLE. These representations are modulated by sudden visual input (e.g. offset signals) or by input from other systems (e.g. sensorimotor) that can boost or attenuate overshooting representations in accordance with biased neural competition (Desimone & Duncan, 1995).
Abstract:
Sensory processing is a crucial underpinning of the development of social cognition, a function which is compromised to a variable degree in patients with pervasive developmental disorders (PDD). In this manuscript, we review some of the most recent and relevant contributions examining auditory sensory processing derangement in PDD. The variability in the clinical characteristics of the samples studied so far, in terms of severity of the associated cognitive deficits and the associated limited compliance, underlying aetiology and demographic features, makes a univocal interpretation arduous. We hypothesise that, in patients with severe mental deficits, the presence of impaired auditory sensory memory, as expressed by the mismatch negativity, could be a non-specific indicator of more diffuse cortical deficits rather than causally related to the clinical symptomatology. More consistent findings seem to emerge from studies of less severely impaired patients, in whom increased pitch perception has been interpreted as an indicator of increased local processing, probably as a compensatory mechanism for the lack of global processing (central coherence). This latter hypothesis seems extremely attractive, and future trials in larger cohorts of patients, possibly with standardised stimulus characteristics, would be a much-needed development. Finally, the specificity of the role of auditory derangement, as opposed to other sensory channels, needs to be assessed more systematically using multimodal stimuli in the same patient group. (c) 2006 Elsevier B.V. All rights reserved.
Abstract:
Novel surface plasmonic optical fiber sensors have been fabricated using multiple coatings deposited on a lapped section of a single mode fiber. UV laser irradiation through a phase mask produces a nano-scaled surface relief grating structure resembling nano-wires. The resulting individual corrugations, produced by material compaction, are approximately 20 μm long with an average width at half maximum of 100 nm and generate localized surface plasmons. Experimental data are presented that show changes in the spectral characteristics after UV processing, coupled with an overall increase in the sensitivity of the devices to the surrounding refractive index. Evidence is presented that there is an optimum UV dosage (48 joules) beyond which no significant additional optical change is observed. The devices are characterized with regard to changes in refractive index, where very high spectral sensitivities are found in the aqueous index regime, ranging up to 4000 nm/RIU in wavelength and 800 dB/RIU in intensity. © 2013 Optical Society of America.
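To put the reported sensitivities in perspective, the short calculation below converts them into detectable index changes. The interrogation resolutions assumed here (10 pm spectrally, 0.01 dB in power) are typical instrument figures, not values from the paper.

```python
# Back-of-envelope detection limits implied by the reported sensitivities.
wavelength_sensitivity = 4000.0  # nm per RIU (reported)
intensity_sensitivity = 800.0    # dB per RIU (reported)

spectral_resolution = 0.010      # nm, ~10 pm (assumed)
power_resolution = 0.01          # dB (assumed)

min_dn_wavelength = spectral_resolution / wavelength_sensitivity
min_dn_intensity = power_resolution / intensity_sensitivity
print(f"wavelength interrogation: ~{min_dn_wavelength:.1e} RIU")  # ~2.5e-06
print(f"intensity interrogation:  ~{min_dn_intensity:.1e} RIU")   # ~1.3e-05
```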
Abstract:
The influence of comonomer content in a series of metallocene-based ethylene-1-octene copolymers (m-LLDPE) on thermo-mechanical, rheological, and thermo-oxidative behaviour during melt processing was examined using a range of characterisation techniques. The amount of branching was calculated from ¹³C NMR, and differential scanning calorimetry (DSC) and dynamic mechanical analysis (DMA) were employed to determine the effect of short chain branching (SCB, comonomer content) on the thermal and mechanical characteristics of the polymer. The effect of melt processing at different temperatures on the thermo-oxidative behaviour of the polymers was investigated by examining the changes in rheological properties, using both melt flow and capillary rheometry, and the evolution of oxidation products during processing using infrared spectroscopy. The results show that comonomer content and catalyst type greatly affect the thermal, mechanical and oxidative behaviour of the polymers. For the metallocene polymer series, both DSC and DMA showed that (i) crystallinity and melting temperatures decreased linearly with comonomer content, (ii) the intensity of the β-transition increased, and (iii) the position of the tan δ maximum corresponding to the α-transition shifted to lower temperatures with higher comonomer content. In contrast, a corresponding Ziegler polymer containing the same level of SCB as one of the m-LLDPE polymers showed different characteristics due to its more heterogeneous nature: higher elongational viscosity, and a broader double melting peak occurring at higher temperature (from the DSC endotherm), indicating a much broader short chain branch distribution. The thermo-oxidative behaviour of the polymers after melt processing was similarly influenced by the comonomer content. Rheological characteristics and changes in the concentrations of carbonyl and the different unsaturated groups, particularly vinyl, vinylidene and trans-vinylene, during processing of the m-LLDPE polymers showed that polymers with lower levels of SCB gave rise to predominantly crosslinking reactions at all processing temperatures. By contrast, chain scission reactions became more favoured at higher processing temperatures in the higher comonomer-containing polymers. Compared to its metallocene analogue, the Ziegler polymer showed a much higher degree of crosslinking at all temperatures because of the high level of vinyl unsaturation initially present.
Abstract:
Glycidyl methacrylate (GMA) was grafted onto an ethylene-propylene copolymer during melt processing with peroxide initiation, in the presence and absence of a more reactive comonomer (coagent), trimethylolpropane triacrylate (Tris). The characteristics of the grafting systems were examined in terms of the grafting reaction yield and the nature and extent of the competing side reactions. The homopolymers of GMA (poly-GMA) and Tris (poly-Tris) and the GMA-Tris copolymer (GMA-co-Tris) were synthesized and characterized. In the absence of the coagent, high levels of poly-GMA, which constituted the major competing reaction, were formed, giving rise to low GMA grafting levels. Further, this grafting system resulted in a high extent of gel formation and polymer crosslinking, due to the high levels of peroxide needed to achieve optimum GMA grafting, and in a consequent large drop in the melt index (increased viscosity) of the polymer. In the presence of the coagent, however, the grafting system required a much lower peroxide concentration, by almost an order of magnitude, to achieve the optimum grafting yield. The coagent-containing GMA-grafting system also brought about a drastic reduction in the extent of all competing reactions, in particular GMA homopolymerisation, leading to improved GMA grafting efficiency with no detectable gel or crosslinking. Mechanisms for the grafting reactions, in the presence and absence of Tris, are proposed.
Abstract:
This thesis is a study of performance management of Complex Event Processing (CEP) systems. CEP systems have characteristics distinct from other well-studied computer systems, such as batch and online transaction processing systems and database-centric applications, and these characteristics introduce new challenges and opportunities for the performance management of CEP systems. The methodologies used to benchmark CEP systems in many performance studies focus on scaling the injected load but do not consider the impact of the functional capabilities of CEP systems. This thesis proposes an approach that evaluates the performance of CEP engines' functional behaviours on events, and develops a benchmark platform for CEP systems: CEPBen. The CEPBen benchmark platform explores the fundamental functional performance of event processing systems: filtering, transformation and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and for evaluating their performance. Studies of factors and new metrics are carried out with the CEPBen platform on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of a targeted event is proposed as a quality-of-service metric to be used in combination with the traditional response time of CEP systems. Maximum query load is proposed as a capacity indicator with respect to query complexity, and the number of live objects in memory as a performance indicator with respect to memory management. Query depth is studied as a factor that influences CEP system performance.
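The three functional behaviours CEPBen targets can be pictured with a short, engine-agnostic sketch. The Python below is purely illustrative and is not Esper's API (Esper expresses such steps declaratively in EPL statements); the event fields and the rising-price pattern are invented for the example.

```python
# Illustrative sketch of the three behaviours CEPBen benchmarks:
# filtering, transformation and event pattern detection.
from dataclasses import dataclass
from typing import Iterable, Iterator

@dataclass
class Event:
    symbol: str
    price: float
    ts: float

def filter_events(stream: Iterable[Event], symbol: str) -> Iterator[Event]:
    """Filtering: keep only events matching a predicate."""
    return (e for e in stream if e.symbol == symbol)

def transform(stream: Iterable[Event]) -> Iterator[tuple[str, float]]:
    """Transformation: derive new fields from each incoming event."""
    return ((e.symbol, round(e.price * 1.1, 2)) for e in stream)

def detect_rise(stream: Iterable[Event], n: int = 3) -> Iterator[Event]:
    """Pattern detection: emit the event completing n strictly rising prices."""
    window: list[Event] = []
    for e in stream:
        window.append(e)
        window = window[-n:]  # keep a sliding window of the last n events
        if len(window) == n and all(
            a.price < b.price for a, b in zip(window, window[1:])
        ):
            yield e
```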
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infra-red radiations, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the materials. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combination of different inscription parameters were tested, using three fs laser systems, with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is a computer intensive process. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, thus data processing is a significant bottleneck. An alternative choice is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, recently graphics processing unit (GPU) based data processing methods have been developed to minimize this data processing and rendering time. These processing techniques include standard-processing methods which includes a set of algorithms to process the raw data (interference) obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post processing techniques for OCT systems. The GPU based processing developed, during the PhD, was later implemented into a custom built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time. Processing throughput of this system is currently limited by the camera capture rate. OCTphantoms have been heavily used for the qualitative characterization and adjustment/ fine tuning of the operating conditions of OCT system. Currently, investigations are under way to characterize OCT systems using our phantoms. The work presented in this thesis demonstrate several novel techniques of fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding leads to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs inscribed structures will be useful in other photonic application such as making of phase mask, wave guides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
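For concreteness, a minimal sketch of the "standard processing" chain mentioned above, turning raw spectral fringes into A-scans, is given below. The array shapes and windowing choices are illustrative assumptions, and system-specific steps such as k-linearisation are omitted. Swapping numpy for a GPU array library such as cupy is the essence of the acceleration described.

```python
# Minimal FD-OCT standard-processing sketch: fringes -> A-scans.
import numpy as np

def ascans_from_fringes(fringes: np.ndarray) -> np.ndarray:
    """fringes: (n_ascans, n_pixels) raw spectrometer frames."""
    # 1. Remove the fixed-pattern background (reference-arm spectrum).
    dc = fringes.mean(axis=0, keepdims=True)
    sig = fringes - dc
    # 2. Apodise each spectrum to suppress FFT side lobes.
    sig = sig * np.hanning(sig.shape[1])
    # 3. Fourier transform: the depth profile is the magnitude spectrum.
    depth = np.abs(np.fft.rfft(sig, axis=1))
    # 4. Log compression for display.
    return 20.0 * np.log10(depth + 1e-12)

# Example: 512 A-scans from 2048-pixel fringes (random stand-in data).
frames = np.random.rand(512, 2048)
print(ascans_from_fringes(frames).shape)  # (512, 1025)
```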
Abstract:
* The following text was originally published in the Proceedings of the Language Resources and Evaluation Conference held in Lisbon, Portugal, 2004, under the title "Towards Intelligent Written Cultural Heritage Processing - Lexical processing". I present here a revised version of the aforementioned paper and add the latest efforts made at the Center for Computational Linguistics in Prague in the field under discussion.
Abstract:
Melt processing is a critical step in the manufacture of polymer articles and is even more critical when dealing with inhomogeneous polymer-clay nanocomposite systems. The chemical composition, and in particular the clay type and its organic modification, also makes a major contribution to determining the final properties, in particular the thermal and long-term oxidative stability of the resulting polymer nanocomposites. Proper selection and tuning of the process variables should, in principle, lead to improved characteristics of the fabricated product. With multiphase systems containing inorganic nanoclays, however, this is not straightforward, and it is often the case that the process conditions are chosen initially to improve one or more desired properties at the expense of others. This study assesses the influence of organo-modified clays and of the processing parameters (extrusion temperature and screw speed) on the rheological and morphological characteristics of polymer nanocomposites, as well as on their melt and thermo-oxidative stability. Nanocomposites (PPNCs) based on PP, maleated PP and organically modified clays were prepared in different co-rotating twin-screw extruders ranging from laboratory to semi-industrial scale. Results show that the amount of surfactant present in similar organo-modified clays affects the thermo-oxidative stability of the extruded PPNCs differently, and that changes in processing conditions affect the clay morphology too. By choosing an appropriately tuned set of process variables for the extrusion process, it would be feasible to selectively fabricate polymer-clay nanocomposites with the desired mechanical and thermo-oxidative characteristics. © 2013 Elsevier Ltd. All rights reserved.
Abstract:
Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated in parallel lines. An example of this manufacturing system configuration is observed at a manufacturing facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are: multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial lines and various configurations of parallel processing with multiple product classes, along with job circulation due to random part failures. In addition, appropriate correction terms obtained via regression analysis were added to the approximations in order to minimize the error between the analytical approximations and the simulation models. Markovian and general-type manufacturing systems, with multiple product classes, job circulation due to failures, and fork-join structures to model parallel processing, were studied. In both the Markovian and the general case, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to deal with more than two products. Numerical comparisons showed that the approximations perform remarkably well when the correction factors are used: the average flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case. All the equations in the analytical formulations were implemented as a set of Matlab scripts. Using this set, operations managers of web server assembly lines, manufacturing or other service systems with similar characteristics can estimate different system performance measures and make judicious decisions, especially in setting delivery due dates, capacity planning, and bottleneck mitigation, among others.
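The abstract gives the approximation structure only in outline, so the sketch below is a deliberately simplified illustration of the idea: an analytical queueing estimate (here a textbook M/M/1 flow time, not the fork-join formulations of the study) adjusted by a regression-style correction term. The coefficients are placeholders, not the fitted values from the research.

```python
# Illustrative "analytical approximation + regression correction" pattern.
def mm1_flow_time(arrival_rate: float, service_rate: float) -> float:
    """Mean flow time (wait + service) of an M/M/1 station."""
    assert arrival_rate < service_rate, "station must be stable"
    return 1.0 / (service_rate - arrival_rate)

def corrected_flow_time(arrival_rate: float, service_rate: float,
                        n_products: int,
                        b0: float = 0.0, b1: float = 0.0) -> float:
    """Analytical estimate scaled by a fitted correction term."""
    rho = arrival_rate / service_rate  # net traffic intensity
    base = mm1_flow_time(arrival_rate, service_rate)
    # Hypothetical linear correction in rho and product count, echoing
    # the study's use of regression to shrink the approximation error.
    return base * (1.0 + b0 * rho + b1 * (n_products - 1))

print(corrected_flow_time(0.8, 1.0, n_products=3, b0=0.05, b1=0.02))
```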
Abstract:
While most studies take a dyadic view when examining the environmental difference between the home country of a multinational enterprise (MNE) and a particular foreign country, they ignore that an MNE is managing a network of subsidiaries embedded in diverse environments. Additionally, neither the impacts of global environments on top executives nor the effects of top executives' capabilities to handle institutional complexity are fully explored. Thus, using a three-essay format, this dissertation tried to fill these gaps by addressing the effects of institutional complexity and top management characteristics on top executive compensation and firm performance. Essay 1 investigated the impact of an MNE's institutional complexity, or the diversity of national institutions facing an MNE's network of subsidiaries, on top management team (TMT) compensation. This essay proposed that greater political and cultural complexity leads not only to greater TMT total compensation but also to a greater portion of TMT compensation linked with long-term performance. The arguments are supported in this essay by using an unbalanced panel dataset including 296 U.S. firms with 1,340 observations. Essay 2 explored TMT social capital and its moderating role in value creation and appropriation by the chief executive officer (CEO). Using a sample of 548 U.S. firms and 2,010 observations, it found that greater TMT social capital does facilitate the effects of CEO intellectual capital and social capital on firm growth. Finally, essay 3 examined the performance implications of the fit between managerial information-processing capabilities and institutional complexity. It proposed that institutional complexity is associated with information-processing needs, while smaller TMT turnover and larger TMT size reflect larger managerial information-processing capabilities. Consequently, superior performance is achieved by the match among institutional complexity, TMT turnover, and TMT size. All hypotheses in essay 3 are supported in a sample of 301 U.S. firms and 1,404 observations. To conclude, this dissertation advances and extends our knowledge of the roles of institutional environments and top executives in firm performance and top executive compensation.
Abstract:
Intraoperative neurophysiologic monitoring (IONM) is an integral part of spinal surgeries and involves the recording of somatosensory evoked potentials (SSEP). However, clinical application of IONM still requires anywhere between 200 and 2000 trials to obtain an SSEP signal, which is excessive and introduces a significant delay during surgery in detecting possible neurological damage. The aim of this study was to develop a means of obtaining the SSEP from far fewer recordings: just twelve trials. The preliminary step was to distinguish the SSEP from the ongoing brain activity. We first establish that the background brain activity is indeed quasi-stationary, whereas an SSEP is expected to be identical every time a trial is recorded. An algorithm was developed using Chebyshev time windowing to precondition the SSEP trials and retain the morphological characteristics of the evoked response. This preconditioning was followed by the application of a principal component analysis (PCA) based algorithm, exploiting the quasi-stationarity of the EEG, to the 12 preconditioned trials. A Walsh transform operation was then used to identify the position of the SSEP event. An alarm is raised when there is a 10% deviation in latency and/or a 50% deviation in peak-to-peak amplitude, as per the clinical requirements. The algorithm gives consistent results when monitoring SSEP in surgical procedures of up to 6 hours, even with this significantly reduced number of trials. The analysis was performed on data recorded from 29 patients undergoing surgery, during which the posterior tibial nerve was stimulated and the SSEP response was recorded from the scalp. The method is shown empirically to be more clinically viable than present-day approaches: in all 29 cases, the algorithm took 4 s to extract an SSEP signal, compared with the several minutes taken by conventional methods. The monitoring process using the algorithm was successful and proved conclusive under the clinical constraints throughout the different surgical procedures, with an accuracy of 91.5%. The higher accuracy and faster execution time observed in the present study in determining the SSEP signals provide a much improved and more effective neurophysiological monitoring process.
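A simplified sketch of the described pipeline is given below: Chebyshev windowing of the 12 trials, a PCA step that exploits the quasi-stationarity of the background EEG, and the stated alarm criteria. The 10% and 50% thresholds come from the abstract; the array shapes, the rank-1 PCA estimate, the attenuation setting and the omission of the Walsh-transform localisation step are illustrative assumptions.

```python
# Simplified sketch of the 12-trial SSEP extraction pipeline.
import numpy as np
from scipy.signal.windows import chebwin

def extract_ssep(trials: np.ndarray, attenuation_db: float = 100.0) -> np.ndarray:
    """trials: (12, n_samples) single-trial recordings."""
    # Precondition each trial with a Chebyshev window.
    win = chebwin(trials.shape[1], at=attenuation_db)
    windowed = trials * win
    # PCA via SVD: the leading component captures the waveform common to
    # all 12 trials (the SSEP), while quasi-stationary background EEG
    # spreads across the minor components.
    u, s, vt = np.linalg.svd(windowed, full_matrices=False)
    return s[0] * np.mean(u[:, 0]) * vt[0]  # rank-1, trial-averaged estimate

def alarm(latency_ms: float, base_latency_ms: float,
          p2p_uv: float, base_p2p_uv: float) -> bool:
    """Clinical criteria: 10% latency or 50% peak-to-peak amplitude change."""
    return (abs(latency_ms - base_latency_ms) > 0.10 * base_latency_ms
            or abs(p2p_uv - base_p2p_uv) > 0.50 * base_p2p_uv)
```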