945 results for Lead Analysis Data processing


Relevance:

100.00%

Publisher:

Abstract:

All-optical data processing is expected to play a major role in future optical communications. Nonlinear effects in optical fibers have attractive applications in optical signal processing. In this paper, we review our recent advances in developing high-speed all-optical processing techniques based on optical fiber nonlinearities.

Relevance:

100.00%

Publisher:

Abstract:

All-optical data processing is expected to play a major role in future optical communications. Nonlinear effects in optical fibres have many attractive features and great, not yet fully explored, potential in optical signal processing. Here, we overview our recent advances in developing novel techniques and approaches to all-optical processing based on optical fibre nonlinearities.

Relevance:

100.00%

Publisher:

Abstract:

All-optical technologies for data processing and signal manipulation are expected to play a major role in future optical communications. Nonlinear phenomena occurring in optical fibre have many attractive features and great, but not yet fully exploited, potential in optical signal processing. Here, we overview our recent results and advances in developing novel photonic techniques and approaches to all-optical processing based on fibre nonlinearities. Amongst other topics, we will discuss phase-preserving optical 2R regeneration, the possibility of using parabolic/flat-top pulses for optical signal processing and regeneration, and nonlinear optical pulse shaping. A method for passive nonlinear pulse shaping based on pulse pre-chirping and propagation in a normally dispersive fibre will be presented. The approach provides a simple way of generating various temporal waveforms of fundamental and practical interest. Particular emphasis will be given to the formation and characterization of pulses with a triangular intensity profile. A new technique for doubling/copying optical pulses in both the frequency and time domains using triangular-shaped pulses will also be introduced.
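
As a rough illustration of the pre-chirp-plus-normal-dispersion idea described above, the following Python sketch propagates a chirped Gaussian pulse through a normally dispersive, Kerr-nonlinear fibre with a standard split-step Fourier scheme. All parameter values (pulse width, chirp, beta2, gamma, fibre length) are assumptions chosen for illustration and are not taken from the paper.

# Minimal split-step Fourier sketch of passive nonlinear pulse shaping:
# a pre-chirped Gaussian pulse propagating in a normally dispersive,
# Kerr-nonlinear fibre. Parameter values are illustrative only.
import numpy as np

# Time grid (ps) and chirped Gaussian input field
N, T_window = 4096, 100.0
t = np.linspace(-T_window / 2, T_window / 2, N, endpoint=False)
dt = t[1] - t[0]
T0, C = 5.0, -10.0                                   # pulse width (ps), pre-chirp (assumed)
A = np.exp(-(1 + 1j * C) * t**2 / (2 * T0**2))       # peak power normalised to 1 W

# Fibre parameters (assumed): normal dispersion beta2 > 0, Kerr coefficient gamma
beta2, gamma = 20e-3, 2e-3                           # ps^2/m, 1/(W*m)
L, nz = 2000.0, 2000                                 # fibre length (m), number of steps
dz = L / nz

omega = 2 * np.pi * np.fft.fftfreq(N, d=dt)          # angular frequency grid (rad/ps)
half_disp = np.exp(0.5j * beta2 * omega**2 * dz / 2) # half-step dispersion operator

for _ in range(nz):
    # symmetric split-step: half dispersion, full nonlinearity, half dispersion
    A = np.fft.ifft(half_disp * np.fft.fft(A))
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)
    A = np.fft.ifft(half_disp * np.fft.fft(A))

# |A|**2 now holds the reshaped temporal intensity profile; with suitable
# pre-chirp and propagation length it evolves towards flat-top or
# triangular-like waveforms of the kind discussed in the abstract.
print("output peak power (a.u.):", np.abs(A).max()**2)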

Relevance:

100.00%

Publisher:

Abstract:

GraphChi is the first reported disk-based graph engine that can handle billion-scale graphs on a single PC efficiently. GraphChi is able to execute several advanced data mining, graph mining and machine learning algorithms on very large graphs. With its novel parallel sliding windows (PSW) technique for loading subgraphs from disk into memory to update vertices and edges, it can achieve data processing performance close to, and even better than, that of mainstream distributed graph engines. The GraphChi authors note that memory is not effectively utilized with large datasets, which leads to suboptimal computation performance. In this paper, motivated by the concepts of 'pin' from TurboGraph and 'ghost' from GraphLab, we propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve the performance of GraphChi algorithms. The main idea is to pin a fixed part of the data in memory during the whole computing process. Part-in-memory mode is implemented with only about 40 additional lines of code added to the original GraphChi engine. Extensive experiments are performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory memory management approach reduces GraphChi's running time by up to 60% for the PageRank algorithm. Interestingly, it is found that pinning a larger portion of the data in memory does not always lead to better performance when the whole dataset cannot fit in memory; there exists an optimal portion of data that should be kept in memory to achieve the best computational performance.
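
The pinning idea can be pictured with a small Python sketch. This is not the GraphChi code (the real engine is written in C++); the shard file format, file layout and helper names below are hypothetical.

# Toy illustration of the Part-in-memory idea: a fixed fraction of the edge
# shards stays pinned in memory for the whole run, while the remaining shards
# are streamed from disk on every iteration.
import os
import pickle

class PartInMemoryShards:
    def __init__(self, shard_dir, pinned_fraction=0.3):
        self.shard_files = sorted(
            os.path.join(shard_dir, f)
            for f in os.listdir(shard_dir) if f.endswith(".shard")
        )
        n_pinned = int(len(self.shard_files) * pinned_fraction)
        # Load the first n_pinned shards once; they never touch disk again.
        self.pinned = {f: self._load(f) for f in self.shard_files[:n_pinned]}

    @staticmethod
    def _load(path):
        with open(path, "rb") as fh:
            return pickle.load(fh)   # assumed format: list of (src, dst, value) edges

    def shards(self):
        """Yield every shard for one iteration; pinned shards skip the disk read."""
        for f in self.shard_files:
            yield self.pinned[f] if f in self.pinned else self._load(f)

# Hypothetical usage in the spirit of the vertex-centric model:
# store = PartInMemoryShards("/data/twitter_shards", pinned_fraction=0.4)
# for _ in range(num_iterations):
#     for edges in store.shards():
#         update_vertices(edges)   # user-defined update over the loaded subgraph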

Relevance:

100.00%

Publisher:

Abstract:

* The research was supported by INTAS 00-397 and 00-626 Projects.

Relevance:

100.00%

Publisher:

Abstract:

The development of new all-optical technologies for data processing and signal manipulation is a field of growing importance with a strong potential for numerous applications in diverse areas of modern science. Nonlinear phenomena occurring in optical fibres have many attractive features and great, but not yet fully explored, potential in signal processing. Here, we review recent progress on the use of fibre nonlinearities for the generation and shaping of optical pulses and on the applications of advanced pulse shapes in all-optical signal processing. Amongst other topics, we will discuss ultrahigh repetition rate pulse sources, the generation of parabolic shaped pulses in active and passive fibres, the generation of pulses with triangular temporal profiles, and coherent supercontinuum sources. The signal processing applications will span optical regeneration, linear distortion compensation, optical decision at the receiver in optical communication systems, spectral and temporal signal doubling, and frequency conversion. © Copyright 2012 Sonia Boscolo and Christophe Finot.

Relevance:

100.00%

Publisher:

Abstract:

In the nonparametric framework of Data Envelopment Analysis (DEA), the statistical properties of the estimators have been investigated, but only asymptotic results are available. For DEA estimators, results of practical use have been proved only for the case of one input and one output. However, in real-world problems the production process is usually described by many variables. In this paper a machine learning approach to variable aggregation based on Canonical Correlation Analysis is presented. The approach is applied to efficiency estimation for all the farms on Terceira Island in the Azorean archipelago.
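
One possible reading of the aggregation step, sketched in Python: compress the input and output sets into a single canonical variate each with CCA, then score units with a standard one-input/one-output, input-oriented DEA (CCR) linear program. The data are synthetic and the formulation is a generic textbook one, not necessarily the authors' exact model.

# CCA-based variable aggregation followed by a simple DEA efficiency score.
import numpy as np
from sklearn.cross_decomposition import CCA
from scipy.optimize import linprog

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(50, 4))                 # inputs of 50 hypothetical farms
Y = 0.5 * X[:, :2] + rng.uniform(0, 2, (50, 2))      # outputs correlated with inputs

# Step 1: aggregate inputs and outputs into the first pair of canonical variates.
cca = CCA(n_components=1)
x_agg, y_agg = cca.fit_transform(X, Y)
x_agg = x_agg.ravel() - x_agg.min() + 1.0            # shift to positive values for DEA
y_agg = y_agg.ravel() - y_agg.min() + 1.0

# Step 2: input-oriented CCR efficiency of unit k via linear programming:
#   min theta  s.t.  sum_j lambda_j*x_j <= theta*x_k,
#                    sum_j lambda_j*y_j >= y_k,  lambda_j >= 0
def dea_efficiency(k, x, y):
    n = len(x)
    c = np.r_[1.0, np.zeros(n)]                      # minimise theta
    A_ub = np.vstack([np.r_[-x[k], x],               # input constraint
                      np.r_[0.0, -y]])               # output constraint
    b_ub = np.array([0.0, -y[k]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

scores = [dea_efficiency(k, x_agg, y_agg) for k in range(len(x_agg))]
print("mean efficiency:", np.mean(scores))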

Relevance:

100.00%

Publisher:

Abstract:

This paper presents an algorithmic solution for the management of related text objects, integrating algorithms for their extraction from paper or electronic formats and for their storage and processing in a relational database. The developed algorithms for data extraction and data analysis make it possible to find specific features of, and relations between, the text objects in the database. The algorithmic solution is applied to data from the field of phytopharmacy in Bulgaria. It can be used as a tool and methodology for other subject areas where there are complex relationships between text objects.
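
A minimal sketch of how related text objects and the relations between them might be kept in a relational database and queried; the schema and sample rows are hypothetical, not taken from the paper.

# Hypothetical relational schema for text objects and their relations.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE text_object (
    id     INTEGER PRIMARY KEY,
    source TEXT,           -- e.g. 'paper' or 'electronic'
    body   TEXT NOT NULL
);
CREATE TABLE relation (
    from_id INTEGER REFERENCES text_object(id),
    to_id   INTEGER REFERENCES text_object(id),
    kind    TEXT            -- e.g. 'refers_to'
);
""")
conn.execute("INSERT INTO text_object VALUES (1, 'electronic', 'Product label A')")
conn.execute("INSERT INTO text_object VALUES (2, 'paper', 'Usage restriction for A')")
conn.execute("INSERT INTO relation VALUES (1, 2, 'refers_to')")

# Find all objects related to object 1.
for row in conn.execute("""
    SELECT t.id, t.body, r.kind
    FROM relation r JOIN text_object t ON t.id = r.to_id
    WHERE r.from_id = 1"""):
    print(row)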

Relevance:

100.00%

Publisher:

Abstract:

This study analyses the current role of police-suspect interview discourse in the England & Wales criminal justice system, with a focus on its use as evidence. A central premise is that the interview should be viewed not as an isolated and self-contained discursive event, but as one link in a chain of events which together constitute the criminal justice process. It examines: (1) the format changes undergone by interview data after the interview has taken place, and (2) how the other links in the chain – both before and after the interview – affect the interview-room interaction itself. It thus examines the police interview as a multi-format, multi-purpose and multi-audience mode of discourse. An interdisciplinary and multi-method discourse-analytic approach is taken, combining elements of conversation analysis, pragmatics, sociolinguistics and critical discourse analysis. Data from a new corpus of recent police-suspect interviews, collected for this study, are used to illustrate previously unaddressed problems with the current process, mainly in the form of two detailed case studies. Additional data are taken from the case of Dr. Harold Shipman. The analysis reveals several causes for concern, both in aspects of the interaction in the interview room, and in the subsequent treatment of interview material as evidence, especially in the light of s.34 of the Criminal Justice and Public Order Act 1994. The implications of the findings for criminal justice are considered, along with some practical recommendations for improvements. Overall, this study demonstrates the need for increased awareness within the criminal justice system of the many linguistic factors affecting interview evidence.

Relevance:

100.00%

Publisher:

Abstract:

Field material testing provides firsthand information on pavement conditions, which is most helpful in evaluating performance and identifying preventive maintenance or overlay strategies. The high variability of field asphalt concrete due to construction raises the demand for accuracy of the test. Accordingly, the objective of this study is to propose a reliable and repeatable methodology to evaluate the fracture properties of field-aged asphalt concrete using the overlay test (OT). The OT is selected because of its efficiency and feasibility for asphalt field cores with diverse dimensions. The fracture properties refer to the Paris' law parameters based on the pseudo J-integral (A and n), because of the sound physical significance of the pseudo J-integral with respect to characterizing the cracking process. In order to determine A and n, a two-step OT protocol is designed to characterize the undamaged and damaged behaviors of asphalt field cores. To ensure the accuracy of the determined undamaged and fracture properties, a new analysis method is then developed for data processing, which combines finite element simulations with a mechanical analysis of viscoelastic force equilibrium and the evolution of pseudo displacement work in the OT specimen. Finally, theoretical equations are derived to calculate A and n directly from the OT test data, and the accuracy of the determined fracture properties is verified. The proposed methodology is applied to a total of 27 asphalt field cores obtained from a field project in Texas, including the control hot mix asphalt (HMA) and two types of warm mix asphalt (WMA). The results demonstrate a high linear correlation between n and −log A for all the tested field cores. Investigations of the effect of field aging on the fracture properties confirm that n is a good indicator for quantifying the cracking resistance of asphalt concrete, and that summer climatic conditions clearly accelerate the rate of aging. The impact of the WMA technologies on the fracture properties of asphalt concrete is visualized by comparing the n-values: the Evotherm WMA technology slightly improves the cracking resistance, while the foaming WMA technology provides fracture properties comparable to those of the HMA. After 15 months of aging in the field, the cracking resistance does not differ significantly between the HMA and the WMAs, which is confirmed by observations of field distresses.
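
For orientation only, the sketch below fits A and n by log-log regression to synthetic crack-growth data, assuming the generic pseudo J-integral form of Paris' law, dc/dN = A*(J_R)^n. The numbers are made up, and the paper itself derives A and n analytically from the OT data rather than by direct regression.

# Fit Paris'-law parameters from synthetic crack-growth data:
#   log10(dc/dN) = log10(A) + n * log10(J_R)
import numpy as np

J_R = np.array([0.5, 1.0, 2.0, 4.0, 8.0])             # pseudo J-integral (synthetic)
dc_dN = np.array([2e-5, 8e-5, 3.2e-4, 1.3e-3, 5e-3])  # crack growth per cycle (synthetic)

n, logA = np.polyfit(np.log10(J_R), np.log10(dc_dN), 1)  # slope = n, intercept = log10(A)
A = 10 ** logA
print(f"n = {n:.2f}, A = {A:.2e}, -log A = {-logA:.2f}")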

Relevance:

100.00%

Publisher:

Abstract:

A poly(L-lactide-co-caprolactone) copolymer, P(LL-co-CL), of composition 75:25 mol% was synthesized via the bulk ring-opening copolymerization of L-lactide and ε-caprolactone using a novel bis[tin(II) monooctoate] diethylene glycol coordination-insertion initiator, OctSn-OCH2CH2OCH2CH2O-SnOct. The P(LL-co-CL) copolymer obtained was characterized by a combination of analytical techniques, namely nuclear magnetic resonance spectroscopy, gel permeation chromatography, dilute-solution viscometry, differential scanning calorimetry, and thermogravimetric analysis. For processing into a monofilament fiber, the copolymer was melt spun with minimal draw to give a largely amorphous and unoriented as-spun fiber. The fiber's oriented semicrystalline morphology, necessary to give the required balance of mechanical properties, was then developed via a sequence of controlled offline hot-drawing and annealing steps. Depending on the final draw ratio, the fibers obtained had tensile strengths in the region of 200–400 MPa.

Relevance:

100.00%

Publisher:

Abstract:

This research presents several components addressing the objective of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research studies the patterns of geographical raster data processing and proposes algorithms to improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases to achieve high data availability and Quality of Service (QoS) under distributed data delivery and processing. To this end, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. The approach reuses digital images on demand and generates mosaicked tiles only for the required region, according to user requirements such as resolution, temporal range, and target bands, to reduce storage redundancy and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as enhancements to end-user GIS data delivery, automation, and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources on the Internet are idle or not fully utilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources for use in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database. The approach developed in this dissertation has resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to provide Web access to remotely sensed imagery and GIS vector information.
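
The on-demand mosaicking idea can be caricatured in a few lines of Python: a tile is built only when a (region, resolution, time range, bands) request arrives, and identical requests reuse the cached result. The function names and stub bodies are hypothetical, not TerraFly internals.

# On-demand tile generation keyed by the request parameters named in the abstract.
from functools import lru_cache

def select_source_images(region, t_start, t_end, bands):
    # Stub for a catalog query: return identifiers of rasters overlapping the
    # region and time range (a real system would hit a spatial/temporal index).
    return [f"scene_{i}" for i in range(3)]

def mosaic(sources, region, resolution_m):
    # Stub for resampling and blending the source rasters into one tile.
    return {"region": region, "resolution_m": resolution_m, "sources": tuple(sources)}

@lru_cache(maxsize=1024)
def get_tile(region, resolution_m, t_start, t_end, bands):
    """Build a mosaicked tile on demand; repeated identical requests hit the cache."""
    sources = select_source_images(region, t_start, t_end, bands)
    return mosaic(sources, region, resolution_m)

# Example request (hypothetical bounding box and parameters):
tile = get_tile((25.7, -80.3, 25.8, -80.2), 1.0, "2004-01", "2004-06", ("R", "G", "B"))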

Relevance:

100.00%

Publisher:

Abstract:

Childhood lead poisoning is a major consequence of contamination for local populations, linking environmental health and environmental engineering. Environmental contamination is one of the pressing environmental concerns facing the world today, yet current approaches often focus on large, industrial-scale contaminated sites designated by regulatory agencies for remediation. Prior to this study, there were no known published studies conducted at the local and smaller scale, such as neighborhoods, where much of the contamination requiring remediation is often present. An environmental health study of local lead-poisoning data showed that Liberty City, Little Haiti and eastern Little Havana in Miami-Dade County, Florida accounted for a disproportionately high number of the county's reported childhood lead poisoning cases. An engineering system was designed and developed for a comprehensive risk management methodology that is distinctively applicable to the geographical and environmental conditions of Miami-Dade County, Florida. Furthermore, a scientific approach for interpreting environmental health concerns, involving detailed environmental engineering control measures and methods for site remediation in contained media, was developed for implementation. Test samples were obtained from residents and sites in those specific communities in Miami-Dade County, Florida (Gasana and Chamorro 2002). Currently, lead does not have an Oral Assessment, an Inhalation Assessment, or an Oral Slope Factor, variables that are required to run a quantitative risk assessment. However, various institutional controls from federal agencies' standards and regulations for lead-contaminated media yield adequate maximum concentration limits (MCLs); an MCL of 0.0015 mg/L was used for this study. A risk management approach to lead-contaminated media demonstrates that linking environmental health and environmental engineering can yield a feasible solution.

Relevance:

100.00%

Publisher:

Abstract:

Accurately assessing the extent of myocardial tissue injury induced by myocardial infarction (MI) is critical to the planning and optimization of MI patient management. With this in mind, this study investigated the feasibility of using combined fluorescence and diffuse reflectance spectroscopy to characterize a myocardial infarct at different stages of its development. An animal study was conducted using twenty male Sprague-Dawley rats with MI. In vivo fluorescence spectra at 337 nm excitation and diffuse reflectance spectra between 400 nm and 900 nm were measured from the heart using a portable fiber-optic spectroscopic system. Spectral acquisition was performed on (1) the normal heart region, (2) the region immediately surrounding the infarct, and (3) the infarcted region, at one, two, three and four weeks into MI development. The spectral data were divided into six subgroups according to the histopathological features associated with various degrees/severities of myocardial tissue injury, as well as various stages of myocardial tissue remodeling, post infarction. Various data processing and analysis techniques were employed to recognize the representative spectral features corresponding to the histopathological features associated with myocardial infarction. The identified spectral features were then used in discriminant analysis to further evaluate their effectiveness in classifying tissue injuries induced by MI. It was observed that MI induced significant alterations (p < 0.05) in the diffuse reflectance spectra, especially between 450 nm and 600 nm, from myocardial tissue within the infarcted and surrounding regions. In addition, MI induced a significant elevation in fluorescence intensities at 400 and 460 nm from myocardial tissue in the same regions. The extent of these spectral alterations was related to the duration of the infarction. Using the identified spectral features, an effective tissue injury classification algorithm was developed that produced a satisfactory overall classification result (87.8%). The findings of this research support the concept that optical spectroscopy is a useful tool for non-invasively determining the in vivo pathophysiological features of a myocardial infarct and its surrounding tissue, thereby providing valuable real-time feedback to surgeons during surgical interventions for MI.
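
As an illustration of the classification step, the following Python sketch trains a linear discriminant classifier on synthetic spectral features (stand-ins for the reflectance and fluorescence features mentioned above); it is not the study's algorithm or data.

# Discriminant analysis of synthetic spectral features for tissue-injury groups.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per_group, groups = 40, 3
# Synthetic features: e.g. a 450-600 nm reflectance ratio and fluorescence
# intensities near 400 nm and 460 nm, separated slightly by injury group.
X = np.vstack([rng.normal(loc=[g, 2 * g, -g], scale=1.0, size=(n_per_group, 3))
               for g in range(groups)])
y = np.repeat(np.arange(groups), n_per_group)

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.1%}")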