949 results for Intelligent computing techniques


Relevance: 90.00%

Abstract:

Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access to heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, of identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution based on examining the extents of the meta-data constructs of component schemas; this methodology is shown to be correct, complete, and unambiguous; (iii) a semi-automated technique for identifying semantic relations, which form the basis of the semantic knowledge used for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, which acts as the interface between the integration and query-processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of this work.
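Two of the contributions above, semantic-relation identification via a shared ontology (iii) and a knowledge base that interfaces integration and querying (v), can be sketched minimally as follows. Every schema construct, ontology term, and relation in this sketch is invented for illustration; it is not the thesis's actual Sem-ODM machinery.

```python
# Hypothetical sketch: pairing meta-data constructs of two component schemas
# that map to the same concept in a shared ontology, then storing the result
# in a simple knowledge base. All names below are invented.

# Shared ontology: maps local construct names to a common concept.
ONTOLOGY = {
    "EMP": "Employee", "WORKER": "Employee",
    "DEPT": "Department", "DIVISION": "Department",
}

def semantic_relations(schema_a, schema_b):
    """Pair constructs of two schemas that map to the same ontology concept."""
    relations = []
    for a in schema_a:
        for b in schema_b:
            if ONTOLOGY.get(a) is not None and ONTOLOGY.get(a) == ONTOLOGY.get(b):
                relations.append((a, b, ONTOLOGY[a]))
    return relations

# Knowledge base acting as the interface between integration and querying.
knowledge_base = {"relations": semantic_relations(["EMP", "DEPT"], ["WORKER", "DIVISION"])}
```

A query-processing module could then consult `knowledge_base["relations"]` to decide which component sources hold "Employee" data for a global-view query.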


Relevance: 90.00%

Abstract:

Accurate estimation of road pavement geometry and layer material properties through the use of proper nondestructive testing and sensor technologies is essential for evaluating a pavement's structural condition and determining options for maintenance and rehabilitation. For these purposes, pavement deflection basins produced by the nondestructive Falling Weight Deflectometer (FWD) test are commonly used. The FWD test drops weights on the pavement to simulate traffic loads and measures the resulting deflection basins. Backcalculation of pavement geometry and layer properties from FWD deflections is a difficult inverse problem, and its solution with conventional mathematical methods is often challenging due to the ill-posed nature of the problem. In this dissertation, a hybrid algorithm was developed to seek robust and fast solutions to this inverse problem. The algorithm is based on soft computing techniques, mainly Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), together with numerical analysis techniques to properly simulate the geomechanical system. The widely used layered pavement analysis program ILLI-PAVE was employed in the analyses of various flexible pavement types, including full-depth asphalt and conventional flexible pavements, built on either lime-stabilized soils or untreated subgrade. Nonlinear properties of the subgrade soil and the base-course aggregate as transportation geomaterials were also considered. A computer program, the Soft Computing Based System Identifier (SOFTSYS), was developed. In SOFTSYS, ANNs were used as surrogate models to provide faster solutions than the nonlinear finite element program ILLI-PAVE. The deflections obtained from FWD tests in the field were matched with the predictions obtained from the numerical simulations to develop the SOFTSYS models. The solution to the inverse problem for multi-layered pavements is computationally hard to achieve and often not feasible due to field variability and the quality of the collected data. The primary difficulty in the analysis arises from the substantial increase in the degree of non-uniqueness of the mapping from the pavement layer parameters to the FWD deflections. The insensitivity of the deflections to some layer properties lowered SOFTSYS model performance. Still, the SOFTSYS models were shown to work effectively with synthetic data obtained from ILLI-PAVE finite element solutions. In general, SOFTSYS solutions very closely matched the ILLI-PAVE mechanistic pavement analysis results. For validation, field-collected FWD data were successfully used to predict pavement layer thicknesses and layer moduli of in-service flexible pavements. Some very promising SOFTSYS results indicated average absolute errors on the order of 2%, 7%, and 4% for the Hot Mix Asphalt (HMA) thickness estimation of full-depth asphalt pavements, full-depth pavements on lime-stabilized soils, and conventional flexible pavements, respectively. The field validations of SOFTSYS also produced meaningful results: thickness data obtained from Ground Penetrating Radar testing matched reasonably well with the predictions from the SOFTSYS models. The differences observed in the HMA and lime-stabilized soil layer thicknesses were attributed to the variability of the FWD deflection data. The backcalculated asphalt concrete layer thicknesses matched better for full-depth asphalt pavements built on lime-stabilized soils than for conventional flexible pavements. Overall, SOFTSYS was capable of producing reliable thickness estimates despite the variability of field-constructed asphalt layer thicknesses.
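The backcalculation idea, a genetic algorithm searching for layer properties whose predicted deflection basin matches the measured one, can be sketched roughly as follows. A toy closed-form function stands in for the ANN surrogate of ILLI-PAVE, and all constants, units, and sensor offsets are invented for illustration; note that the toy model is deliberately non-unique in its inputs, mirroring the non-uniqueness discussed above.

```python
import random

# Hedged sketch: a GA searches for (asphalt modulus, subgrade modulus) whose
# predicted deflection basin matches a "measured" one. The surrogate below is
# an invented stand-in for the ANN surrogate of the ILLI-PAVE FE model.
random.seed(1)

def surrogate_deflections(e_asphalt, e_subgrade):
    """Stand-in for the ANN surrogate: deflections at three sensor offsets."""
    return [9000.0 / (0.8 * e_asphalt + 0.2 * e_subgrade + r) for r in (0, 50, 150)]

measured = surrogate_deflections(500.0, 20.0)  # synthetic "FWD" data

def fitness(ind):
    """Sum of squared mismatches between predicted and measured basins."""
    pred = surrogate_deflections(*ind)
    return sum((p - m) ** 2 for p, m in zip(pred, measured))

# Simple elitist GA: keep the 10 best, breed children by blending + mutation.
pop = [[random.uniform(100, 1000), random.uniform(5, 100)] for _ in range(40)]
for _ in range(200):
    pop.sort(key=fitness)
    elite = pop[:10]
    children = []
    while len(children) < 30:
        a, b = random.sample(elite, 2)
        children.append([(x + y) / 2 + random.gauss(0, 5) for x, y in zip(a, b)])
    pop = elite + children
best = min(pop, key=fitness)
```

Because the toy surrogate depends only on the combination `0.8*E1 + 0.2*E2`, many (E1, E2) pairs fit equally well; only the basin mismatch is driven to zero, which is exactly the insensitivity problem the abstract describes.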

Relevance: 80.00%

Abstract:

The design of supplementary damping controllers to mitigate the effects of electromechanical oscillations in power systems is a highly complex and time-consuming process, which requires a significant amount of knowledge on the part of the designer. In this study, the authors propose an automatic technique that takes the burden of tuning the controller parameters away from the power engineer and places it on the computer. Unlike other approaches that do the same based on robust control theories or evolutionary computing techniques, the proposed procedure uses an optimisation algorithm that works over a formulation of the classical tuning problem in terms of bilinear matrix inequalities. Using this formulation, it is possible to apply linear matrix inequality solvers to find a solution to the tuning problem via an iterative process, with the advantage that these solvers are widely available and have well-known convergence properties. The proposed algorithm is applied to tune the parameters of supplementary controllers for thyristor-controlled series capacitors placed in the New England/New York benchmark test system, aiming at the improvement of the damping factor of inter-area modes under several different operating conditions. The results of the linear analysis are validated by non-linear simulation and demonstrate the effectiveness of the proposed procedure.
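A minimal illustration of the quantity being improved: the damping factor of an electromechanical mode, computed from the mode's eigenvalue in the linearised system. The eigenvalues below are invented for illustration, not taken from the New England/New York test system.

```python
import math

# A mode is an eigenvalue s = sigma + j*omega of the linearised power system.
# Tuning the supplementary controllers aims to raise each inter-area mode's
# damping factor above a minimum threshold. Numbers here are illustrative.

def damping_factor(sigma, omega):
    """Damping ratio of the mode s = sigma + j*omega (sigma < 0 is stable)."""
    return -sigma / math.hypot(sigma, omega)

# A poorly damped 0.5 Hz inter-area mode vs. the same mode after tuning.
before = damping_factor(-0.06, 2 * math.pi * 0.5)
after = damping_factor(-0.35, 2 * math.pi * 0.5)
```

A typical tuning target would be a damping factor of at least 0.05 for every inter-area mode across all considered operating conditions.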

Relevance: 80.00%

Abstract:

This project aims to take part in the RSA Laboratories challenge of breaking the proposed RC5-32-12-9 cryptosystem. To do so, a brute-force attack was chosen, carried out by means of distributed computing and, more specifically, Public Resource Computing. The chosen platform is the Berkeley Open Infrastructure for Network Computing (BOINC), well known for its use in large projects such as SETI@home. In this project, the infrastructure is set up and the applications needed to start the computations that should allow the cryptosystem to be broken are developed.
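The distribution of the brute-force search can be sketched as a partitioning of the key space into work units, the way a BOINC-style server would hand ranges to volunteer clients. RC5 itself is not implemented here, and the key length and unit size are toy values chosen for illustration.

```python
# Hedged sketch of public-resource-computing work distribution: the key space
# is cut into fixed-size work units for volunteer clients to exhaust.

def work_units(key_bits, unit_size):
    """Yield (first_key, last_key) ranges covering the whole key space."""
    total = 1 << key_bits
    start = 0
    while start < total:
        end = min(start + unit_size, total)
        yield (start, end - 1)
        start = end

# A 20-bit toy key space split into units of 2**16 keys each.
units = list(work_units(20, 1 << 16))
```

Each client would decrypt the challenge ciphertext with every key in its assigned range and report back whether the known plaintext was recovered.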

Relevance: 80.00%

Abstract:

The graphical representation of spatial soil properties in a digital environment is complex because it requires converting data collected in discrete form onto a continuous surface. The objective of this study was to apply three-dimensional interpolation and visualization techniques to soil texture and fertility properties and to establish relationships with pedogenetic factors and processes in a slope area. The GRASS Geographic Information System was used to generate three-dimensional models and the ParaView software to visualize soil volumes. Samples of the A, AB, BA, and B horizons were collected in a regular 122-point grid in an area of 13 ha in Pinhais, PR, in southern Brazil. Geoprocessing and graphic computing techniques were effective in identifying and delimiting soil volumes of distinct ranges of fertility properties confined within the soil matrix. Both the three-dimensional interpolation and the visualization tool facilitated interpretation, in a continuous space (volumes), of the cause-effect relationships between soil texture and fertility properties and pedological factors and processes, such as higher clay contents following the drainage lines of the area. The flattest part, with more weathered soils (Oxisols), had the highest pH values and lower Al3+ concentrations. These techniques of data interpolation and visualization have great potential for use in diverse areas of soil science, such as the identification of soil volumes that occur side by side but exhibit different physical, chemical, and mineralogical conditions for plant root growth, and the monitoring of plumes of organic and inorganic pollutants in soils and sediments, among other applications. The methodological details for interpolation and three-dimensional viewing of soil data are presented here.
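The interpolation step can be sketched with a simple inverse-distance-weighting estimator in three dimensions. The study itself used GRASS GIS rather than this pure-Python stand-in, and the sample coordinates and clay contents below are invented for illustration.

```python
import math

# Hedged sketch: estimate a soil property (e.g. clay content, %) at an
# unsampled 3-D point as an inverse-distance-weighted mean of grid samples.

def idw3(samples, x, y, z, power=2.0):
    """samples: list of (x, y, z, value). Returns the IDW estimate at (x, y, z)."""
    num = den = 0.0
    for sx, sy, sz, v in samples:
        d = math.dist((x, y, z), (sx, sy, sz))
        if d == 0.0:
            return v  # exact hit on a sample point
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Invented samples: (easting, northing, depth, clay %).
clay = [(0, 0, 0, 30.0), (10, 0, 0, 50.0), (0, 10, 0, 40.0), (0, 0, 1, 32.0)]
estimate = idw3(clay, 1.0, 1.0, 0.5)
```

Evaluating the estimator over a dense 3-D grid produces the continuous soil volume that tools like ParaView can then render and slice.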

Relevance: 80.00%

Abstract:

Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. UV irradiance is monitored via different techniques, including ground measurements and satellite observations. However, it is difficult to translate such observations into human UV exposure or dose because of confounding factors (shape of the exposed surface, shading, behavior, etc.). A collaboration between public health institutions, a meteorological office, and an institute specialized in computing techniques developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop this tool, which estimates the solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed separately for direct, diffuse, and reflected radiation. The radiation components are deduced from corresponding measurements of UV irradiance, and the related UV dose received by each triangle of the virtual manikin is computed accounting for shading by other body parts and any protective measures. The model was verified with dosimetric measurements (n=54) in field conditions using a foam manikin as a surrogate for an exposed individual. Dosimetric results were compared to the model predictions. The model predicted exposure to solar UV adequately: the symmetric mean absolute percentage error was 13%, and half of the predictions were within 17% of the measurements. This model allows assessing outdoor occupational and recreational UV exposures without time-consuming individual dosimetry, with numerous potential uses in skin cancer prevention and research. Using this tool, we investigated solar UV exposure patterns with respect to the relative contributions of the direct, diffuse, and reflected radiation. We assessed exposure doses for various body parts and exposure scenarios of a standing individual (static and dynamic postures). As input, the model used erythemally weighted ground irradiance data measured in 2009 at Payerne, Switzerland. A year-round daily exposure (8 am to 5 pm) without protection was assumed. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer) but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation explained about 80% of the cumulative annual exposure dose. Acute diffuse exposures were also obtained on cloudy summer days. The importance of diffuse UV radiation should not be underestimated when advocating preventive measures. Messages focused on avoiding acute direct exposures may be of limited efficiency in preventing skin cancers associated with chronic exposure (e.g., squamous cell carcinomas).
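The core of the per-triangle dose computation, a cosine-weighted direct beam plus a sky-view-weighted diffuse term, can be sketched as follows. Shading by other body parts and the reflected component are omitted, and the irradiance values and sky-view factors are invented for illustration.

```python
import math

# Hedged sketch of the dose model's core step for one surface element:
# direct irradiance scales with the cosine of the incidence angle, while the
# diffuse component is weighted by how much sky the element sees.

def dose_rate(direct_irr, diffuse_irr, surface_normal, sun_dir, sky_view=1.0):
    """Instantaneous dose rate on a surface element (arbitrary units)."""
    nx, ny, nz = surface_normal
    sx, sy, sz = sun_dir
    cos_inc = max(0.0, nx * sx + ny * sy + nz * sz)  # back side gets no beam
    return direct_irr * cos_inc + diffuse_irr * sky_view

# Sun at 60 degrees elevation; unit vectors.
sun = (math.cos(math.radians(60)), 0.0, math.sin(math.radians(60)))
# Horizontal element (e.g. shoulders) vs. sun-facing vertical element (e.g. face).
horizontal = dose_rate(0.2, 0.08, (0, 0, 1), sun)
vertical = dose_rate(0.2, 0.08, (1, 0, 0), sun, sky_view=0.5)
```

Integrating such per-triangle rates over the day and over the mesh reproduces, in spirit, the direct/diffuse split the study reports: high-sun periods favour horizontal surfaces, while the diffuse term accumulates on everything, all day.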

Relevance: 80.00%

Abstract:

Alzheimer's disease (AD) is the most common type of dementia among the elderly. This work is part of a larger study that aims to identify novel technologies and biomarkers or features for the early detection of AD and its degree of severity. The diagnosis is made by analyzing several biomarkers and conducting a variety of tests (although only a post-mortem examination of the patient's brain tissue is considered to provide definitive confirmation). Non-invasive intelligent diagnosis techniques would be a very valuable diagnostic aid. This paper concerns the Automatic Analysis of Emotional Response (AAER) in spontaneous speech based on classical and new emotional speech features: Emotional Temperature (ET) and fractal dimension (FD). This is a pre-clinical study aiming to validate tests and biomarkers for future diagnostic use. The method has the great advantage of being non-invasive, low cost, and free of side effects. The AAER shows very promising results for the definition of features useful in the early diagnosis of AD.
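One of the named features, the fractal dimension of the speech signal, can be sketched with Higuchi's estimator. The choice of Higuchi's algorithm is an assumption (the abstract does not state which FD estimator is used), and the test signals below are synthetic, not speech.

```python
import math
import random

# Hedged sketch: Higuchi fractal dimension of a 1-D signal
# (close to 1 for a smooth signal, close to 2 for a very irregular one).

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of signal x via Higuchi's method."""
    n = len(x)
    log_inv_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            n_seg = (n - 1 - m) // k
            if n_seg < 1:
                continue
            dist = sum(abs(x[m + i * k] - x[m + (i - 1) * k]) for i in range(1, n_seg + 1))
            lengths.append(dist * (n - 1) / (n_seg * k) / k)  # Higuchi normalisation
        log_inv_k.append(math.log(1.0 / k))
        log_l.append(math.log(sum(lengths) / len(lengths)))
    # FD = least-squares slope of log L(k) against log(1/k).
    mk = sum(log_inv_k) / len(log_inv_k)
    ml = sum(log_l) / len(log_l)
    return (sum((a - mk) * (b - ml) for a, b in zip(log_inv_k, log_l))
            / sum((a - mk) ** 2 for a in log_inv_k))

random.seed(0)
noise_fd = higuchi_fd([random.gauss(0, 1) for _ in range(2000)])  # irregular
line_fd = higuchi_fd([0.01 * i for i in range(2000)])             # smooth
```

In an AAER-style pipeline such an FD value would be computed per voiced frame and pooled into a per-speaker feature.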

Relevance: 80.00%

Abstract:

An rms measuring device is a nonlinear device consisting of linear and nonlinear components. The performance of rms measurement is influenced by a number of factors: (1) the signal characteristics, (2) the measurement technique used, and (3) the device characteristics. RMS measurement is not simple, particularly when the signals are complex and unknown. The problem of rms measurement on high-crest-factor signals is fully discussed, and a solution to this problem is presented in this thesis. The problem of rms measurement is systematically analyzed and found to involve mainly three types of errors: (1) amplitude or waveform error, (2) frequency error, and (3) averaging error. Various rms measurement techniques are studied and compared. On the basis of this study, rms measurement is reclassified into three categories: (1) waveform-error-free measurement, (2) high-frequency-error measurement, and (3) low-frequency-error-free measurement. In modern digital sampled-data systems the signals are complex, and waveform-error-free rms measurement is highly desirable. Among the three basic blocks of an rms measuring device, the squarer is the most important. A squaring technique is selected that permits shaping of the squarer error characteristic in such a way as to achieve waveform-error-free rms measurement. The squarer is designed, fabricated, and tested. A hybrid rms measurement scheme using an analog rms computing device and a digital display combines the speed of analog techniques with the resolution and ease of measurement of digital techniques. An A/D converter is modified to perform the square-rooting operation. A 10-V rms voltmeter using the developed rms detector is fabricated and tested. Chapters two, three, and four analyze the problems involved in rms measurement and present a comparative study of rms computing techniques and devices. The fifth chapter gives the details of the developed rms detector that permits waveform-error-free rms measurement. The sixth chapter enumerates the highlights of the thesis and suggests a list of future projects.
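The basic definitions involved, true rms by squaring, averaging, and rooting, and the crest factor that makes pulse-like signals difficult for rms converters, can be illustrated directly on sampled signals; the sine and pulse-train test signals below are the usual textbook cases, not the thesis's measurements.

```python
import math

# True rms of a sampled signal: square, average, square-root.
def true_rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Crest factor: peak-to-rms ratio. High-crest-factor signals stress the
# squarer block, which is why its error characteristic matters.
def crest_factor(samples):
    return max(abs(s) for s in samples) / true_rms(samples)

n = 1000
sine = [math.sin(2 * math.pi * i / n) for i in range(n)]
pulse = [1.0 if i < 10 else 0.0 for i in range(n)]  # 1% duty-cycle pulse train
```

A unit sine has rms 1/sqrt(2) and crest factor sqrt(2), while the 1% pulse train has a crest factor of 10: the same peak value carries far less rms energy.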

Relevance: 80.00%

Abstract:

On-line handwriting recognition has been a frontier area of pattern recognition research for the last few decades. Word processing in Indian languages can be a vexing experience, even with the assistance of an alphanumeric keyboard. A natural solution to this problem is offered through online character recognition. There is abundant literature on the handwriting recognition of Western, Chinese, and Japanese scripts, but very little related to the recognition of Indic scripts such as Malayalam. This paper presents an efficient Online Handwritten character Recognition system for Malayalam characters (OHR-M) using the k-NN algorithm. It would help in recognizing Malayalam text entered using pen-like devices. A novel feature extraction method, combining time-domain features with a dynamic representation of writing direction and its curvature, is used for recognizing Malayalam characters. This writer-independent system gives an excellent accuracy of 98.125%, with a recognition time of 15-30 milliseconds.
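The classification stage can be sketched as a plain k-NN majority vote over fixed-length feature vectors. The two-dimensional features and the class labels below are invented and far simpler than the actual OHR-M time-domain/direction/curvature features.

```python
import math
from collections import Counter

# Hedged sketch of k-NN classification: majority vote among the k training
# samples nearest to the query in feature space.

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label). Returns the majority label."""
    neighbours = sorted(train, key=lambda t: math.dist(t[0], query))[:k]
    return Counter(label for _, label in neighbours).most_common(1)[0][0]

# Invented 2-D features for two hypothetical character classes.
train = [
    ((0.10, 0.90), "ka"), ((0.20, 0.80), "ka"), ((0.15, 0.85), "ka"),
    ((0.90, 0.10), "pa"), ((0.80, 0.20), "pa"), ((0.85, 0.15), "pa"),
]
pred = knn_classify(train, (0.12, 0.88))
```

In the real system the feature vector per stroke would be much longer, but the vote itself is this simple, which is consistent with millisecond-scale recognition times.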

Relevance: 80.00%

Abstract:

This paper proposes the deployment of a neural network computing environment on Active Networks. Active Networks are packet-switched computer networks in which packets can contain code fragments that are executed on the intermediate nodes. This feature allows small pieces of code to be injected directly into the network core to deal with computer network problems, and new computing techniques to be adopted for solving networking problems. The goal of our project is the adoption of a distributed neural network for approaching tasks that are specific to the computer network environment. Dynamically reconfigurable neural networks are spread over an experimental wide-area backbone of active nodes (ABone) to show the feasibility of the proposed approach.
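The active-network mechanism, a packet carrying a code fragment that each intermediate node executes before forwarding, can be caricatured in a few lines. This toy uses Python's `exec` and has none of the sandboxing, safety, or capsule-format machinery a real active node (or the ABone testbed) would need; the packet contents are invented.

```python
# Hedged toy of an active node: the packet's code fragment runs against the
# node's local state and the packet payload before the payload is forwarded.
# WARNING: exec on untrusted input is unsafe; this is illustration only.

def process(packet, node_state):
    env = {"state": node_state, "payload": packet["payload"]}
    exec(packet["code"], env)  # run the fragment carried in the packet
    return env["payload"]

packet = {
    "payload": [3, 1, 2],
    "code": "payload.sort(); state['seen'] = state.get('seen', 0) + 1",
}
node_state = {}
delivered = process(packet, node_state)
```

In the paper's setting the injected fragments would instead update or query pieces of the distributed neural network hosted on each node.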

Relevance: 80.00%

Abstract:

Purpose: The purpose of this paper is to address a classic problem identified by researchers in the area of swarm robotic systems – pattern formation – and it is also motivated by the need for mathematical foundations in swarm systems. Design/methodology/approach: The work is organized into inspirations, applications, definitions, challenges, and classifications of pattern formation in swarm systems based on recent literature. Further, the work proposes a mathematical model for swarm pattern formation and transformation. Findings: A swarm pattern formation model based on mathematical foundations and macroscopic primitives is proposed. A formal definition of swarm pattern transformation and four special cases of transformation are introduced. Two general methods for transforming patterns are investigated and compared. The validity of the proposed models and the feasibility of the methods investigated are confirmed in the Traer Physics and Processing environment. Originality/value: This paper helps in understanding the limitations of existing research in pattern formation and the lack of mathematical foundations for swarm systems. The proposed model and transformation methods introduce two key contributions, namely macroscopic primitives and a formal mathematical model. The exercise of implementing the proposed models on a physics simulator is novel.
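One simple way to realise a pattern transformation, not the paper's model, is to greedily assign each agent to the nearest unclaimed position of the target pattern and move it a fraction of the way there each step. The patterns and gain below are invented for illustration.

```python
import math

# Hedged sketch of a pattern-transformation step: greedy nearest-target
# assignment followed by proportional motion toward the assigned target.

def assign_targets(agents, targets):
    """Greedily claim, per agent, the nearest remaining target position."""
    remaining = list(targets)
    out = []
    for a in agents:
        t = min(remaining, key=lambda p: math.dist(a, p))
        remaining.remove(t)
        out.append(t)
    return out

def step(agents, targets, gain=0.5):
    """Move each agent a fraction `gain` of the way toward its target."""
    return [(ax + gain * (tx - ax), ay + gain * (ty - ay))
            for (ax, ay), (tx, ty) in zip(agents, assign_targets(agents, targets))]

line = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]    # current pattern
target = [(0.0, 1.0), (1.0, 2.0), (2.0, 1.0)]  # target pattern
moved = step(line, target)
```

Iterating `step` drives the line pattern into the target pattern; a physics environment such as Traer would instead apply the equivalent attraction forces.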

Relevance: 80.00%

Abstract:

Aircraft Maintenance, Repair and Overhaul (MRO) feedback commonly includes an engineer's complex text-based inspection report. Capturing and normalizing the content of these textual descriptions is vital to cost and quality benchmarking, and provides information to facilitate continuous improvement of MRO processes and analytics. As data analysis and mining tools require highly normalized data, raw textual data is inadequate. This paper offers a text-mining solution to efficiently analyse bulk textual feedback data. Despite replacement of the same parts and/or sub-parts, the actual service cost for the same repair is often distinctly different from that of similar previous jobs. Regular expression algorithms were combined with an aircraft MRO glossary dictionary to help provide additional information concerning the reasons for cost variation. Professional terms and conventions were included within the dictionary to avoid ambiguity and improve the outcome. Testing results show that most descriptive inspection reports can be appropriately interpreted, allowing extraction of highly normalized data. This additional normalized data strongly supports data analysis and data mining, whilst also increasing the accuracy of future quotation costing. The solution has been used effectively by a large aircraft MRO agency, with positive results.
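The text-mining step can be sketched with regular expressions plus a small glossary dictionary that expands professional abbreviations. The inspection line, part-number pattern, cost format, and glossary entries below are all invented, not the agency's actual conventions.

```python
import re

# Hedged sketch: normalise a free-text inspection line into structured fields
# using regexes and a toy MRO glossary. All patterns and terms are invented.

GLOSSARY = {"o/h": "overhaul", "repl": "replaced", "insp": "inspected"}

def normalise(report):
    fields = {}
    m = re.search(r"\bP/?N[:\s]*([A-Z0-9-]+)", report, re.IGNORECASE)
    if m:
        fields["part_number"] = m.group(1).upper()
    m = re.search(r"\$\s*([0-9][0-9,]*\.?[0-9]*)", report)
    if m:
        fields["cost"] = float(m.group(1).replace(",", ""))
    # Expand glossary abbreviations found anywhere in the line.
    fields["actions"] = [GLOSSARY[w] for w in re.findall(r"[a-z/]+", report.lower())
                         if w in GLOSSARY]
    return fields

record = normalise("Insp fan blade, corrosion found; repl bushing P/N AB-1234, $1,250.00")
```

Aggregating such records across jobs is what allows like-for-like cost comparison and explains why an apparently identical repair was costed differently.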

Relevance: 80.00%

Abstract:

A system identification algorithm is introduced for Hammerstein systems that are modelled using a non-uniform rational B-spline (NURB) neural network. The proposed algorithm consists of two successive stages. First, the shaping parameters in the NURB network are estimated using a particle swarm optimization (PSO) procedure. Then the remaining parameters are estimated by means of the singular value decomposition (SVD). Numerical examples are utilized to demonstrate the efficacy of the proposed approach.
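The first estimation stage can be illustrated with a generic particle swarm optimiser. In the identification algorithm the cost would be the model-fit error as a function of the NURB shaping parameters; here a simple quadratic stands in for that cost, and all swarm hyperparameters are illustrative defaults.

```python
import random

# Hedged sketch of basic PSO: particles track personal bests and are pulled
# toward both their personal best and the global best position.

def pso(cost, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    random.seed(42)
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    gbest = min(pbest, key=cost)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < cost(gbest):
                    gbest = pos[i][:]
    return gbest

# Toy stand-in cost with optimum at (3, 3).
best = pso(lambda p: sum((x - 3.0) ** 2 for x in p), dim=2, bounds=(-10, 10))
```

With the shaping parameters fixed by PSO, the remaining parameters of the model are linear-in-the-parameters, which is what makes the second, SVD-based stage possible.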

Relevance: 80.00%

Abstract:

A new PID tuning and controller approach is introduced for Hammerstein systems, based on input/output data. A B-spline neural network is used to model the nonlinear static function in the Hammerstein system. The control signal is composed of a PID controller together with a correction term. In order to update the control signal, the multistep-ahead predictions of the Hammerstein system based on the B-spline neural network, and the associated Jacobian matrix, are calculated using the De Boor algorithms, including both the functional and derivative recursions. A numerical example is utilized to demonstrate the efficacy of the proposed approach.
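The De Boor functional recursion mentioned above can be sketched for a scalar quadratic B-spline; the knot vector and control values below are illustrative, and the derivative recursion is omitted.

```python
# Hedged sketch of De Boor's algorithm: evaluate a degree-p B-spline at x,
# where t[k] <= x < t[k+1], by repeated convex combinations of control points.

def de_boor(k, x, t, c, p):
    """t: knot vector, c: control points (scalars here), p: spline degree."""
    d = [c[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

# Quadratic B-spline with a clamped knot vector on [0, 3].
p = 2
knots = [0, 0, 0, 1, 2, 3, 3, 3]
ctrl = [0.0, 1.0, 3.0, 2.0, 0.0]
value = de_boor(3, 1.5, knots, ctrl, p)  # x = 1.5 lies in [t[3], t[4])
```

Because each stage is a convex combination, the evaluation is numerically stable, and differentiating the same recursion yields the Jacobian entries needed for the multistep-ahead predictions.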