915 results for Computer Aided Process


Relevance:

80.00%

Publisher:

Abstract:

In the developed world we are surrounded by man-made objects, but most people give little thought to the complex processes needed for their design. The design of hand knitting is complex because much of the domain knowledge is tacit. The objective of this thesis is to devise a methodology to help designers to work within design constraints whilst facilitating creativity. A hybrid solution including computer aided design (CAD) and case based reasoning (CBR) is proposed. The CAD system creates designs using domain-specific rules, and these designs are employed for initial seeding of the case base and the management of constraints. CBR reuses the designer's previous experience. The key aspects of the CBR system are measuring the similarity of cases and adapting past solutions to the current problem. Similarity is measured by asking the user to rank the importance of features; the ranks are then used to calculate weights for an algorithm which compares the specifications of designs. A novel adaptation operator called rule difference replay (RDR) is created. When the specification for a new design is presented, the CAD program uses it to construct a design constituting an approximate solution. The most similar design from the case base is then retrieved and RDR replays the changes previously made to the retrieved design on the new solution. A measure of solution similarity that can validate subjective success scores is created. In a hybrid CAD-CBR system, specification similarity can be used as a guide to whether to invoke CBR: if the new design is sufficiently similar to a previous design, then CBR is invoked; otherwise CAD is used. The application of RDR to knitwear design has demonstrated the flexibility to overcome deficiencies in rules that try to automate creativity, and has the potential to be applied to other domains such as interior design.
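As a rough illustration of the rank-based similarity described above, the following Python sketch converts user-assigned feature ranks into weights and scores the agreement between two design specifications; the function names and the feature set are invented for the example and are not taken from the thesis.

```python
# Illustrative sketch (not the thesis implementation): similarity between two
# design specifications using weights derived from user-assigned feature ranks.

def rank_to_weights(ranks):
    """Convert feature ranks (1 = most important) into normalised weights."""
    inverse = {f: 1.0 / r for f, r in ranks.items()}
    total = sum(inverse.values())
    return {f: v / total for f, v in inverse.items()}

def specification_similarity(spec_a, spec_b, weights):
    """Weighted fraction of features on which the two specifications agree."""
    return sum(w for f, w in weights.items() if spec_a.get(f) == spec_b.get(f))

# Hypothetical knitwear features and ranks, purely for demonstration.
ranks = {"yarn_weight": 1, "stitch_pattern": 2, "garment_shape": 3}
weights = rank_to_weights(ranks)
new_spec = {"yarn_weight": "aran", "stitch_pattern": "cable", "garment_shape": "raglan"}
old_spec = {"yarn_weight": "aran", "stitch_pattern": "moss", "garment_shape": "raglan"}
print(specification_similarity(new_spec, old_spec, weights))  # about 0.73
```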

Relevance:

80.00%

Publisher:

Abstract:

The thesis presents a theoretical and practical study of the dynamic behaviour of electromagnetic relays. After discussing the problem of solving the dynamic equations analytically and presenting a historical survey of earlier work on the relay and its dynamics, the simulation of a relay on the analogue computer is discussed. It is shown that the simulation may be used to obtain specific solutions to the dynamic equations. The computer analysis provides the dynamic characteristics for design purposes and may be used in the study of bouncing, rebound oscillations and stability of the armature motion. An approximate analytical solution to the two dynamic equations is given, based on the assumption that the dynamic variation of the pull with the position of the armature is linear. The assumption is supported by the computer-aided analysis and by experimental results. The solution is intended to provide a basis for a rational design. A rigorous method of analysing the dynamic performance by using Ahlberg's theory is also presented. This method may be regarded as an extension of Ahlberg's theory, taking the mass and frictional damping forces into account. While calculating the armature motion mathematically, Ahlberg considers the equilibrium of two kinds of forces, namely pull and load, and disregards the mass and friction forces, whereas the present method deals with the equilibrium of all four kinds of forces. It is shown how this can be utilised to calculate the dynamic characteristics for a specific design. The utility of this method also extends to the study of stability, contact bounce and armature rebound. The magnetic circuit and other related topics which are essential to the study of relay dynamics are discussed, and some necessary experimental results are given.
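As a hedged illustration of the force balance described above (pull, load, inertia and frictional damping, with the pull linearised in armature position), the following Python sketch integrates an armature equation of motion numerically; all parameter values and names are placeholders, not taken from the thesis.

```python
# Illustrative sketch (not the thesis model): armature motion integrated
# numerically, balancing pull, load, inertia and frictional damping, with the
# pull assumed to vary linearly with armature position as in the approximation
# discussed above. All parameter values are arbitrary placeholders.
import numpy as np
from scipy.integrate import solve_ivp

m, c = 5e-3, 0.4                 # armature mass (kg), friction damping (N s/m)
k_load, preload = 300.0, 1.5     # return-spring stiffness (N/m) and preload (N)
pull0, pull_slope = 2.0, 800.0   # linearised pull: F_pull = pull0 + pull_slope * x

def armature(t, y):
    x, v = y
    f_pull = pull0 + pull_slope * x      # linearised magnetic pull
    f_load = preload + k_load * x        # spring load opposing closure
    a = (f_pull - f_load - c * v) / m    # Newton's second law
    return [v, a]

sol = solve_ivp(armature, (0.0, 0.005), [0.0, 0.0], max_step=1e-5)
print(f"armature travel after 5 ms: {sol.y[0, -1] * 1e3:.2f} mm")
```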

Relevance:

80.00%

Publisher:

Abstract:

An eMathTeacher [Sánchez-Torrubia 2007a] is an online eLearning self-assessment tool that helps students to actively learn mathematical algorithms by themselves, correcting their mistakes and providing them with clues to find the right solution. The tool presented in this paper is an example of this new concept in Computer Aided Instruction (CAI) resources; it has been implemented as a Java applet and designed as an auxiliary instrument for both classroom teaching and individual practice of Fleury’s algorithm. This tool, included within a set of eMathTeacher tools, has been designed as an educational complement for the active learning of graph algorithms by first-year students. Its visualization, simplicity and interactivity make this tutorial a pedagogical instrument of great value.
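For readers who want to recall the algorithm the applet teaches, here is a compact Python sketch of Fleury's algorithm for tracing an Euler circuit; it is only a reference illustration of the classical algorithm and is unrelated to the tool's Java implementation.

```python
# Reference sketch of Fleury's algorithm: walk an Euler circuit, never crossing
# a bridge unless forced to. The graph is assumed connected and Eulerian.
from copy import deepcopy

def reachable_count(adj, start):
    """Number of vertices reachable from start in the current multigraph."""
    seen, stack = {start}, [start]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen)

def is_bridge(adj, u, v):
    """Edge (u, v) is a bridge if removing it shrinks u's reachable component."""
    before = reachable_count(adj, u)
    adj[u].remove(v); adj[v].remove(u)
    after = reachable_count(adj, u)
    adj[u].append(v); adj[v].append(u)
    return after < before

def fleury(adj, start):
    adj = deepcopy(adj)
    circuit, u = [start], start
    while adj[u]:
        neighbours = list(adj[u])
        candidates = [v for v in neighbours
                      if len(neighbours) == 1 or not is_bridge(adj, u, v)]
        v = candidates[0]
        adj[u].remove(v); adj[v].remove(u)   # consume the chosen edge
        circuit.append(v)
        u = v
    return circuit

square = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(fleury(square, 0))  # e.g. [0, 1, 2, 3, 0]
```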

Relevance:

80.00%

Publisher:

Abstract:

This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infra-red radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated the understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). More recently, however, graphics processing unit (GPU) based data processing methods have been developed to minimize this processing and rendering time. These techniques include standard-processing methods, i.e. a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; processing throughput is currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing the phantoms and the quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the fabrication of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
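The standard FD-OCT processing chain mentioned above (raw interference spectra converted into A-scans) can be moved onto a GPU quite directly. The sketch below, which assumes CuPy as the GPU array library and falls back to NumPy, is a generic illustration and not the custom system built during the PhD; the resampling and dispersion-compensation steps of a real pipeline are reduced here to background subtraction, windowing and an FFT.

```python
# Minimal sketch of GPU-accelerated FD-OCT A-scan generation (assumed CuPy API,
# with a NumPy fallback when no GPU is available).
import math
try:
    import cupy as xp      # GPU arrays, NumPy-like API
except ImportError:
    import numpy as xp     # CPU fallback

def spectra_to_ascans(spectra):
    """spectra: (n_ascans, n_pixels) raw interference fringes from the camera."""
    data = xp.asarray(spectra, dtype=xp.float32)
    data = data - data.mean(axis=0)                     # remove fixed-pattern background
    n = data.shape[1]
    window = 0.5 - 0.5 * xp.cos(2.0 * math.pi * xp.arange(n) / (n - 1))  # Hann window
    data = data * window
    ascans = xp.abs(xp.fft.rfft(data, axis=1))          # depth profiles
    return 20.0 * xp.log10(ascans + 1e-12)              # log scale for display
```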

Relevance:

80.00%

Publisher:

Abstract:

With the development of the Internet, applications are becoming simpler and simpler, and users need less IT knowledge than before: from the status of 'reader' they have advanced to that of content creator and editor. Nowadays the influence of the web is becoming stronger and stronger; computer-aided work is commonplace almost everywhere. The spread of Internet applications has several reasons: first, they are widely accessible; second, their use is not tied to the single computer or network on which they were installed. Also, the quantity of information accessible now is not even comparable to that of earlier times. Apart from applications that need high bandwidth or heavy computing capacity (for example video editing), Internet applications are approaching the functionality of their thick-client counterparts. The most serious disadvantage of Internet applications, for security reasons, is that the resources of the client computer are not fully accessible, or accessible only to a restricted extent. Still, thick clients do have some advantages: better multimedia performance with more flexibility thanks to local resources, and the possibility of working offline.

Relevance:

80.00%

Publisher:

Abstract:

BOOK REVIEWS Multibody System Mechanics: Modelling, Stability, Control, and Robustness, by V. A. Konoplev and A. Cheremensky, Mathematics and its Applications Vol. 1, Union of Bulgarian Mathematicians, Sofia, 2001, XXII + 288 pp., $65.00, ISBN 954-8880-09-01

Relevance:

80.00%

Publisher:

Abstract:

Presented is webComputing, a general framework of mathematically oriented services including remote access to hardware and software resources for mathematical computations, and a web interface to dynamic interactive computations and visualization in a diversity of contexts: mathematical research and engineering, computer-aided mathematical/technical education, and distance learning. webComputing builds on the webMathematica technology, which connects the technical computing system Mathematica to a web server and provides tools for building dynamic and interactive web interfaces to Mathematica-based functionality. Discussed are the conception and some of the major components of the webComputing service: Scientific Visualization, Domain-Specific Computations, Interactive Education, and Authoring of Interactive Pages.

Relevance:

80.00%

Publisher:

Abstract:

Micro Electro Mechanical Systems (MEMS) have already revolutionized several industries through miniaturization and cost-effective manufacturing capabilities that were never possible before. However, commercially available MEMS products have only scratched the surface of the application areas where MEMS has potential. The complex and highly technical nature of MEMS research and development (R&D), combined with the lack of standards in areas such as design, fabrication and test methodologies, makes creating and supporting a MEMS R&D program a financial and technological challenge. A proper information technology (IT) infrastructure is the backbone of such research and is critical to its success. While the lack of standards and the general complexity of MEMS R&D make it impossible to provide a “one size fits all” design, a systematic approach, combined with a good understanding of the MEMS R&D environment and the relevant computer-aided design tools, provides a way for the IT architect to develop an appropriate infrastructure.

Relevance:

80.00%

Publisher:

Abstract:

Decision making and technical decision analysis demand computer-aided techniques and therefore more and more support from formal techniques. In recent years fuzzy decision analysis and related techniques have gained importance as efficient methods for planning and optimization applications in fields like production planning, financial and economic modeling, forecasting and classification. Hierarchical modeling of the decision situation is also one of the most popular modeling methods. It is shown how to use the fuzzy hierarchical model in combination with other methods of Multiple Criteria Decision Making. We propose a novel approach that overcomes the inherent limitations of hierarchical methods by exploiting multiple criteria decision making.
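As a generic illustration (not the authors' method) of combining fuzzy weights with a hierarchy of criteria, the sketch below scores one alternative using triangular fuzzy importance weights and a centroid defuzzification; the criteria and numbers are invented for the example.

```python
# Generic illustration: scoring an alternative with triangular fuzzy criterion
# weights, aggregated by a weighted sum and defuzzified by the centroid.

def tfn_scale(w, x):
    """Scale a triangular fuzzy number w = (low, mid, high) by a crisp score x."""
    return tuple(c * x for c in w)

def tfn_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def centroid(tfn):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(tfn) / 3.0

# Hypothetical fuzzy weights of two criteria and crisp scores of one alternative.
weights = {"cost": (0.2, 0.3, 0.4), "safety": (0.5, 0.7, 0.9)}
scores = {"cost": 0.8, "safety": 0.6}

total = (0.0, 0.0, 0.0)
for criterion, w in weights.items():
    total = tfn_add(total, tfn_scale(w, scores[criterion]))
print(centroid(total))  # aggregated crisp preference, 0.66 for these numbers
```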

Relevance:

80.00%

Publisher:

Abstract:

The article describes a technology for solving the problem of selecting a complement of emergency rescue equipment, using multicriteria optimization, sequential analysis of variants and evolutionary modeling. Models are developed that serve as the information-analytical basis for forming an integral criterion.
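As a hedged illustration of what an integral criterion can look like, the sketch below folds several normalised partial criteria into one additive score that a multicriteria or evolutionary search could maximise; the criteria names and weights are invented for the example and are not the models developed in the article.

```python
# Generic illustration: an additive integral criterion over normalised partial
# criteria, with weights assumed to sum to 1.

def integral_criterion(partial_scores, weights):
    """Weighted sum of the partial criteria of one equipment configuration."""
    return sum(weights[name] * score for name, score in partial_scores.items())

option = {"coverage": 0.9, "response_time": 0.7, "cost": 0.4}
weights = {"coverage": 0.5, "response_time": 0.3, "cost": 0.2}
print(integral_criterion(option, weights))  # 0.74
```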

Relevance:

80.00%

Publisher:

Abstract:

The article considers the design features of the decision support system "Bezopasnost" ("Safety"), intended to provide information and advisory support for the decision-making of fire unit commanders during firefighting.

Relevance:

80.00%

Publisher:

Abstract:

The article considers the preconditions and main principles of creating virtual laboratories for computer-aided design as tools for interdisciplinary research. The proposed virtual laboratory is worth using at the requirements-specification or EFT stage, because it gives the possibility of quickly estimating the feasibility of the project, certain of its characteristics and, as a result, the expected benefit of its application. The use of these technologies already raises the level of automation of the design stages of new devices for different purposes. The proposed computer technology lets specialists from scientific fields such as chemistry, biology, biochemistry and physics check the possibility of creating a device on the basis of the developed sensors. This reduces the time and cost of designing computer devices and systems at the early stages of design, for example at the requirements-specification or EFT stage. An important feature of this project is the use of an advanced multi-dimensional access method for organizing the information base of the virtual laboratory.

Relevance:

80.00%

Publisher:

Abstract:

Hydrogen bonds play important roles in maintaining the structure of proteins and in the formation of most biomolecular protein-ligand complexes. All amino acids can act as hydrogen bond donors and acceptors. Among amino acids, histidine is unique, as it can exist in neutral or positively charged forms within the physiological pH range of 5.0 to 7.0. Histidine can thus interact with other aromatic residues as well as forming hydrogen bonds with polar and charged residues. The ability of His to exchange a proton lies at the heart of many important functional biomolecular interactions, including immunological ones. Using molecular docking and molecular dynamics simulation, we examine the influence of His protonation/deprotonation on peptide binding affinity to MHC class II proteins from locus HLA-DP. Peptide-MHC interaction underlies the adaptive cellular immune response, upon which the next generation of commercially important vaccines will depend. Consistent with experiment, we find that peptides containing protonated His residues bind better to HLA-DP proteins than those with unprotonated His. Enhanced binding at pH 5.0 is due, in part, to additional hydrogen bonds formed between peptide His+ and DP proteins. In acidic endosomes, protein His79β is predominantly protonated. As a result, the peptide binding cleft narrows in the vicinity of His79β, which stabilizes the peptide-HLA-DP protein complex. © 2014 Bentham Science Publishers.

Relevance:

80.00%

Publisher:

Abstract:

The rapid growth of the Internet and advances in Web technologies have made it possible for users to access large amounts of on-line music data, including music acoustic signals, lyrics, style/mood labels, and user-assigned tags. This progress has made music listening more fun, but has raised the issue of how to organize these data and, more generally, how computer programs can assist users in their music experience. An important subject in computer-aided music listening is music retrieval, i.e., the issue of efficiently helping users locate the music they are looking for. Traditionally, songs were organized in a hierarchical structure such as genre->artist->album->track to facilitate the users’ navigation. However, the intentions of users are often hard to capture in such a simply organized structure: users may want to listen to music of a particular mood, style or topic, and/or to songs similar to some given music samples. This motivated us to work on a user-centric music retrieval system that improves users’ satisfaction with the system. Traditional music information retrieval research was mainly concerned with classification, clustering, identification, and similarity search of acoustic music data by way of feature extraction algorithms and machine learning techniques. More recently, music information retrieval research has focused on utilizing other types of data, such as lyrics, user-access patterns, and user-defined tags, and on targeting non-genre categories for classification, such as mood labels and styles. This dissertation focused on investigating and developing effective data mining techniques for (1) organizing and annotating music data with styles, moods and user-assigned tags; (2) performing effective analysis of music data with features from diverse information sources; and (3) recommending music songs to users utilizing both content features and user access patterns.
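As a hedged illustration of point (3), the following Python sketch blends content-feature similarity with user co-access counts to rank candidate songs; the feature vectors, counts and the blending weight alpha are invented for the example and are not taken from the dissertation.

```python
# Illustrative sketch: ranking candidate songs by blending content-feature
# similarity with co-access counts from user listening histories.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def recommend(query_id, features, co_access, alpha=0.7):
    """Blend content similarity and normalised co-access into one score."""
    q = features[query_id]
    max_co = max(co_access.get((query_id, s), 0) for s in features) or 1
    scores = {}
    for song_id, vec in features.items():
        if song_id == query_id:
            continue
        content = cosine(q, vec)
        usage = co_access.get((query_id, song_id), 0) / max_co
        scores[song_id] = alpha * content + (1 - alpha) * usage
    return sorted(scores, key=scores.get, reverse=True)

# Placeholder content features and co-access counts.
features = {"a": np.array([0.9, 0.1, 0.3]),
            "b": np.array([0.8, 0.2, 0.4]),
            "c": np.array([0.1, 0.9, 0.7])}
co_access = {("a", "b"): 12, ("a", "c"): 3}
print(recommend("a", features, co_access))  # e.g. ['b', 'c']
```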

Relevance:

80.00%

Publisher:

Abstract:

Lung cancer is one of the most common types of cancer and has the highest mortality rate. Patient survival is highly correlated with early detection. Computed Tomography greatly assists the early detection of lung cancer by offering a minimally invasive medical diagnostic tool. However, the large amount of data per examination makes interpretation difficult and leads to nodules being missed by human radiologists. This thesis presents the development of a computer-aided detection (CADe) tool for lung nodules in Computed Tomography studies. The system, called LCD-OpenPACS (Lung Cancer Detection - OpenPACS), is intended to be integrated into the OpenPACS system and to meet all the requirements for use in the workflow of health facilities belonging to SUS (the Brazilian public health system). LCD-OpenPACS makes use of image processing techniques (region growing and watershed), feature extraction (Histogram of Oriented Gradients), dimensionality reduction (Principal Component Analysis) and a classifier (Support Vector Machine). The system was tested on 220 cases, totaling 296 pulmonary nodules, achieving a sensitivity of 94.4% with 7.04 false positives per case. The total processing time was approximately 10 minutes per case. The system detected pulmonary nodules (solitary, juxtavascular, ground-glass opacity and juxtapleural) between 3 mm and 30 mm.
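The classification stage named above (HOG features, PCA reduction, SVM classification) can be expressed compactly with scikit-image and scikit-learn. The sketch below is a generic illustration with placeholder data, not the LCD-OpenPACS implementation; candidate extraction by region growing and watershed is assumed to have produced the patches already.

```python
# Minimal sketch of the HOG -> PCA -> SVM classification stage for candidate
# nodule patches. Patches and labels here are random placeholders.
import numpy as np
from skimage.feature import hog
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

def hog_features(patches):
    """patches: iterable of 2-D candidate regions resampled to a fixed size."""
    return np.array([hog(p, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
                     for p in patches])

# Placeholder data: 64x64 candidate patches with nodule / non-nodule labels.
rng = np.random.default_rng(0)
patches = rng.random((40, 64, 64))
labels = rng.integers(0, 2, size=40)

clf = make_pipeline(PCA(n_components=20), SVC(kernel="rbf", C=1.0))
clf.fit(hog_features(patches), labels)
print(clf.predict(hog_features(patches[:3])))
```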