820 results for Computer-Aided Engineering
Abstract:
The term design in this paper refers particularly to the process (verb) and less to the outcome or product. Design today comprises a complex set of activities involving both man and machine. Sustainability is a fundamental paradigm and carries significance in any process, natural or manmade, and in its outcome. In simple terms, sustainability implies a state of sustainable living, viz., health and continuity, nurtured by diversity and evolution (innovations) in an ever-changing world. Design, along similar lines, has been comprehensively investigated, and its current manifestations, including design aids (Computer Aided Design), have been evaluated in terms of sustainability. The paper investigates the rationale of sustainability for design as a whole - its purpose, its adoption in the natural world, its relevance to humankind and the technologies involved. Throughout its history, technology has been used to aid design. But in the current context of advanced algorithms and computational capacity, design no longer remains an exclusively animate faculty. Given this scenario, investigating sustainability in the light of advanced design aids such as CAD becomes pertinent. Considering that technology plays a part in design activities, the paper explores where technology must play a part, and to what degree, amongst the various activities that comprise design. The study includes an examination of the morphology of design and the development of a systems-thinking integrated forecasting model to evaluate the implications of CAD tools for design and sustainability. The results of the study, along with a broad range of recommendations, are presented. (c) 2012 Elsevier Ltd. All rights reserved.
Abstract:
Flap dynamics of HIV-1 protease (HIV-pr) controls the entry of inhibitors and substrates to the active site. Dynamical models from previous simulations are not all consistent with each other, and not all are supported by the NMR results. In the present work, the effect of the force field on the dynamics of HIV-pr is investigated by MD simulations using three AMBER force fields: ff99, ff99SB, and ff03. The generalized order parameters for the amide backbone are calculated from the three force fields and compared with the NMR S2 values. We found that the order parameters calculated with ff99SB and ff03 agree reasonably well with the NMR S2 values, whereas the ff99-calculated values deviate most from the NMR order parameters. The stereochemical geometry of the protein models from each force field also agrees with the conclusions drawn from the NMR S2 values. However, between ff99SB and ff03 there are several differences, most notably in the loop regions. These loops are, in general, more flexible in the ff03 force field, which results in a larger active-site cavity in the simulation with the ff03 force field. The effect of this difference on computer-aided drug design against flexible receptors is discussed.
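The generalized order parameters mentioned above can be computed from a trajectory with the standard Lipari-Szabo expression. A minimal sketch, assuming the N-H bond vectors have already been extracted from the MD frames (this is not the authors' code):

```python
# A minimal sketch (not the authors' code) of the Lipari-Szabo generalized
# order parameter S2 for a backbone amide N-H vector, computed from an MD
# trajectory. `vectors` is assumed to be an (n_frames, 3) array of N-H bond
# vectors already extracted from the simulation.
import numpy as np

def order_parameter_s2(vectors):
    u = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)  # unit vector per frame
    # Time-averaged products <u_i * u_j> for i, j in {x, y, z}
    avg = np.einsum("ti,tj->ij", u, u) / u.shape[0]
    # S2 = 3/2 * sum_ij <u_i u_j>^2 - 1/2
    return 1.5 * np.sum(avg ** 2) - 0.5

# Example with random vectors (an isotropic distribution gives S2 near 0):
rng = np.random.default_rng(0)
print(order_parameter_s2(rng.normal(size=(5000, 3))))
```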
Abstract:
This paper presents the classification, representation and extraction of deformation features in sheet-metal parts. The thickness is constant for these shape features, and hence they are also referred to as constant-thickness features. A deformation feature is represented as a set of faces with a characteristic arrangement among the faces. Deformation of the base-sheet, or forming of material, creates Bends and Walls with respect to a base-sheet or a reference plane. These are referred to as Basic Deformation Features (BDFs). Compound deformation features, having two or more BDFs, are defined as characteristic combinations of Bends and Walls and are represented as a graph called the Basic Deformation Features Graph (BDFG). The graph therefore represents a compound deformation feature uniquely. The characteristic arrangement of the faces and the type of bends belonging to the feature decide the type and nature of the deformation feature. Algorithms have been developed to extract and identify deformation features from a CAD model of sheet-metal parts. The proposed algorithm does not require folding and unfolding of the part as intermediate steps to recognize deformation features. Representations of typical features are illustrated, and results of extracting these deformation features from typical sheet-metal parts are presented and discussed. (C) 2013 Elsevier Ltd. All rights reserved.
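As an illustration of the representation described above, a compound deformation feature can be held as a small graph whose nodes are Bends and Walls and whose edges record adjacency between them. The sketch below is a hypothetical minimal version, not the paper's implementation:

```python
# Hypothetical minimal sketch of a Basic Deformation Features Graph (BDFG):
# nodes are basic deformation features (Bends, Walls) tied to CAD faces,
# edges record adjacency between those features. Face IDs are illustrative.
from dataclasses import dataclass, field

@dataclass
class BDFNode:
    kind: str      # "Bend" or "Wall"
    faces: list    # identifiers of the faces realizing this feature

@dataclass
class BDFG:
    nodes: dict = field(default_factory=dict)
    edges: set = field(default_factory=set)

    def add_node(self, name, kind, faces):
        self.nodes[name] = BDFNode(kind, faces)

    def connect(self, a, b):
        self.edges.add(frozenset((a, b)))   # undirected adjacency between BDFs

# Example: a flange-like compound feature = Wall - Bend - Wall
g = BDFG()
g.add_node("W1", "Wall", ["F1"])
g.add_node("B1", "Bend", ["F2"])
g.add_node("W2", "Wall", ["F3"])
g.connect("W1", "B1")
g.connect("B1", "W2")
print(len(g.nodes), "nodes,", len(g.edges), "edges")
```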
Abstract:
Tuberculosis (TB) is a life-threatening disease caused by infection with Mycobacterium tuberculosis (Mtb). Since most TB strains have become resistant to various existing drugs, the development of effective novel drug candidates to combat this disease is a pressing need. In spite of intensive research worldwide, the success rate of discovering a new anti-TB drug is very poor. Therefore, novel drug discovery methods have to be tried. We have used a rule-based computational method that utilizes a vertex index, named the `distance exponent index (D-x)' (taken as x = -4 here), for predicting the anti-TB activity of a series of acid alkyl ester derivatives. The method is meant to identify activity-related substructures from a series of compounds and to predict the activity of a compound on that basis. The high degree of successful prediction in the present study suggests that the method may be useful in discovering effective anti-TB compounds. It is also apparent that substructural approaches may be leveraged broadly in computer-aided drug design.
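The abstract does not give the exact definition of the distance exponent index; the sketch below assumes one common form of distance-based vertex index, in which each vertex's value is the sum of its topological distances to all other vertices raised to the power x (here x = -4):

```python
# Hedged sketch of a distance-based vertex index of the general form assumed
# here (the abstract does not define D^x precisely): for each vertex v, the
# index is taken as sum over other vertices u of d(v, u)**x with x = -4,
# where d is the shortest-path (topological) distance in the molecular graph.
import networkx as nx

def distance_exponent_index(graph, x=-4):
    dist = dict(nx.all_pairs_shortest_path_length(graph))
    return {
        v: sum(d ** x for u, d in dist[v].items() if u != v)
        for v in graph.nodes
    }

# Example on a toy hydrogen-suppressed graph (a 4-atom chain):
g = nx.path_graph(4)
print(distance_exponent_index(g))
```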
Abstract:
In the domain of manual mechanical assembly, expert knowledge is an important means of supporting assembly planning and leads to fewer issues during actual assembly. Knowledge-based systems can be used to provide assembly planners with expert knowledge as advice. However, acquisition of knowledge remains a difficult task to automate, while manual acquisition is tedious, time-consuming, and requires the engagement of knowledge engineers with specialist knowledge to understand and translate expert knowledge. This paper describes the development, implementation and preliminary evaluation of a method that asks an expert a series of questions so as to automatically acquire the necessary diagnostic and remedial knowledge as rules for use in a knowledge-based system that helps assembly planners diagnose and resolve issues. The method, called a questioning procedure, organizes its questions around an assembly situation, which it presents to the expert as the context, and adapts its questions based on the answers it receives from the expert. (C) 2014 Elsevier Ltd. All rights reserved.
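A hypothetical, much-simplified sketch of such a questioning procedure is shown below: the assembly situation is presented as context, follow-up questions are built from the expert's previous answers, and the result is stored as an if-then rule. The question wording and rule format are illustrative assumptions, not the authors' design:

```python
# Hypothetical sketch of an adaptive questioning loop that turns expert
# answers about an assembly situation into if-then diagnostic/remedial rules
# for a knowledge-based system. Not the paper's implementation.
def questioning_procedure(situation, ask):
    rules = []
    symptom = ask(f"In the situation '{situation}', what issue might the planner see?")
    cause = ask(f"What is a likely cause of '{symptom}'?")
    remedy = ask(f"How should the planner resolve '{cause}'?")
    rules.append({"if": {"situation": situation, "symptom": symptom, "cause": cause},
                  "then": {"advice": remedy}})
    # A real procedure would keep adapting and asking until the expert has
    # nothing to add; a single pass is shown here for brevity.
    return rules

# Example with canned answers standing in for the expert:
answers = iter(["part does not seat fully", "burr on mating face", "deburr before assembly"])
print(questioning_procedure("insert shaft into housing", lambda q: next(answers)))
```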
Abstract:
While absorption and emission spectroscopy have always been used to detect and characterize molecules and molecular complexes, the availability of ultrashort laser pulses and associated computer-aided optical detection techniques has allowed chemical processes to be studied directly in the time domain at unprecedented time scales, through the appearance and disappearance of fluorescence from the participating chemical species. Application of such techniques to chemical dynamics in liquids, where many processes occur on picosecond and femtosecond time scales, led to the discovery of a host of new phenomena, which in turn led to the development of many new theories. Experiment and theory together provided new and valuable insight into many fundamental chemical processes, such as isomerization dynamics, electron and proton transfer reactions, vibrational energy and phase relaxation, and photosynthesis, to name just a few. In this article, we review a few such discoveries in an attempt to provide a glimpse of the fascinating research employing fluorescence spectroscopy that changed the field of chemical dynamics forever.
Abstract:
[ES] The project studies edge-detection algorithms applied to photographic images and to images obtained from point clouds, then combines the results and analyzes the possibilities for improving the combined solution.
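A hedged sketch of the general idea, using a simple gradient-magnitude detector on a photographic image and on a depth image rendered from a point cloud, then fusing the two edge maps (the project itself evaluates more sophisticated detectors and combination strategies):

```python
# Hedged sketch: detect edges in a photographic image and in a depth image
# rendered from a point cloud, then combine the two edge maps. A plain
# gradient-magnitude detector stands in for the algorithms studied.
import numpy as np

def gradient_edges(image, threshold):
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold

def combine_edges(photo_edges, cloud_edges):
    # A conservative fusion: keep edges reported by either source
    return photo_edges | cloud_edges

# Example with synthetic data standing in for a photo and a rendered depth map:
photo = np.zeros((64, 64)); photo[:, 32:] = 255.0
depth = np.zeros((64, 64)); depth[32:, :] = 5.0
print(combine_edges(gradient_edges(photo, 50), gradient_edges(depth, 1)).sum(), "edge pixels")
```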
Abstract:
In the last decades, big improvements have been made in the field of computer-aided learning, building on advances in computer science and computer systems. Although the field has always lagged somewhat behind, not using the very latest solutions, it has steadily moved forward, profiting from innovations as they appear. As long as the advance of computer science does not stop (and it will not, at least in the near future), the systems that profit from those improvements will not stop either, because we humans will always need to study, sometimes for pleasure and many other times out of need. Not all the attempts in the field of computer-aided learning have been in the same direction. Most of them address one or a few of the problems that arise while studying and do not take into account solutions proposed for other problems. The reasons for this vary. Sometimes the solutions simply are not compatible. At other times, because the project is a research effort, it is useful to isolate the problem. And, in commercial products, licenses and patents often prevent new projects from using previous work. The world has moved forward, and this is an attempt to use some of the options offered by technology, mixing some old ideas with new ones.
Abstract:
The implementation of various types of marine protected areas is one of several management tools available for conserving representative examples of the biological diversity within marine ecosystems in general and National Marine Sanctuaries in particular. However, deciding where and how many sites to establish within a given area is frequently hampered by incomplete knowledge of the distribution of organisms and by an incomplete understanding of the potential tradeoffs that would allow planners to address frequently competing interests in an objective manner. Fortunately, this is beginning to change. Recent studies on the continental shelf of the northeastern United States suggest that substrate and water mass characteristics are highly correlated with the composition of benthic communities and may therefore serve as proxies for the distribution of biological diversity. A detailed geo-referenced interpretative map of major sediment types within Stellwagen Bank National Marine Sanctuary (SBNMS) has recently been developed, and computer-aided decision support tools have reached new levels of sophistication. We demonstrate the use of simulated annealing, a type of mathematical optimization, to identify suites of potential conservation sites within SBNMS that equally represent 1) all major sediment types and 2) derived habitat types based on both sediment and depth in the smallest amount of space. The Sanctuary was divided into 3610 0.5-min² sampling units. Simulations incorporated constraints on the physical dispersion of sampling units to varying degrees, such that solutions included between one and four site clusters. Target representation goals were set at 5, 10, 15, 20, and 25 percent of each sediment type, and 10 and 20 percent of each habitat type. Simulations consisted of 100 runs, from which we identified the best solution (i.e., smallest total area) and four near-optimal alternates. We also plotted the total number of instances in which each sampling unit occurred in the solution sets of the 100 runs as a means of gauging the variety of spatial configurations available under each scenario. Results suggested that the total combined area needed to represent each of the sediment types in equal proportions was equal to the percent representation level sought. Slightly larger areas were required to represent all habitat types at the same representation levels. Total boundary length increased in direct proportion to the number of sites at all levels of representation for simulations involving sediment and habitat classes, but increased more rapidly with the number of sites at higher representation levels. There were a large number of alternate spatial configurations at all representation levels, although generally fewer among one- and two-site versus three- and four-site solutions. These differences were less pronounced among simulations targeting habitat representation, suggesting that a similar degree of flexibility is inherent in the spatial arrangement of potential protected area systems containing one versus several sites for similar levels of habitat representation. We attribute these results to the distribution of sediment and depth zones within the Sanctuary, and to the fact that even levels of representation were sought in each scenario. (PDF contains 33 pages.)
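A hedged sketch of site selection by simulated annealing in the spirit of the approach above (not the study's actual code): sampling units carry an area and a sediment type, and the objective penalizes total selected area plus any shortfall against a target representation fraction for each sediment type:

```python
# Hedged sketch of reserve-site selection by simulated annealing. The cost
# function, cooling schedule, and data layout are illustrative assumptions.
import math, random

def anneal(units, target_fraction, steps=20000, penalty=1000.0):
    totals = {}
    for u in units:
        totals[u["sediment"]] = totals.get(u["sediment"], 0.0) + u["area"]

    def cost(selected):
        area = sum(units[i]["area"] for i in selected)
        shortfall = 0.0
        for sed, total in totals.items():
            rep = sum(units[i]["area"] for i in selected if units[i]["sediment"] == sed)
            shortfall += max(0.0, target_fraction * total - rep)
        return area + penalty * shortfall   # small area, but meet every target

    selected = set()
    current = cost(selected)
    for step in range(steps):
        temp = 1.0 - step / steps                  # linear cooling schedule
        i = random.randrange(len(units))           # propose flipping one unit in/out
        candidate = selected ^ {i}
        delta = cost(candidate) - current
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-6)):
            selected, current = candidate, current + delta
    return selected

units = [{"area": 0.25, "sediment": random.choice("ABC")} for _ in range(200)]
picked = anneal(units, target_fraction=0.10)
print(len(picked), "sampling units selected")
```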
Abstract:
For 10 years the Institute for Fishing Technology, Hamburg (IFH) has been carrying out experiments in the brown shrimp fishery with beam trawls aiming at a reduction of unwanted bycatches. When the tests were transferred to commercial fishery conditions, the personnel effort and costs increased markedly. It became necessary, for example, to install a deep-freeze chain to make it possible to evaluate more samples in the laboratory. This in turn required an increase in the number of technicians measuring the fish and shrimp samples, but also made it necessary to perform this work in the most rational and time-saving way by applying modern electronic aids. Though all samples still have to be sorted by species, weighed and measured, with the introduction of electronic aids such as an electronic measuring board and a computer-aided image processing system, all weight and length data are now recorded digitally immediately after processing. They are transferred via a network to a server PC, which stores them in a purpose-designed database. This article describes the application of two electronic systems: the measuring board (FM 100, Fa. SCANTROL), initiated by a project at the Norwegian Institute for Fishing Technology, and a computer-aided image processing system focused on measuring shrimps in their naturally flexed shape, developed at the Institute for Fishing Technology in close collaboration with the University of Duisburg. These electronic recording systems allow consistent and reproducible recording of data, independent of the changing day-to-day form of the staff operating them. With the help of these systems the number of measurements in the laboratory could be increased to 250 000 per year. This made it possible to evaluate, in 1999, 525 catch samples from 75 commercial hauls taken during 15 days at sea. The time gain in measuring the samples is about one third of the time previously needed (i.e. one hour per sample). An additional advantage is the immediate availability of the digitally stored data, which enables rapid analyses of all finished subexperiments. Both systems are applied today in several institutes of the Federal Research Centre. The image processing system is now the standard measuring method in an international research project.
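A minimal sketch of the kind of purpose-designed database described above, with illustrative (not the institute's) table and column names for individual length/weight records arriving over the network:

```python
# Hypothetical minimal schema for storing length/weight records from the
# measuring board or image processing system; names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE measurements (
        haul_id     TEXT,
        species     TEXT,
        length_mm   REAL,
        weight_g    REAL,
        recorded_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
conn.execute(
    "INSERT INTO measurements (haul_id, species, length_mm, weight_g) VALUES (?, ?, ?, ?)",
    ("1999-07-A12", "Crangon crangon", 54.3, 1.1),
)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM measurements").fetchone()[0], "record stored")
```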
Abstract:
This thesis explores the design, construction, and applications of the optoelectronic swept-frequency laser (SFL). The optoelectronic SFL is a feedback loop designed around a swept-frequency (chirped) semiconductor laser (SCL) to control its instantaneous optical frequency, such that the chirp characteristics are determined solely by a reference electronic oscillator. The resultant system generates precisely controlled optical frequency sweeps. In particular, we focus on linear chirps because of their numerous applications. We demonstrate optoelectronic SFLs based on vertical-cavity surface-emitting lasers (VCSELs) and distributed-feedback lasers (DFBs) at wavelengths of 1550 nm and 1060 nm. We develop an iterative bias current predistortion procedure that enables SFL operation at very high chirp rates, up to 10^16 Hz/sec. We describe commercialization efforts and implementation of the predistortion algorithm in a stand-alone embedded environment, undertaken as part of our collaboration with Telaris, Inc. We demonstrate frequency-modulated continuous-wave (FMCW) ranging and three-dimensional (3-D) imaging using a 1550 nm optoelectronic SFL.
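The FMCW ranging principle used throughout this work relates the measured beat frequency to the target range through the chirp rate; a minimal numerical sketch, with an arbitrary illustrative target range:

```python
# Minimal sketch of FMCW ranging: a linear chirp of rate xi delayed by the
# round trip to the target beats against the outgoing chirp at
# f_b = xi * 2R / c, so range follows from the measured beat frequency.
# The chirp rate matches the 10^16 Hz/sec figure above; the range is
# an arbitrary illustrative value.
c = 299_792_458.0      # speed of light, m/s
xi = 1.0e16            # chirp rate, Hz/s
R = 0.30               # hypothetical target range, m

f_beat = xi * (2 * R / c)
print(f"beat frequency: {f_beat / 1e6:.1f} MHz")

R_recovered = c * f_beat / (2 * xi)
print(f"recovered range: {R_recovered:.3f} m")
```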
We develop the technique of multiple source FMCW (MS-FMCW) reflectometry, in which the frequency sweeps of multiple SFLs are "stitched" together in order to increase the optical bandwidth, and hence improve the axial resolution, of an FMCW ranging measurement. We demonstrate computer-aided stitching of DFB and VCSEL sweeps at 1550 nm. We also develop and demonstrate hardware stitching, which enables MS-FMCW ranging without additional signal processing. The culmination of this work is the hardware stitching of four VCSELs at 1550 nm for a total optical bandwidth of 2 THz, and a free-space axial resolution of 75 microns.
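The quoted free-space axial resolution follows from the standard FMCW relation, axial resolution = c / (2B), where B is the total optical bandwidth; a quick check under that assumption:

```python
# Quick check of the quoted free-space axial resolution: delta_z = c / (2 * B).
c = 299_792_458.0   # speed of light, m/s
B = 2.0e12          # stitched optical bandwidth, Hz (2 THz)
delta_z = c / (2 * B)
print(f"{delta_z * 1e6:.0f} microns")   # about 75 microns, as stated above
```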
We describe our work on the tomographic imaging camera (TomICam), a 3-D imaging system based on FMCW ranging that features non-mechanical acquisition of transverse pixels. Our approach uses a combination of electronically tuned optical sources and low-cost full-field detector arrays, completely eliminating the need for the moving parts traditionally employed in 3-D imaging. We describe the basic TomICam principle, and demonstrate single-pixel TomICam ranging in a proof-of-concept experiment. We also discuss the application of compressive sensing (CS) to the TomICam platform, and perform a series of numerical simulations. These simulations show that tenfold compression is feasible in CS TomICam, which effectively improves the volume acquisition speed by a factor of ten.
We develop chirped-wave phase-locking techniques, and apply them to coherent beam combining (CBC) of chirped-seed amplifiers (CSAs) in a master oscillator power amplifier configuration. The precise chirp linearity of the optoelectronic SFL enables non-mechanical compensation of optical delays using acousto-optic frequency shifters, and its high chirp rate simultaneously increases the stimulated Brillouin scattering (SBS) threshold of the active fiber. We characterize a 1550 nm chirped-seed amplifier coherent-combining system. We use a chirp rate of 5*10^14 Hz/sec to increase the amplifier SBS threshold threefold, when compared to a single-frequency seed. We demonstrate efficient phase-locking and electronic beam steering of two 3 W erbium-doped fiber amplifier channels, achieving temporal phase noise levels corresponding to interferometric fringe visibilities exceeding 98%.