908 results for "computer-aided design tool"

Relevance: 100.00%

Publisher:

Abstract:

A multi-channel complex machine tool (MCCM) is a versatile machining system equipped with multiple spindles and turrets for both turning and milling operations. Despite the potential of such a tool, the value of the hardware depends largely on how effectively the machine is programmed for machining. In this paper we consider a shop-floor programming system (called e-CAM) based on ISO 14649, the international standard for the interface between computer-aided manufacturing (CAM) and computer numerical control (CNC). Before such a system can be deployed in practical industrial use, a great deal of research has to be carried out. In this paper we present: 1) design considerations for an e-CAM system, 2) the architecture of e-CAM, 3) the major algorithms that realize the modules defined in the architecture, and 4) implementation details.
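As a rough illustration of the kind of process data ISO 14649 describes, the sketch below models a workplan whose workingsteps are assigned to machining channels. The class names, fields, and the channel-assignment helper are simplified, hypothetical stand-ins for the standard's entities, not its actual EXPRESS schema:

```python
from dataclasses import dataclass, field

# Simplified, hypothetical model of ISO 14649-style process data.
# Workingstep/Workplan are stand-ins for the standard's entities.

@dataclass
class Workingstep:
    name: str
    operation: str   # e.g. "turning_roughing", "side_milling"
    channel: int     # which spindle/turret pair executes the step

@dataclass
class Workplan:
    steps: list = field(default_factory=list)

    def by_channel(self, channel: int):
        """Ordered workingsteps assigned to one machining channel."""
        return [s for s in self.steps if s.channel == channel]

plan = Workplan()
plan.steps.append(Workingstep("ws1", "turning_roughing", channel=1))
plan.steps.append(Workingstep("ws2", "side_milling", channel=2))
plan.steps.append(Workingstep("ws3", "turning_finishing", channel=1))

assert [s.name for s in plan.by_channel(1)] == ["ws1", "ws3"]
```

Describing the machining task at this level (rather than as G-code) is what lets a shop-floor system reassign work between channels.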

NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail for future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed, domain-specific language (DSL) for specifying network configurations at various levels of sketchiness, along with invariants to be enforced thereupon. In this paper, we give an overview of NetSketch, highlight its salient features, and illustrate how it can be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user interface when used in "sketch mode", and prove its soundness relative to appropriately defined notions of validity.
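The compositional flavour of this kind of analysis can be illustrated with a toy interval check. The `Component` class and `composable` function below are our own illustrative names, not NetSketch's DSL, whose types are considerably richer:

```python
# Toy compositional safety check: each component is "typed" by an
# interval of flow rates it accepts and an interval it may emit;
# composing two components is safe when the producer's output interval
# is contained in the consumer's input interval. (Illustrative only;
# not NetSketch's actual type system.)

class Component:
    def __init__(self, name, accepts, emits):
        self.name = name
        self.accepts = accepts  # (lo, hi) admissible inflow
        self.emits = emits      # (lo, hi) possible outflow

def composable(upstream, downstream):
    lo, hi = upstream.emits
    return downstream.accepts[0] <= lo and hi <= downstream.accepts[1]

source = Component("src", accepts=(0, 0), emits=(2, 8))
shaper = Component("shaper", accepts=(0, 10), emits=(0, 5))
sink   = Component("sink", accepts=(0, 4), emits=(0, 0))

assert composable(source, shaper)     # 2..8 fits within 0..10
assert not composable(shaper, sink)   # 0..5 exceeds the sink's 0..4
```

The point of such interval types is that subsystems can be checked pairwise, without re-analysing the whole network.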

In this work we introduce a new mathematical tool for the optimization of routes, topology design, and energy efficiency in wireless sensor networks. We introduce a vector field formulation that models communication in the network, and routing is performed in the direction of this vector field at every location of the network. The magnitude of the vector field at every location represents the density of the data traffic being transported through that location. We define the total communication cost in the network as the integral of a quadratic form of the vector field over the network area. With this formulation, we introduce mathematical machinery based on partial differential equations that closely resembles Maxwell's equations in electrostatics. We show that in order to minimize the cost, routes should be found based on the solution of these partial differential equations. In our formulation, the sensors are sources of information, similar to positive charges in electrostatics; the destinations are sinks of information, similar to negative charges; and the network is similar to a non-homogeneous dielectric medium with a variable dielectric constant (or permittivity coefficient). In one application of our mathematical model based on vector fields, we offer a scheme for energy-efficient routing. Our routing scheme is based on raising the permittivity coefficient in the places of the network where nodes have high residual energy, and lowering it where the nodes do not have much energy left. Our simulations show that our method gives a significant increase in network lifetime compared to the shortest-path and weighted shortest-path schemes. Our initial focus is on the case where there is only one destination in the network; later we extend our approach to the case of multiple destinations.
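A minimal numerical sketch of the electrostatics analogy follows. The grid size, unit charges, Jacobi iteration count, and the uniform-permittivity simplification are all our illustrative choices, not the paper's implementation:

```python
import numpy as np

# Toy relaxation of the electrostatics analogy: a sensor is a positive
# charge, the destination a negative one, and we solve a discrete
# Poisson equation for the potential phi on a grid (permittivity taken
# as 1 everywhere for simplicity; the paper varies it with residual
# energy). Routes then follow the field E = -grad(phi).

n = 21
rho = np.zeros((n, n))
rho[5, 5] = 1.0      # source of information (positive charge)
rho[15, 15] = -1.0   # destination (negative charge)

phi = np.zeros((n, n))
for _ in range(2000):  # Jacobi iteration, phi = 0 on the boundary
    phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                              phi[1:-1, :-2] + phi[1:-1, 2:] +
                              rho[1:-1, 1:-1])

gy, gx = np.gradient(-phi)   # the routing direction at every grid point
assert phi[5, 5] > 0 > phi[15, 15]
```

Raising the permittivity where nodes have spare energy would flatten the potential there, steering the field (and hence the routes) through energy-rich regions.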
In the case of multiple destinations, we need to partition the network into several areas known as the regions of attraction of the destinations. Each destination is responsible for collecting all messages generated in its region of attraction. The difficulty of the optimization problem in this case lies in how to define the regions of attraction and how much communication load to assign to each destination in order to optimize the performance of the network. We use our vector field model to solve the optimization problem for this case. We define a conservative vector field, which can therefore be written as the gradient of a scalar field (also known as a potential field). We then show that in the optimal assignment of the communication load of the network to the destinations, the value of that potential field must be equal at the locations of all the destinations. Another application of our vector field model is finding the optimal locations of the destinations in the network. We show that the vector field gives the gradient of the cost function with respect to the locations of the destinations. Based on this fact, we suggest an algorithm to be applied during the design phase of a network to relocate the destinations and so reduce the communication cost. The performance of our proposed schemes is confirmed by several examples and simulation experiments. In another part of this work we focus on the notions of responsiveness and conformance of TCP traffic in communication networks. We introduce the notion of responsiveness for TCP aggregates and define it as the degree to which a TCP aggregate reduces its sending rate as a response to packet drops. We define metrics that describe the responsiveness of TCP aggregates, and suggest two methods for determining the values of these quantities.
The first method is based on a test in which we intentionally drop a few packets from the aggregate and measure the resulting rate decrease of that aggregate. This kind of test is not robust to multiple simultaneous tests performed at different routers. We make the test robust to multiple simultaneous tests by using ideas from the CDMA approach to multiple-access channels in communication theory. Based on this approach, we introduce a test of responsiveness for aggregates, which we call the CDMA-based Aggregate Perturbation Method (CAPM). We use CAPM to perform congestion control. A distinguishing feature of our congestion control scheme is that it maintains a degree of fairness among different aggregates. In the next step we modify CAPM to offer methods for estimating the proportion of an aggregate of TCP traffic that does not conform to protocol specifications, and hence may belong to a DDoS attack. Our methods work by intentionally perturbing the aggregate, dropping a very small number of packets from it and observing the response of the aggregate. We offer two methods for conformance testing. In the first method, we apply the perturbation tests to SYN packets sent at the start of the TCP 3-way handshake, and we use the fact that the rate of ACK packets exchanged in the handshake should follow the rate of perturbations. In the second method, we apply the perturbation tests to TCP data packets and use the fact that the rate of retransmitted data packets should follow the rate of perturbations. In both methods, we use signature-based perturbations, which means packet drops are performed at a rate given by a function of time. We exploit the analogy between our problem and multiple-access communication to find suitable signatures. Specifically, we assign orthogonal CDMA-based signatures to different routers in a distributed implementation of our methods. As a result of this orthogonality, performance does not degrade because of cross-interference between simultaneously testing routers. We have shown the efficacy of our methods through mathematical analysis and extensive simulation experiments.
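The orthogonal-signature idea can be sketched with Walsh-Hadamard codes. The linear superposition model below is a toy stand-in for the real aggregate response, and the decoding step is our illustration of why simultaneous tests at different routers do not interfere:

```python
import numpy as np

# Toy illustration of the CDMA idea: each router perturbs the aggregate
# with its own orthogonal Walsh signature; correlating the observed rate
# change with one signature recovers that router's response despite
# simultaneous tests elsewhere. (Linear response model assumed.)

def walsh(k):
    """Walsh-Hadamard matrix of order 2**k; its rows are orthogonal codes."""
    h = np.array([[1.0]])
    for _ in range(k):
        h = np.block([[h, h], [h, -h]])
    return h

codes = walsh(3)                                 # 8 orthogonal signatures
resp = np.array([2.0, 0, 5.0, 0, 0, 0, 0, 0])   # per-router responsiveness
observed = resp @ codes                          # superposed response

# Decode: correlate with each code and normalise by the code length.
recovered = observed @ codes.T / codes.shape[1]
assert np.allclose(recovered, resp)
```

Because the rows of the Walsh matrix are mutually orthogonal, each correlation isolates exactly one router's contribution, which is the robustness property claimed above.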

The intrinsically independent features of the search for optimal codebook cubes in fractal video compression systems are examined and exploited. The design of a parallel algorithm reflecting this concept is presented. The Message Passing Interface (MPI) was chosen as the communication tool for implementing the parallel algorithm on distributed-memory parallel computers. Experimental results show that the parallel algorithm reduces the compression time and achieves a high speed-up without changing the compression ratio or the quality of the decompressed images. A scalability test was also performed, and the results show that the parallel algorithm is scalable.
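The data decomposition behind such a search can be sketched as follows. The scalar "cubes" and absolute-difference cost are toy stand-ins for the real block-matching error, and the serial loop merely simulates what an MPI scatter/reduce would do across ranks:

```python
# Toy stand-in for the parallel codebook search: the "cubes" are scalars
# and the matching error is an absolute difference; a real implementation
# compares image blocks and uses MPI to scatter chunks and reduce the
# best match, which the serial loop below merely simulates.

def best_match(range_cube, codebook_chunk):
    """Exhaustive search of one chunk; returns (error, cube)."""
    return min((abs(c - range_cube), c) for c in codebook_chunk)

def parallel_search(range_cube, codebook, n_ranks):
    chunks = [codebook[r::n_ranks] for r in range(n_ranks)]   # scatter
    local = [best_match(range_cube, ch) for ch in chunks if ch]
    return min(local)                                         # reduce

codebook = [3, 14, 15, 9, 26, 5, 35, 8]
err, cube = parallel_search(10, codebook, n_ranks=4)
assert (err, cube) == (1, 9)
```

Because each chunk is searched independently, the result is identical for any number of ranks, which is why parallelisation leaves the compression ratio and image quality unchanged.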

The evolution of supervisory systems makes extracting significant information from processes increasingly important, as it simplifies the particular tasks of the supervision system. It is therefore important to have signal-treatment tools capable of obtaining elaborate information from the process data. In this paper, a tool that obtains qualitative data about the trends and oscillations of signals is presented, together with an application of the tool. In this case, the tool, implemented in a computer-aided control systems design (CACSD) environment, is used to supply qualitative information to an expert system for fault detection in a laboratory plant.
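A minimal sketch of this kind of qualitative abstraction, assuming illustrative thresholds and labels (the paper's actual episode definitions and tuning are not reproduced here):

```python
# Illustrative qualitative abstraction of a signal window: the trend is
# the sign of the mean slope, and the window is flagged oscillatory when
# the de-trended samples change sign often. Thresholds are arbitrary.

def qualitative(window, slope_tol=0.01, min_crossings=3):
    n = len(window)
    mean = sum(window) / n
    slope = (window[-1] - window[0]) / (n - 1)
    detrended = [x - mean for x in window]
    crossings = sum(1 for a, b in zip(detrended, detrended[1:]) if a * b < 0)
    if abs(slope) <= slope_tol:
        trend = "steady"
    else:
        trend = "increasing" if slope > 0 else "decreasing"
    return trend, crossings >= min_crossings

assert qualitative([0, 1, 2, 3, 4]) == ("increasing", False)
assert qualitative([0.1, 1, -1, 1, -1, 1, -1, 0.1]) == ("steady", True)
```

Labels of this kind ("steady but oscillating") are exactly the sort of symbolic input an expert system for fault detection can reason over, unlike raw samples.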

One of the first questions to consider when designing a new roll forming line is the number of forming steps required to produce a profile. The number depends on material properties, the cross-section geometry and tolerance requirements, but the tool designer also wants to minimize the number of forming steps in order to reduce the investment costs for the customer. There are several computer-aided engineering systems on the market that can assist the tool design process. These include more or less simple formulas to predict deformation during forming as well as the number of forming steps. In recent years it has also become possible to use finite element analysis for the design of roll forming processes. The objective of the work presented in this thesis was to answer the following question: How should the roll forming process be designed for complex geometries and/or high-strength steels? The work approach included literature studies as well as experimental and modelling work. The experimental part gave direct insight into the process and was also used to develop and validate models of the process. Starting with simple geometries and standard steels, the work progressed to more complex profiles of variable depth and width, made of high-strength steels. The results obtained are published in seven papers appended to this thesis. In the first study (see paper 1) a finite element model for investigating the roll forming of a U-profile was built. It was used to investigate the effect on longitudinal peak membrane strain and deformation length when yield strength increases, see papers 2 and 3. The simulations showed that the peak strain decreases whereas the deformation length increases when the yield strength increases. The studies described in papers 4 and 5 measured roll load, roll torque, springback and strain history during the U-profile forming process. The measurement results were used to validate the finite element model of paper 1.
The results presented in paper 6 show that the formability of stainless steel (e.g. AISI 301), which in the cold-rolled condition has a large martensite fraction, can be substantially increased by heating the bending zone. The heated area then becomes austenitic and ductile before roll forming. Thanks to the phenomenon of strain-induced martensite formation, the steel regains its martensite content and strength during the subsequent plastic straining. Finally, a new tooling concept for profiles with variable cross-sections is presented in paper 7. The overall conclusions of the present work are that it is today possible to successfully develop profiles of complex geometries (3D roll forming) in high-strength steels, and that finite element simulation can be a useful tool in the design of the roll forming process.

This paper explores non-deterministic parametric modelling as a design tool. Specifically, it addresses the application of parametric variables to the generation of a conceptual bridge design and the application of repeatable discrete components to the conceptual form. In order to control the generation of the bridge form, a set of design variables based on the concept of a law curve has been developed. These design variables are applied and tested through interactive modelling and variation, driven by manipulating the law curve. Combining this process with the application and control of a repeatable element, known as a Representative Volumetric Element (RVE), allows for the development and exploration of a design solution that could not be achieved through the use of conventional computer modelling. The competition brief for the Australian Institute of Architects (AIA) ‘Dialectical Bridge’ has been used as a case study to demonstrate the use of non-deterministic parametric modelling as a design tool. The results of the experimentation with parametric variables, the law curve and the RVEs are presented in the paper.
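How a single law curve can drive a parametric form can be sketched as below. The sine-based curve, span, and depth mapping are our illustrative assumptions, not the actual competition entry:

```python
import math

# Illustrative law curve: one scalar function of the span parameter
# controls section depth, so editing this single curve regenerates the
# entire bridge geometry. Curve shape and dimensions are assumptions.

def law_curve(t, amplitude=3.0, base=1.0):
    """Section depth at parameter t in [0, 1]; deepest at mid-span."""
    return base + amplitude * math.sin(math.pi * t)

def generate_sections(span=60.0, n=13):
    """One (station, depth) pair per sample station along the span."""
    return [(span * i / (n - 1), law_curve(i / (n - 1))) for i in range(n)]

sections = generate_sections()
depths = [d for _, d in sections]
assert max(depths) == depths[6]       # mid-span section is the deepest
assert abs(depths[0] - 1.0) < 1e-9    # base depth at the abutment
```

Repeatable elements (such as the RVEs described above) would then be instanced at each generated section, so varying the law curve alone re-generates every component.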

This paper reports on three approaches to the translation of Gaussian surface models into scaled physical prototype models. Using the geometry of Eladio Dieste's Gaussian vaults, the paper reports on the issues encountered in the process of digital-to-physical prototype fabrication. The primary focus of the paper is on exploring the design geometry, investigating methods for preparing the geometry for fabrication, and constructing physical prototypes. Three different approaches to the translation from digital to physical models are investigated: rapid prototyping, two-dimensional surface models in paper, and structural component models using computer numerical controlled (CNC) fabrication. The three approaches identify a body of knowledge in the design and prototyping of Gaussian vaults. Finally, the paper discusses the digital-to-fabrication translation processes with regard to the characteristics, benefits and limitations of the three approaches to prototyping the ruled surface geometry of Gaussian vaults.
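The ruling property that makes such surfaces amenable to strip-based paper models and CNC-cut components can be sketched as follows. The two edge arcs below are illustrative stand-ins, not Dieste's measured geometry:

```python
import math

# Illustrative ruled surface: every point lies on a straight ruling
# joining corresponding points of two edge arcs, which makes strip-wise
# approximation for paper models and CNC-cut components straightforward.
# The arcs are stand-ins, not Dieste's measured geometry.

def directrix(x, rise):
    """A shallow arc of unit span with the given mid-span rise."""
    return rise * math.sin(math.pi * x)

def ruled_point(u, v, rise_a=0.2, rise_b=0.5):
    """Blend linearly along the ruling between the two edge arcs."""
    za = directrix(u, rise_a)
    zb = directrix(u, rise_b)
    return (u, v, (1 - v) * za + v * zb)

x, y, z = ruled_point(0.5, 0.5)
assert abs(z - 0.35) < 1e-9   # halfway along a ruling: mean of the rises
```

Because each ruling is a straight segment, a physical prototype can be assembled from straight components (or flat strips) sampled at discrete values of the ruling parameter.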

Online communications, multimedia, mobile computing and face-to-face learning create blended learning environments to which some Virtual Design Studios (VDS) have reacted. Social Networks (SN), as instruments for communication, have provided a potentially fruitful operative base for VDS. These technologies transfer communication, leadership, democratic interaction, teamwork, social engagement and responsibility away from the design tutors to the participants. The implementation of the Social Network VDS (SNVDS) moved the VDS beyond its conventional realm and enabled students to develop architectural designs embedded in a community of learners and expertise, both online and offline. Problem-based learning (PBL) becomes an iterative and reflexive process facilitating deep learning. The paper discusses details of the SNVDS, its pedagogical implications for PBL, and presents how the SNVDS is successful in enabling architectural students to collaborate on and communicate design proposals that integrate a variety of skills, deep learning, knowledge, and construction with a rich learning experience.

With the advent of social networks, it became apparent that the social aspect of designing and learning plays a crucial role in students’ education. Technologies and skills are the base on which learners interact. Ease of communication, leadership opportunities, democratic interaction, teamwork, and a sense of community are some of the aspects that are now at the centre of design interaction. The paper examines Virtual Design Studios (VDS) that used media-rich platforms and analyses the influence the social aspect has on problem solving, using a design studio at Deakin University as a sample. It studies the effectiveness of the generated social intelligence and explores the facilitation of students’ self-directed learning. The paper thereby studies the construction of knowledge via social interaction and how blended learning environments foster motivation and information exchange. It presents its findings based on VDS held over the past three years.

Representational media (analogue, physical, digital, or virtual) are employed by students in the conception, development and presentation of their designs. In 2013 a survey was conducted at two architectural schools to study current representational media use in design studios. The survey examined the role digital and physical media play in students’ design work and how students use the various media to generate and communicate their designs. The importance of this study lies in the shift in architectural education whereby digital tools are no longer taught per se, yet are expected to be mastered throughout the course. Students’ learning experiences are strongly dependent on the successful acquisition of skills and their transfer to deep learning. Architectural design studios in particular build upon the premise that re-representation leads to a better acquisition of knowledge. Architectural educators may use the study to revisit their studios and reposition the role of media, as well as align learning outcomes, deliverables and communication tools with the actual working and learning styles of students. © 2014, The Association for Computer-Aided Architectural Design Research in Asia (CAADRIA), Hong Kong.

The use of computer-assisted technologies such as CAD (computer-aided design), CAM (computer-aided manufacturing) and CNC (computer numerical control), increasingly demanded by the market, is needed in the teaching of technical drawing and design subjects in engineering and design courses. However, their use finds barriers in the more conservative wing of the academy, which advocates the use of traditional drawing for settling concepts and developing spatial reasoning. This study aimed to present the results obtained with the design and production of a measuring apparatus on a three-dimensional computer-aided milling machine, fostering interaction, integration and the consolidation of concepts, and demonstrating that learning computer-assisted technology is possible, and that its use is more appropriate, meaningful and productive than the use of the instruments of classical drawing.

Drug discovery has moved toward more rational strategies based on our increasing understanding of the fundamental principles of protein-ligand interactions. Structure-based (SBDD) and ligand-based drug design (LBDD) approaches bring together the most powerful concepts in modern chemistry and biology, linking medicinal chemistry with structural biology. The definition and assessment of both chemical and biological space have revitalized the importance of exploring the intrinsic complementary nature of experimental and computational methods in drug design. Major challenges in this field include the identification of promising hits and the development of high-quality leads for further development into clinical candidates. This becomes particularly important in the case of neglected tropical diseases (NTDs), which disproportionately affect poor people living in rural and remote regions worldwide, and for which an insufficient number of new chemical entities is being evaluated owing to the lack of innovation and R&D investment by the pharmaceutical industry. This perspective paper outlines the utility and applications of SBDD and LBDD approaches for the identification and design of new small-molecule agents for NTDs.
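One core LBDD computation, ranking candidates against a known active by fingerprint similarity, can be sketched as below. The bit-set "fingerprints" are toy examples, not real molecular fingerprints:

```python
# Toy ligand-based screening: candidate molecules are encoded as feature
# bit sets and ranked against a known active by Tanimoto similarity, a
# standard LBDD score. The feature sets here are illustrative, not real
# molecular fingerprints (e.g. ECFP bits).

def tanimoto(a, b):
    """Tanimoto coefficient of two feature sets: |A & B| / |A | B|."""
    return len(a & b) / len(a | b)

known_active = {1, 4, 7, 9, 12}
library = {
    "cand_a": {1, 4, 7, 9, 12},   # identical feature set
    "cand_b": {1, 4, 7, 21},      # partial overlap
    "cand_c": {30, 31},           # no overlap
}
ranked = sorted(library, key=lambda m: tanimoto(library[m], known_active),
                reverse=True)
assert ranked[0] == "cand_a" and ranked[-1] == "cand_c"
```

In a real campaign this ligand-based ranking would complement structure-based scoring (docking against the target), reflecting the experimental/computational complementarity discussed above.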