982 results for virtual topology, decomposition, hex meshing algorithms


Relevance: 100.00%

Abstract:

Virtual topology operations have been utilized to generate an analysis topology definition suitable for downstream mesh generation. Detailed descriptions are provided for virtual topology merge and split operations for all topological entities, where virtual decompositions are robustly linked to the underlying geometry. Current virtual topology technology is extended to allow the virtual partitioning of volume cells. A valid description of the topology, including relative orientations, is maintained, which enables downstream interrogations to be performed on the analysis topology description, such as determining whether a specific meshing strategy can be applied to the virtual volume cells. As the virtual representation is a true non-manifold description of the sub-divided domain, the interfaces between cells are recorded automatically. Therefore, the advantages of non-manifold modelling are exploited within the manifold modelling environment of a major commercial CAD system without any adaptation of the underlying CAD model. A hierarchical virtual structure is maintained as virtual entities are merged or partitioned. This is a major benefit over existing solutions, as the virtual dependencies are stored in an open and accessible manner, providing the analyst with the freedom to create, modify and edit the analysis topology in any preferred sequence.
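
As a concrete illustration of how such an open, hierarchical dependency store might look, the sketch below uses hypothetical class and identifier names (assumptions for exposition, not the paper's implementation) to show a virtual merge that records which CAD host entities each virtual entity maps to:

```python
# Illustrative sketch only: a minimal, hypothetical data structure showing how a
# hierarchical virtual topology might store merge dependencies openly, linked to
# host (CAD) entities by persistent identifiers.
from dataclasses import dataclass, field

@dataclass
class VirtualEntity:
    vid: str                                     # identifier of the virtual entity
    dim: int                                     # 0=vertex, 1=edge, 2=face, 3=cell
    hosts: list = field(default_factory=list)    # underlying CAD entity ids
    parents: list = field(default_factory=list)  # virtual entities created from this one

class VirtualTopology:
    def __init__(self):
        self.entities = {}

    def add_host(self, vid, dim, host_id):
        """Register a one-to-one virtual wrapper around a CAD entity."""
        self.entities[vid] = VirtualEntity(vid, dim, hosts=[host_id])
        return self.entities[vid]

    def merge(self, new_vid, vids):
        """Merge virtual entities of the same dimension into one superset entity,
        recording the hierarchy so the operation stays inspectable and reversible."""
        children = [self.entities[v] for v in vids]
        assert len({c.dim for c in children}) == 1, "can only merge like dimensions"
        merged = VirtualEntity(new_vid, children[0].dim,
                               hosts=[h for c in children for h in c.hosts])
        for c in children:
            c.parents.append(new_vid)
        self.entities[new_vid] = merged
        return merged

# Usage: merge two small CAD faces into one virtual face for meshing purposes.
vt = VirtualTopology()
vt.add_host("VF1", 2, "cad_face_17")
vt.add_host("VF2", 2, "cad_face_18")
vf = vt.merge("VF3", ["VF1", "VF2"])
print(vf.hosts)   # ['cad_face_17', 'cad_face_18'] -> link back to the CAD model
```

Because every virtual entity keeps its host identifiers and parents explicitly, the analysis topology can be interrogated or edited in any order without touching the CAD model, which is the freedom the abstract highlights.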

Relevance: 100.00%

Abstract:

Virtual topology operations have been utilized to generate an analysis topology definition suitable for downstream mesh generation. Detailed descriptions are provided for virtual topology merge and split operations for all topological entities. Current virtual topology technology is extended to allow the virtual partitioning of volume cells, and the topological queries required to carry out each operation are provided. Virtual representations are robustly linked to the underlying geometric definition through an analysis topology. The analysis topology and all associated virtual and topological dependencies are automatically updated after each virtual operation, providing the link to the underlying CAD geometry. Therefore, a valid description of the analysis topology, including relative orientations, is maintained. This enables downstream operations, such as the merging or partitioning of virtual entities, and interrogations, such as determining whether a specific meshing strategy can be applied to the virtual volume cells, to be performed on the analysis topology description. As the virtual representation is a non-manifold description of the sub-divided domain, the interfaces between cells are recorded automatically. This enables the advantages of non-manifold modelling to be exploited within the manifold modelling environment of a major commercial CAD system, without any adaptation of the underlying CAD model. A hierarchical virtual structure is maintained as virtual entities are merged or partitioned. This is a major benefit over existing solutions, as the virtual dependencies are stored in an open and accessible manner, providing the analyst with the freedom to create, modify and edit the analysis topology in any preferred sequence, whilst the original CAD geometry is not disturbed. Robust definitions of the topological and virtual dependencies enable the same virtual topology definitions to be accessed, interrogated and manipulated within multiple different CAD packages and linked to the underlying geometry.
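
The non-manifold bookkeeping described above can be pictured with a small hedged sketch (the dictionaries and identifiers are invented for illustration): because each virtual volume cell lists its bounding virtual faces, the interfaces between cells fall out of a simple query rather than having to be declared manually.

```python
# Minimal sketch (assumed names, not the paper's API): in a non-manifold virtual
# decomposition, an interface is simply any virtual face used by two volume
# cells, so interfaces can be recovered by a query.
from collections import defaultdict

def interfaces(cells):
    """cells: dict mapping cell id -> set of bounding virtual-face ids.
    Returns {face_id: [cell ids sharing it]} for faces used by more than one cell."""
    use = defaultdict(list)
    for cid, faces in cells.items():
        for f in faces:
            use[f].append(cid)
    return {f: cids for f, cids in use.items() if len(cids) > 1}

# Example: a block split into two virtual cells sharing face "VF9".
cells = {"C1": {"VF1", "VF2", "VF9"}, "C2": {"VF9", "VF3", "VF4"}}
print(interfaces(cells))   # {'VF9': ['C1', 'C2']}
```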

Relevance: 100.00%

Abstract:

The bandwidth requirements of the Internet are increasing every day, and newer, more bandwidth-hungry applications are emerging on the horizon. Wavelength division multiplexing (WDM) is the next step towards leveraging the capabilities of the optical fiber, especially for wide-area backbone networks. The ability to switch a signal at intermediate nodes in a WDM network based on its wavelength is known as wavelength routing. One of the greatest advantages of wavelength-routed WDM is the ability to create a virtual topology different from the physical topology of the underlying network. This virtual topology can be reconfigured when necessary to improve performance. We discuss previous work on virtual topology design and also discuss and propose different reconfiguration algorithms applicable under different scenarios.
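
To make the layering concrete, here is a minimal, hedged sketch (toy topology, shortest-path routing and a first-fit wavelength policy chosen purely for illustration, not taken from the survey) of embedding the lightpaths of a virtual topology onto a physical WDM topology under the wavelength-continuity constraint:

```python
# Hedged illustration: each virtual-topology link is realised as a lightpath
# routed over the physical topology and assigned a single wavelength end to end.
from collections import deque

PHYSICAL = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
W = 2        # wavelengths per fibre
used = {}    # (u, v) -> set of wavelengths already in use on that directed link

def shortest_path(src, dst):
    prev, q = {src: None}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in PHYSICAL[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, u = [], dst
    while u is not None:
        path.append(u)
        u = prev[u]
    return list(reversed(path))

def set_up_lightpath(src, dst):
    path = shortest_path(src, dst)
    links = list(zip(path, path[1:]))
    for w in range(W):                     # first-fit wavelength assignment
        if all(w not in used.setdefault(l, set()) for l in links):
            for l in links:
                used[l].add(w)
            return path, w
    return None                            # blocked: no common free wavelength

# Virtual topology = set of lightpaths requested between node pairs.
for s, d in [("A", "D"), ("B", "C")]:
    print((s, d), "->", set_up_lightpath(s, d))
```

Reconfiguration then amounts to tearing down some lightpaths and setting up others when the traffic pattern changes, which is the operation the reconfiguration algorithms in the paper reason about.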

Relevance: 100.00%

Abstract:

An analytical model for Virtual Topology Reconfiguration (VTR) in optical networks is developed. It targets optical networks with a circuit-based data plane and an IP-like control plane. By identifying and analyzing the important factors impacting network performance due to VTR operations on both planes, we can compare the benefits and penalties of different VTR algorithms and policies. The best VTR scenario can be chosen adaptively from a set of such algorithms and policies according to real-time network conditions. For this purpose, a cost model integrating all these factors is created to provide a comparison criterion independent of any specific VTR algorithm or policy. A case study based on simulation experiments is conducted to illustrate the application of our models.
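
The abstract does not enumerate the factors, so the following is only a hedged sketch of what a weighted-sum cost of this kind could look like; the factor names and weights are assumptions made for exposition, not the paper's model.

```python
# Purely illustrative weighted-sum cost: combine the gain from a better-matched
# virtual topology with penalties for the disruption caused by reconfiguring it.
def vtr_cost(hop_weighted_load, disrupted_flows, lightpath_changes,
             control_messages, w=(1.0, 10.0, 2.0, 0.1)):
    """Lower is better; each factor is scaled by a tunable weight."""
    factors = (hop_weighted_load, disrupted_flows, lightpath_changes, control_messages)
    return sum(wi * fi for wi, fi in zip(w, factors))

# Compare two candidate VTR policies on the same traffic snapshot.
print(vtr_cost(120.0, 3, 4, 25))   # aggressive reconfiguration
print(vtr_cost(150.0, 0, 0, 2))    # leave the virtual topology unchanged
```

A single scalar of this form gives the comparison criterion the abstract describes: any VTR algorithm or policy can be scored on it, independently of how it was computed.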

Relevance: 100.00%

Abstract:

Network survivability is one of the most important issues in the design of optical WDM networks. In this work we study the problem of survivable routing of a virtual topology on a physical topology with Shared Risk Link Groups (SRLG). The survivable virtual topology routing problem against single-link failures in the physical topology is proved to be NP-complete in [1]. We prove that the survivable virtual topology routing problem against SRLG/node failures is also NP-complete. We present an improved integer linear programming (ILP) formulation (in comparison to [1]) for computing the survivable routing under SRLG/node failures. Using an ILP solver, we computed the survivable virtual topology routing against link and SRLG failures efficiently for small and medium-sized networks. As even our improved ILP formulation becomes intractable for large networks, we present a congestion-based heuristic and a tabu search heuristic (which uses the congestion-based heuristic solution as the initial solution) for computing survivable routing of a virtual topology. Our experimental results show that the tabu search heuristic, coupled with the congestion-based heuristic as the initial solution, provides fast and near-optimal solutions.
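
For orientation, one common cut-based way of expressing the survivability requirement (not necessarily the paper's improved formulation) is:

\[
\sum_{e \in \delta(S)} y_{e,g} \;\le\; |\delta(S)| - 1
\qquad \forall\, S \subset V_L,\ \ \forall\ \text{SRLG } g,
\]

where \(V_L\) is the node set of the virtual (logical) topology, \(\delta(S)\) is the set of virtual links crossing the cut \((S, V_L \setminus S)\), and \(y_{e,g} \in \{0,1\}\) equals 1 if the lightpath carrying virtual link \(e\) traverses any physical link in SRLG \(g\) (linearised as \(y_{e,g} \ge x_{e,p}\) for every physical link \(p \in g\), with \(x_{e,p}\) the usual routing variables). The constraint states that no single SRLG failure can remove every virtual link of any cut, so the virtual topology remains connected. The number of cut constraints grows exponentially with the size of the virtual topology, which is consistent with the observation that even the improved formulation becomes intractable for large networks.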

Relevance: 100.00%

Abstract:

In this paper, we exploit the idea of decomposition to match buyers and sellers in an electronic exchange for trading large volumes of homogeneous goods, where the buyers and sellers specify marginal-decreasing piecewise constant price curves to capture volume discounts. Such exchanges are relevant for automated trading in many e-business applications. The problem of determining winners and Vickrey prices in such exchanges is known to have a worst-case complexity equal to that of as many as (1 + m + n) NP-hard problems, where m is the number of buyers and n is the number of sellers. Our method decomposes the overall exchange problem into two separate and simpler problems: 1) a forward auction and 2) a reverse auction, which turn out to be generalized knapsack problems. In the proposed approach, we first determine the quantity of units to be traded between the sellers and the buyers using fast heuristics developed by us. Next, we solve a forward auction and a reverse auction using fully polynomial time approximation schemes available in the literature. The proposed approach has worst-case polynomial time complexity, and our experimentation shows that it produces good-quality solutions to the problem. Note to Practitioners: In recent times, electronic marketplaces have provided an efficient way for businesses and consumers to trade goods and services. The use of innovative mechanisms and algorithms has made it possible to improve the efficiency of electronic marketplaces by enabling optimization of revenues for the marketplace and of utilities for the buyers and sellers. In this paper, we look at single-item, multiunit electronic exchanges. These are electronic marketplaces where buyers submit bids and sellers submit asks for multiple units of a single item. We allow buyers and sellers to specify volume discounts using suitable functions. Such exchanges are relevant for high-volume business-to-business trading of standard products, such as silicon wafers, very large-scale integrated chips, desktops, telecommunications equipment, commoditized goods, etc. The problem of determining winners and prices in such exchanges is known to involve solving many NP-hard problems. Our paper exploits the familiar idea of decomposition, uses certain algorithms from the literature, and develops two fast heuristics to solve the problem in a near-optimal way in worst-case polynomial time.
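
The following is a hedged, illustrative greedy sketch of the quantity-matching step (not the authors' heuristic): bid and ask curves are given as (quantity, unit price) steps, and the highest-priced remaining bid step is traded against the lowest-priced remaining ask step while a surplus remains.

```python
# Illustrative greedy quantity matching over piecewise-constant price curves.
def match(buy_steps, sell_steps):
    """Greedily trade the highest-priced remaining buy step against the
    lowest-priced remaining sell step while the bid price covers the ask price."""
    buys = sorted(buy_steps, key=lambda s: -s[1])   # marginal prices, decreasing
    sells = sorted(sell_steps, key=lambda s: s[1])  # marginal costs, increasing
    trades, i, j = [], 0, 0
    while i < len(buys) and j < len(sells) and buys[i][1] >= sells[j][1]:
        q = min(buys[i][0], sells[j][0])
        trades.append((q, buys[i][1], sells[j][1]))
        buys[i] = (buys[i][0] - q, buys[i][1])
        sells[j] = (sells[j][0] - q, sells[j][1])
        if buys[i][0] == 0:
            i += 1
        if sells[j][0] == 0:
            j += 1
    return trades   # list of (quantity, bid price, ask price) matches

# One buyer willing to pay 10 for the first 5 units and 7 thereafter;
# one seller asking 6 for the first 4 units and 9 thereafter.
print(match([(5, 10), (5, 7)], [(4, 6), (6, 9)]))   # [(4, 10, 6), (1, 10, 9)]
```

Once the traded quantities are fixed this way, the forward and reverse auctions can be solved separately, which is the decomposition the paper exploits.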

Relevance: 100.00%

Abstract:

The finite element method plays an extremely important role in forging process design as it provides a valid means to quantify forging errors and thereby govern die shape modification to improve the dimensional accuracy of the component. However, this dependency on process simulation could raise significant problems and present a major drawback if the finite element simulation results were inaccurate. This paper presents a novel approach to assess the dimensional accuracy and shape quality of aeroengine blades formed from finite element hot-forging simulation. The proposed virtual inspection system uses conventional algorithms adopted by modern coordinate measurement processes as well as the latest free-form surface evaluation techniques to provide a robust framework for virtual forging error assessment. Established techniques for the physical registration of real components have been adapted to localise virtual models in relation to a nominal Design Coordinate System. Blades are then automatically analysed using a series of intelligent routines to generate measurement data and compute dimensional errors. The results of a comparison study indicate that the virtual inspection results and actual coordinate measurement data are highly comparable, validating the approach as an effective and accurate means to quantify forging error in a virtual environment. Consequently, this provides adequate justification for the implementation of the virtual inspection system in the virtual process design, modelling and validation of forged aeroengine blades in industry.
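
Localising a virtual model to a nominal Design Coordinate System typically relies on a least-squares rigid registration; the sketch below shows the standard SVD (Kabsch) best-fit as a generic stand-in, not the paper's registration procedure.

```python
# A standard least-squares rigid registration of measured points to nominal
# points, of the kind adapted for localising virtual models against the nominal
# Design Coordinate System.
import numpy as np

def best_fit_transform(measured, nominal):
    """Return rotation R and translation t minimising ||R @ m_i + t - n_i||
    over corresponding 3-D point sets (N x 3 arrays)."""
    cm, cn = measured.mean(axis=0), nominal.mean(axis=0)
    H = (measured - cm).T @ (nominal - cn)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cn - R @ cm
    return R, t

# Synthetic check: recover a known rotation/offset from displaced "measured" points.
rng = np.random.default_rng(0)
nominal = rng.random((20, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
measured = (nominal - [0.1, 0.2, 0.0]) @ R_true.T   # a displaced copy
R, t = best_fit_transform(measured, nominal)
print(np.allclose(measured @ R.T + t, nominal))     # True: model re-localised
```

With the simulated blade localised in the nominal frame, dimensional errors can be computed point by point exactly as a coordinate measuring machine would report them.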

Relevance: 100.00%

Abstract:

Integrating analysis and design models is a complex task due to differences between the models and the architectures of the toolsets used to create them. This complexity is increased with the use of many different tools for specific tasks during an analysis process. In this work various design and analysis models are linked throughout the design lifecycle, allowing them to be moved between packages in a way not currently available. Three technologies named Cellular Modeling, Virtual Topology and Equivalencing are combined to demonstrate how different finite element meshes generated on abstract analysis geometries can be linked to their original geometry. Establishing the equivalence relationships between models enables analysts to utilize multiple packages for specialist tasks without worrying about compatibility issues or rework.
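
A hedged sketch of the equivalencing idea (identifiers and dictionaries invented for illustration, not the tools described in the paper): if every package keys its mesh groups by the same persistent cellular-model entity ids, attributes applied in one package can be re-applied to the equivalent entities in another.

```python
# Hypothetical illustration of equivalencing via shared persistent ids.
# Each package records which of its mesh entities discretise which cell/face.
package_a = {"face:7": ["quad_101", "quad_102"], "cell:42": ["hex_1", "hex_2"]}
package_b = {"face:7": ["tri_9", "tri_10", "tri_11"], "cell:42": ["tet_3", "tet_4"]}

def transfer(attribute, cell_id, target_mesh):
    """Re-apply an attribute defined on a cellular-model entity to whichever
    mesh entities discretise that entity in the target package."""
    return {"attribute": attribute, "entities": target_mesh[cell_id]}

# A pressure load applied against face:7 in package A maps straight onto the
# triangles that discretise the same face in package B, with no rework.
print(transfer("pressure = 2 MPa", "face:7", package_b))
# {'attribute': 'pressure = 2 MPa', 'entities': ['tri_9', 'tri_10', 'tri_11']}
```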

Relevance: 100.00%

Abstract:

New techniques are presented for using the medial axis to generate decompositions on which high quality block-structured meshes with well-placed mesh singularities can be generated. Established medial-axis-based meshing algorithms are effective for some geometries, but in general, they do not produce the most favourable decompositions, particularly when there are geometric concavities. This new approach uses both the topological and geometric information in the medial axis to establish a valid and effective arrangement of mesh singularities for any 2-D surface. It deals with concavities effectively and finds solutions that are most appropriate to the geometric shapes. Resulting meshes are shown for a number of example models.
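
As background, a common way to approximate the 2-D medial axis that such a decomposition reasons over (a generic approximation, not the paper's algorithm) is to keep the Voronoi vertices of a dense boundary sampling that fall inside the face:

```python
# Rough sketch: interior Voronoi vertices of a dense boundary sampling
# approximate the 2-D medial axis of a simple polygonal face.
import numpy as np
from scipy.spatial import Voronoi

def point_in_polygon(p, poly):
    """Even-odd ray casting for a simple polygon given as a list of (x, y)."""
    x, y, inside = p[0], p[1], False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y) and x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
            inside = not inside
    return inside

def approximate_medial_axis(poly, samples_per_edge=20):
    """Sample the boundary densely and keep Voronoi vertices inside the polygon."""
    pts = []
    n = len(poly)
    for i in range(n):
        a, b = np.asarray(poly[i], float), np.asarray(poly[(i + 1) % n], float)
        for t in np.linspace(0.0, 1.0, samples_per_edge, endpoint=False):
            pts.append(a + t * (b - a))
    vor = Voronoi(np.array(pts))
    return [v for v in vor.vertices if point_in_polygon(v, poly)]

# L-shaped face: the concave corner is where singularity placement gets hard.
L_shape = [(0, 0), (2, 0), (2, 1), (1, 1), (1, 2), (0, 2)]
axis_pts = approximate_medial_axis(L_shape)
print(len(axis_pts), "approximate medial-axis points")
```

The decomposition techniques in the paper then use both the branching structure (topology) and the local radius information (geometry) of this axis to decide where mesh singularities should sit.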

Relevance: 100.00%

Abstract:

The automatic generation of structured multi-block quadrilateral (quad) and hexahedral (hex) meshes has been researched for many years without definitive success. The core problem in quad / hex mesh generation is the placement of mesh singularities to give the desired mesh orientation and distribution [1]. It is argued herein that existing approaches (medial axis, paving / plastering, cross / frame fields) are actually alternative views of the same concept. Using the information provided by the different approaches provides additional insight into the problem.

Relevance: 100.00%

Abstract:

The diffusion equation-based modeling of near infrared light propagation in tissue is achieved by using a finite-element mesh for imaging real tissue types, such as breast and brain. The finite-element mesh size (number of nodes) dictates the parameter space in optical tomographic imaging. Most commonly used finite-element meshing algorithms do not provide the flexibility of distinct nodal spacing in different regions of the imaging domain to take the sensitivity of the problem into consideration. This study presents a computationally efficient mesh simplification method that can be used as a preprocessing step to iterative image reconstruction, where the finite-element mesh is simplified by using an edge collapsing algorithm to reduce the parameter space in regions where the sensitivity of the problem is relatively low. It is shown, using simulations and experimental phantom data for simple meshes/domains, that a significant reduction in parameter space can be achieved without compromising the reconstructed image quality. The maximum errors observed using the simplified meshes were less than 0.27% for the forward problem and 5% for the inverse problem.
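
A hedged sketch of sensitivity-guided edge collapsing (simplified for illustration; the study's exact criteria and implementation are not reproduced) on a small triangle mesh:

```python
# Edges whose two end nodes both fall below a sensitivity threshold are
# collapsed, coarsening the mesh where the reconstruction is least sensitive.
import numpy as np

def collapse_low_sensitivity_edges(nodes, tris, sensitivity, threshold):
    nodes = np.array(nodes, float)
    remap = {i: i for i in range(len(nodes))}

    def root(i):                      # follow collapses already performed
        while remap[i] != i:
            i = remap[i]
        return i

    for a, b, c in tris:
        for u, v in ((a, b), (b, c), (c, a)):
            u, v = root(u), root(v)
            if u != v and sensitivity[u] < threshold and sensitivity[v] < threshold:
                nodes[u] = 0.5 * (nodes[u] + nodes[v])   # merge v into u at midpoint
                remap[v] = u
    new_tris = []
    for a, b, c in tris:
        a, b, c = root(a), root(b), root(c)
        if len({a, b, c}) == 3:       # drop degenerate triangles
            new_tris.append((a, b, c))
    return nodes, new_tris, remap

# Four nodes, two triangles; nodes 2 and 3 lie in a low-sensitivity region.
nodes = [(0, 0), (1, 0), (1, 1), (0, 1)]
tris = [(0, 1, 2), (0, 2, 3)]
sens = [1.0, 1.0, 0.1, 0.1]
_, simplified, _ = collapse_low_sensitivity_edges(nodes, tris, sens, 0.5)
print(simplified)   # [(0, 1, 2)]: one triangle remains after nodes 2 and 3 merge
```

Fewer nodes in the low-sensitivity regions means a smaller parameter space for the inverse problem, which is the reduction the study quantifies.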

Relevance: 100.00%

Abstract:

The motivation for this paper is to present procedures for automatically creating idealised finite element models from the 3D CAD solid geometry of a component. The procedures produce an accurate and efficient analysis model with little effort on the part of the user. The technique is applicable to thin walled components with local complex features and automatically creates analysis models where 3D elements representing the complex regions in the component are embedded in an efficient shell mesh representing the mid-faces of the thin sheet regions. As the resulting models contain elements of more than one dimension, they are referred to as mixed dimensional models. Although these models are computationally more expensive than some of the idealisation techniques currently employed in industry, they do allow the structural behaviour of the model to be analysed more accurately, which is essential if appropriate design decisions are to be made. Also, using these procedures, analysis models can be created automatically whereas the current idealisation techniques are mostly manual, have long preparation times, and are based on engineering judgement. In the paper the idealisation approach is first applied to 2D models that are used to approximate axisymmetric components for analysis. For these models 2D elements representing the complex regions are embedded in a 1D mesh representing the midline of the cross section of the thin sheet regions. Also discussed is the coupling, which is necessary to link the elements of different dimensionality together. Analysis results from a 3D mixed dimensional model created using the techniques in this paper are compared to those from a stiffened shell model and a 3D solid model to demonstrate the improved accuracy of the new approach. At the end of the paper a quantitative analysis of the reduction in computational cost due to shell meshing thin sheet regions demonstrates that the reduction in degrees of freedom is proportional to the square of the aspect ratio of the region, and for long slender solids, the reduction can be proportional to the aspect ratio of the region if appropriate meshing algorithms are used.
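
One simplified way to see where such a scaling can come from (an illustrative argument under crude assumptions, not the paper's derivation): for a thin sheet of lateral dimension \(L\) and thickness \(t\), well-shaped solid elements must be sized close to \(t\), whereas shell elements on the mid-face can be sized independently of \(t\), say at a fraction \(h\) of \(L\). Then

\[
\frac{\mathrm{DOF}_{\text{solid}}}{\mathrm{DOF}_{\text{shell}}}
\;\sim\; \frac{(L/t)^2}{(L/h)^2}
\;\propto\; \left(\frac{L}{t}\right)^{2}
\quad \text{for } h \propto L,
\]

so the saving grows with the square of the aspect ratio \(L/t\); an analogous argument in which the element size is constrained by the thickness along only one direction yields a single factor of \(L/t\), matching the long-slender-solid case quoted above.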

Relevance: 100.00%

Abstract:

There is a requirement for better integration between design and analysis tools, which is difficult due to their different objectives, separate data representations and workflows. Currently, substantial effort is required to produce a suitable analysis model from design geometry. Robust links are required between these different representations to enable analysis attributes to be transferred between different design and analysis packages for models at various levels of fidelity.

This paper describes a novel approach for integrating design and analysis models by identifying and managing the relationships between the different representations. Three key technologies, Cellular Modeling, Virtual Topology and Equivalencing, have been employed to achieve effective simulation model management. These technologies and their implementation are discussed in detail. Prototype automated tools are introduced demonstrating how multiple simulation models can be linked and maintained to facilitate seamless integration throughout the design cycle.