42 results for Computer aided network analysis
Abstract:
Clashes occur when components in an assembly unintentionally violate others. If clashes are not identified and designed out before manufacture, product function will be reduced or substantial cost will be incurred in rework. This paper introduces a novel approach for eliminating clashes by identifying which parameters defining the part features in a computer aided design (CAD) assembly need to change and by how much. Sensitivities are calculated for each parameter defining the part and the assembly as the change in clash volume due to a change in each parameter value. These sensitivities give an indication of important parameters and are used to predict the optimum combination of changes in each parameter to eliminate the clash. Consideration is given to the fact that it is sometimes preferable to modify some components in an assembly rather than others and that some components in an assembly cannot be modified as the designer does not have control over their shape. Successful elimination of clashes has been demonstrated in a number of example assemblies.
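The parameter-sensitivity scheme summarized above can be sketched numerically: perturb each parameter, estimate the change in clash volume by finite differences, then solve for a minimum-norm combination of parameter changes that drives the clash volume to zero. The clash_volume function and the two-parameter design below are hypothetical stand-ins for a real CAD interrogation, not the paper's implementation.

```python
import numpy as np

def clash_volume(params):
    # Hypothetical clash volume: overlap of two blocks whose widths
    # are controlled by two design parameters; they clash once the
    # combined width exceeds the available space of 10 units.
    w1, w2 = params
    return max(w1 + w2 - 10.0, 0.0)

def sensitivities(f, params, h=1e-6):
    # Finite-difference sensitivity of clash volume to each parameter.
    base = f(params)
    grads = []
    for i in range(len(params)):
        p = params.copy()
        p[i] += h
        grads.append((f(p) - base) / h)
    return np.array(grads), base

params = np.array([6.0, 6.0])           # current design: clash volume 2.0
g, v = sensitivities(clash_volume, params)

# Minimum-norm parameter change eliminating the clash:
# solve g . dp = -v for the smallest ||dp||.
dp = -v * g / np.dot(g, g)
print(clash_volume(params + dp))        # near-zero clash volume
```

In the paper's setting the solution would additionally be weighted so that preferred components absorb more of the change, and parameters the designer does not control would be held fixed.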
Abstract:
Timely and individualized feedback on coursework is desirable from a student perspective, as it facilitates formative development and encourages reflective learning practice. Faculty, however, face a significant and potentially time-consuming challenge when teaching larger cohorts if they are to provide feedback which is timely, individualized and detailed. Additionally, for subjects which assess non-traditional submissions, such as computer-aided design (CAD), the methods for assessment and feedback tend not to be as well developed or optimized. Issues can also arise over the consistency of the feedback provided. Evaluations of computer-assisted feedback in other disciplines (Denton et al., 2008; Croft et al., 2001) have shown that students prefer this method of feedback to traditional “red pen” marking and that such methods can be more time efficient for faculty.
Herein, approaches are described which make use of technology and additional software tools to speed up, simplify and automate assessment and the provision of feedback for large cohorts of first- and second-year engineering students studying modules where CAD files are submitted electronically. A range of automated methods are described and compared with more “manual” approaches. Specifically, one method uses an application programming interface (API) to interrogate SolidWorks models and extract information into an Excel spreadsheet, which is then used to automatically send feedback emails. Another method uses audio recordings made during model interrogation, which reduces the time required while increasing the level of detail provided as feedback.
Limitations found with these methods and problems encountered are discussed along with a quantified assessment of time saving efficiencies made.
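The spreadsheet-to-email step of the workflow described above can be sketched as follows. The columns, addresses and marking criteria are hypothetical, and a real implementation would pull the data from the SolidWorks API into the spreadsheet rather than use an inline string; this sketch only shows the final assembly of feedback messages.

```python
import csv
import io
from email.message import EmailMessage

# Hypothetical extract: in the workflow described above this data would
# be pulled from SolidWorks models via the API into a spreadsheet.
EXTRACT = """student,mass_correct,features_used,comments
jsmith,yes,12,Good use of fillets
adoyle,no,7,Check the base sketch dimensions
"""

def build_feedback(row):
    # Compose one individualized feedback email from an extract row.
    msg = EmailMessage()
    msg["To"] = f"{row['student']}@university.example"
    msg["Subject"] = "CAD assignment feedback"
    msg.set_content(
        f"Model mass check: {'passed' if row['mass_correct'] == 'yes' else 'failed'}\n"
        f"Features used: {row['features_used']}\n"
        f"Comments: {row['comments']}\n"
    )
    return msg

messages = [build_feedback(r) for r in csv.DictReader(io.StringIO(EXTRACT))]
print(len(messages), messages[0]["To"])
```

Sending would then be a loop over `messages` with an SMTP connection, which is where per-institution configuration comes in.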
Abstract:
Advances in stem cell science and tissue engineering are being turned into applications and products through a novel medical paradigm known as regenerative medicine. This paper begins by examining the vulnerabilities and risks encountered by the regenerative medicine industry during a pivotal moment in its scientific infancy: the 2000s. Under the auspices of New Labour, British medical scientists and life science innovation firms associated with regenerative medicine received demonstrative rhetorical pledges of support, aligned with the publication of a number of government-initiated reports presaged by Bioscience 2015: Improving National Health, Increasing National Wealth. The Department of Health and the Department of Trade and Industry (and its successors) held industry consultations to determine the best means by which innovative bioscience cultures might be promoted and sustained in Britain. Bioscience 2015 encapsulates the first chapter of this sustainability narrative. By 2009, the tone of this storyline had changed to one of survivability. In the second part of the paper, we explore the ministerial interpretation of the ‘bioscience discussion cycle’ that embodies this narrative of expectation, using a computer-aided content analysis programme. Our analysis notes that the ministerial interpretation of these reports has continued to place key emphasis upon the distinctive and exceptional characteristics of the life science industries, such as their ability to perpetuate innovations in regenerative medicine and the optimism this portends – even though many of the economic expectations associated with this industry have remained unfulfilled.
Abstract:
There is a requirement for better integration between design and analysis tools, which is difficult due to their different objectives, separate data representations and workflows. Currently, substantial effort is required to produce a suitable analysis model from design geometry. Robust links are required between these different representations to enable analysis attributes to be transferred between different design and analysis packages for models at various levels of fidelity.
This paper describes a novel approach for integrating design and analysis models by identifying and managing the relationships between the different representations. Three key technologies, Cellular Modeling, Virtual Topology and Equivalencing, have been employed to achieve effective simulation model management. These technologies and their implementation are discussed in detail. Prototype automated tools are introduced demonstrating how multiple simulation models can be linked and maintained to facilitate seamless integration throughout the design cycle.
Abstract:
This paper outlines the importance of robust interface management for facilitating finite element analysis workflows. Topological equivalences between analysis model representations are identified and maintained in an editable and accessible manner. The model and its interfaces are automatically represented using an analysis-specific cellular decomposition of the design space. Rework of boundary conditions following changes to the design geometry or the analysis idealization can be minimized by tracking interface dependencies. Utilizing this information with the Simulation Intent specified by an analyst, automated decisions can be made to process the interface information required to rebuild analysis models. Through this work automated boundary condition application is realized within multi-component, multi-resolution and multi-fidelity analysis workflows.
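As an illustration of the dependency-tracking idea (not the paper's implementation), boundary conditions can be keyed by interface so that a change to the design geometry or to the idealization invalidates only the conditions attached to the affected interfaces. All names below are hypothetical.

```python
# Boundary conditions attached to named interfaces rather than to raw
# geometry; interface names and condition contents are illustrative.
boundary_conditions = {
    "iface_wing_spar": {"type": "tie", "dof": "all"},
    "iface_spar_rib":  {"type": "pressure", "value": 2.0e5},
    "iface_skin_rib":  {"type": "tie", "dof": "all"},
}

def affected_conditions(changed_interfaces, bcs):
    # Only conditions on a changed interface must be reworked; the rest
    # carry over to the rebuilt analysis model unchanged.
    return {i: bc for i, bc in bcs.items() if i in changed_interfaces}

# Suppose an idealization change touches only the spar-rib interface:
rework = affected_conditions({"iface_spar_rib"}, boundary_conditions)
print(sorted(rework))  # ['iface_spar_rib']
```

The Simulation Intent described in the abstract would then decide automatically how the flagged conditions are reapplied in the rebuilt model.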
Abstract:
PURPOSE: This systematic review aimed to report and explore the survival of dental veneers constructed from non-feldspathic porcelain over 5 and 10 years.
MATERIALS AND METHODS: A total of 4,294 articles were identified through a systematic search involving all databases in the Cochrane Library, MEDLINE (OVID), EMBASE, Web of Knowledge, specific journals (hand-search), conference proceedings, clinical trials registers, and collegiate contacts. Articles, abstracts, and gray literature were sought by two independent researchers. There were no language limitations. One hundred sixteen studies were identified for full-text assessment, with 10 included in the analysis (5 qualitative, 5 quantitative). Study characteristics and survival (Kaplan-Meier estimated cumulative survival and 95% confidence interval [CI]) were extracted or recalculated. A failed veneer was one which required an intervention that disrupted the original marginal integrity, had been partially or completely lost, or had lost retention more than twice. A meta-analysis and sensitivity analysis of Empress veneers was completed, with an assessment of statistical heterogeneity and publication bias. Clinical heterogeneity was explored for results of all veneering materials from included studies.
RESULTS: Within the 10 studies, veneers were fabricated with IPS Empress, IPS Empress 2, Cerinate, and the Cerec computer-aided design/computer-assisted manufacture (CAD/CAM) materials VITA Mark I, VITA Mark II, and Ivoclar ProCad. The meta-analysis showed the pooled estimate for Empress veneers to be 92.4% (95% CI: 89.8% to 95.0%) for 5-year survival and 66% to 94% (95% CI: 55% to 99%) for 10-year survival. Data regarding other non-feldspathic porcelain materials were lacking, with only a single study each reporting outcomes for Empress 2, Cerinate, and various Cerec porcelains over 5 years. The sensitivity analysis showed that data from one study had an influencing and stabilizing effect on the 5-year pooled estimate.
CONCLUSION: The long-term outcome (> 5 years) of non-feldspathic porcelain veneers is sparsely reported in the literature. This systematic review indicates that the 5-year cumulative estimated survival for etchable non-feldspathic porcelain veneers is over 90%. Outcomes may prove clinically acceptable with time, but evidence remains lacking and the use of these materials for veneers remains experimental.
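For readers unfamiliar with the pooling step, one common way per-study Kaplan-Meier survival proportions are combined is fixed-effect (inverse-variance) weighting. The sketch below illustrates that calculation on entirely hypothetical estimates and standard errors, not the review's data, and does not reproduce the review's specific method.

```python
import math

def pooled_estimate(estimates, std_errors):
    # Fixed-effect (inverse-variance) pooling: each study is weighted
    # by the reciprocal of its variance, so more precise studies
    # dominate the pooled value.
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)   # 95% CI
    return pooled, ci

# Hypothetical 5-year survival proportions from three studies.
est, ci = pooled_estimate([0.91, 0.93, 0.94], [0.02, 0.015, 0.03])
print(round(est, 3), tuple(round(x, 3) for x in ci))
```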
Abstract:
Virtual topology operations have been utilized to generate an analysis topology definition suitable for downstream mesh generation. Detailed descriptions are provided for virtual topology merge and split operations for all topological entities. Current virtual topology technology is extended to allow the virtual partitioning of volume cells and the topological queries required to carry out each operation are provided. Virtual representations are robustly linked to the underlying geometric definition through an analysis topology. The analysis topology and all associated virtual and topological dependencies are automatically updated after each virtual operation, providing the link to the underlying CAD geometry. Therefore, a valid description of the analysis topology, including relative orientations, is maintained. This enables downstream operations, such as the merging or partitioning of virtual entities, and interrogations, such as determining if a specific meshing strategy can be applied to the virtual volume cells, to be performed on the analysis topology description. As the virtual representation is a non-manifold description of the sub-divided domain the interfaces between cells are recorded automatically. This enables the advantages of non-manifold modelling to be exploited within the manifold modelling environment of a major commercial CAD system, without any adaptation of the underlying CAD model. A hierarchical virtual structure is maintained where virtual entities are merged or partitioned. This has a major benefit over existing solutions as the virtual dependencies are stored in an open and accessible manner, providing the analyst with the freedom to create, modify and edit the analysis topology in any preferred sequence, whilst the original CAD geometry is not disturbed. 
Robust definitions of the topological and virtual dependencies enable the same virtual topology definitions to be accessed, interrogated and manipulated within multiple different CAD packages and linked to the underlying geometry.
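The hierarchical bookkeeping described above can be caricatured in a few lines: a virtual entity keeps references to the entities it was built from, so merge operations never destroy the link back to the underlying CAD geometry. Class and method names here are illustrative only, not the paper's data structures.

```python
class VirtualFace:
    def __init__(self, name, hosts):
        self.name = name
        self.hosts = list(hosts)   # underlying CAD faces or virtual faces

    def cad_faces(self):
        # Recursively resolve this virtual face back to the identifiers
        # of the original CAD faces it covers.
        result = []
        for h in self.hosts:
            result.extend(h.cad_faces() if isinstance(h, VirtualFace) else [h])
        return result

def merge(name, faces):
    # A merge keeps its operands as children, extending the hierarchy
    # rather than destroying the entities being merged.
    return VirtualFace(name, faces)

f1 = VirtualFace("vf1", ["cad_face_3", "cad_face_4"])
f2 = VirtualFace("vf2", ["cad_face_7"])
merged = merge("vf_merged", [f1, f2])
print(merged.cad_faces())  # ['cad_face_3', 'cad_face_4', 'cad_face_7']
```

Because the dependencies are stored openly like this, the analyst can later un-merge, re-partition, or re-order operations without touching the CAD model itself, which is the freedom the abstract emphasizes.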
Abstract:
In this paper, the compression of multispectral images is addressed. Such 3-D data are characterized by a high correlation across the spectral components. The efficiency of the state-of-the-art wavelet-based coder 3-D SPIHT is considered. Although the 3-D SPIHT algorithm provides the obvious way to process a multispectral image as a volumetric block and, consequently, maintain the attractive properties exhibited in 2-D (excellent performance, low complexity, and embeddedness of the bit-stream), its 3-D tree structure is shown to be poorly suited to 3-D wavelet-transformed (DWT) multispectral images. The fact that each parent has eight children in the 3-D structure considerably increases the list of insignificant sets (LIS) and the list of insignificant pixels (LIP), since the partitioning of any set produces eight subsets which will be processed similarly during the sorting pass. Thus, a significant portion of the overall bit-budget is wasted sorting insignificant information. Through an investigation based on results analysis, we demonstrate that a straightforward 2-D SPIHT technique, when suitably adjusted to maintain rate scalability and carried out in the 3-D DWT domain, overcomes this weakness. In addition, a new SPIHT-based scalable multispectral image compression algorithm is introduced, which in its initial iterations exploits the redundancies within each group of two consecutive spectral bands. Numerical experiments on a number of multispectral images have shown that the proposed scheme provides significant improvements over related works.
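The list-growth argument can be illustrated with a back-of-the-envelope count: each set-partitioning step yields 4 subsets under the 2-D quadtree but 8 under the 3-D octree, so in the worst case the insignificant-set bookkeeping grows much faster in 3-D. A minimal sketch of that count (not the SPIHT coder itself):

```python
def sets_after_partitioning(children_per_set, levels, initial_sets=1):
    # Number of subset entries the sorting pass must examine if every
    # set at each level is partitioned (worst case for the LIS).
    total = 0
    sets = initial_sets
    for _ in range(levels):
        sets *= children_per_set
        total += sets
    return total

spiht_2d = sets_after_partitioning(4, 3)   # quadtree: 4 + 16 + 64
spiht_3d = sets_after_partitioning(8, 3)   # octree:   8 + 64 + 512
print(spiht_2d, spiht_3d)  # 84 584
```

Every one of those entries costs significance bits during the sorting pass, which is why the paper finds the octree variant wasteful when the spectral dimension carries little tree-structured energy.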
Abstract:
Improving performance in sports requires a better understanding of the perception-action loop employed by athletes. Because of its inherent limitations, video playback does not permit this type of in-depth analysis. Interactive, immersive virtual reality can overcome these limitations and foster a better understanding of sports performance.
Abstract:
The motivation for this paper is to present procedures for automatically creating idealised finite element models from the 3D CAD solid geometry of a component. The procedures produce an accurate and efficient analysis model with little effort on the part of the user. The technique is applicable to thin walled components with local complex features and automatically creates analysis models where 3D elements representing the complex regions in the component are embedded in an efficient shell mesh representing the mid-faces of the thin sheet regions. As the resulting models contain elements of more than one dimension, they are referred to as mixed dimensional models. Although these models are computationally more expensive than some of the idealisation techniques currently employed in industry, they do allow the structural behaviour of the model to be analysed more accurately, which is essential if appropriate design decisions are to be made. Also, using these procedures, analysis models can be created automatically whereas the current idealisation techniques are mostly manual, have long preparation times, and are based on engineering judgement. In the paper the idealisation approach is first applied to 2D models that are used to approximate axisymmetric components for analysis. For these models 2D elements representing the complex regions are embedded in a 1D mesh representing the midline of the cross section of the thin sheet regions. Also discussed is the coupling, which is necessary to link the elements of different dimensionality together. Analysis results from a 3D mixed dimensional model created using the techniques in this paper are compared to those from a stiffened shell model and a 3D solid model to demonstrate the improved accuracy of the new approach. 
At the end of the paper a quantitative analysis of the reduction in computational cost due to shell meshing thin sheet regions demonstrates that the reduction in degrees of freedom is proportional to the square of the aspect ratio of the region, and for long slender solids, the reduction can be proportional to the aspect ratio of the region if appropriate meshing algorithms are used.
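The closing scaling argument can be checked with a rough element count. Keeping solid elements well shaped forces their size down to the sheet thickness, whereas shell elements on the mid-face are sized by the in-plane geometry alone; the simplified counts below (unit-aspect-ratio solid elements, a single shell element spanning the characteristic in-plane length) are purely illustrative of the aspect-ratio-squared scaling, not the paper's meshes.

```python
def solid_element_count(length, thickness):
    # Well-shaped hexahedral elements must be ~thickness in size, so a
    # length x length x thickness sheet needs (length/thickness)^2
    # elements with one element through the thickness.
    n = length / thickness
    return int(n * n)

def shell_element_count(length, char_size):
    # Shell elements on the mid-face are sized by in-plane features
    # only; thickness enters as a property, not a mesh dimension.
    n = length / char_size
    return int(n * n)

L, t = 100.0, 1.0                    # aspect ratio a = L/t = 100
solid = solid_element_count(L, t)    # 10000 elements
shell = shell_element_count(L, L)    # 1 element in the idealized limit
print(solid // shell)                # reduction ~ a**2 = 10000
```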
Abstract:
While the incorporation of mathematical and engineering methods has greatly advanced in other areas of the life sciences, they have been under-utilized in the field of animal welfare. Exceptions are beginning to emerge and share a common motivation to quantify 'hidden' aspects in the structure of the behaviour of an individual, or group of animals. Such analyses have the potential to quantify behavioural markers of pain and stress and quantify abnormal behaviour objectively. This review seeks to explore the scope of such analytical methods as behavioural indicators of welfare. We outline four classes of analyses that can be used to quantify aspects of behavioural organization. The underlying principles, possible applications and limitations are described for: fractal analysis, temporal methods, social network analysis, and agent-based modelling and simulation. We hope to encourage further application of analyses of behavioural organization by highlighting potential applications in the assessment of animal welfare, and increasing awareness of the scope for the development of new mathematical methods in this area.
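Of the four analysis classes listed, social network analysis is the most direct to sketch: build a weighted graph from counts of pairwise interactions and summarize each animal by its weighted degree centrality. The species, interaction type, and counts below are invented purely for illustration.

```python
# Hypothetical counts of observed allogrooming bouts between animals.
interactions = {
    ("cow_a", "cow_b"): 5,
    ("cow_a", "cow_c"): 2,
    ("cow_b", "cow_c"): 1,
    ("cow_c", "cow_d"): 4,
}

def weighted_degree(edges):
    # Sum of interaction counts incident on each animal; a simple
    # centrality measure for an undirected weighted social network.
    degree = {}
    for (u, v), w in edges.items():
        degree[u] = degree.get(u, 0) + w
        degree[v] = degree.get(v, 0) + w
    return degree

deg = weighted_degree(interactions)
# Animals with unusually low centrality may warrant welfare attention.
isolated = min(deg, key=deg.get)
print(deg, isolated)
```

Richer metrics (betweenness, community structure) follow the same pattern of quantifying 'hidden' structure in who interacts with whom, which is the motivation the review describes.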