1000 results for multi-cam


Relevance: 40.00%

Abstract:

This work addresses the construction of prototypes using a multi-axis CNC machining centre. It also discusses the new format under development for data transfer between CAD/CAM software and CNC controllers. This new data format, known as STEP-NC, has as its main objective direct communication between CNC controllers and CAD/CAM software, which is currently not possible, since a data post-processor is always needed to convert the machining toolpaths created by the CAD/CAM software into the ISO language specific to each CNC controller. The innovation introduced by this work consists in producing prototypes on a CNC machining centre using a single clamping of the machining blank. To this end, the blank was fixed directly to the chuck of the rotary axis (4th axis), allowing it to rotate whenever another face of the prototype needs to be machined, thereby avoiding new fixturing setups and the presence of an operator at the machine. This new method also brings CNC machining centres much closer to rapid prototyping machines (based on layer addition), since both can produce prototypes without operator intervention during the entire build. The practical tests were carried out at the Instituto Superior de Engenharia de Lisboa (ISEL), and the part used as a prototype for the trials was provided by a company specialised in the manufacture of aeronautical components (LAUAK Portuguesa).

Relevance: 40.00%

Abstract:

Objective. To determine the influence of cement thickness and ceramic/cement bonding on stresses and failure of CAD/CAM crowns, using both multi-physics finite element analysis (FEA) and monotonic testing.

Methods. Axially symmetric FEA models were created for stress analysis of a stylized monolithic crown with resin cement thicknesses from 50 to 500 µm under occlusal loading. The ceramic-cement interface was modeled as bonded or non-bonded (cement-dentin as bonded). Cement polymerization shrinkage was simulated as a thermal contraction. Loads necessary to reach stresses for radial cracking from the intaglio surface were calculated by FEA. Experimentally, feldspathic CAD/CAM crowns based on the FEA model were machined with different occlusal cementation spaces, etched, and cemented to dentin analogs. Non-bonding of etched ceramic was achieved using a thin layer of poly(dimethylsiloxane). Crowns were loaded to failure at 5 N/s, with radial cracks detected acoustically.

Results. Failure loads depended on the bonding condition and the cement thickness for both FEA and physical testing. Average fracture loads for bonded crowns were 673.5 N at 50 µm cement thickness and 300.6 N at 500 µm. FEA stresses due to polymerization shrinkage increased with cement thickness, overwhelming the protective effect of bonding, as was also seen experimentally. At 50 µm cement thickness, bonded crowns withstood at least twice the load before failure compared with non-bonded crowns.

Significance. Occlusal "fit" can have structural implications for CAD/CAM crowns; pre-cementation spaces of around 50-100 µm are recommended from this study. Bonding benefits were lost at thicknesses approaching 450-500 µm due to polymerization shrinkage stresses.

Relevance: 30.00%

Abstract:

In IP/MPLS-over-WDM networks, which transport large amounts of traffic, the ability to guarantee that traffic reaches its destination node has become an important problem, since the failure of a single network element can result in a large amount of lost data. To guarantee that traffic affected by a failure reaches its destination node, new routing algorithms have been defined that incorporate knowledge of the protection available at both layers: the optical layer (WDM) and the packet-based layer (IP/MPLS). This avoids reserving resources to protect the traffic at both layers. The new algorithms make better use of network resources, offer fast recovery times, avoid resource duplication, and reduce the number of optical-to-electrical signal conversions.

Relevance: 30.00%

Abstract:

Immediate loading of dental implants shortens the treatment time and makes it possible to give the patient an esthetic appearance throughout the treatment period. Placement of dental implants requires precise planning that accounts for anatomic limitations and restorative goals. Diagnosis can be made with the assistance of computerized tomographic scanning, but transfer of planning to the surgical field is limited. Recently, novel CAD/CAM techniques such as stereolithographic rapid prototyping have been developed to build surgical guides in an attempt to improve precision of implant placement. The aim of this case report was to show a modified surgical template used throughout implant placement as an alternative to a conventional surgical guide.

Relevance: 30.00%

Abstract:

Compilation tuning is the process of adjusting the values of compiler options to improve some features of the final application. In this paper, a strategy based on a genetic algorithm and a multi-objective scheme is proposed to deal with this task. Unlike previous works, we try to take advantage of domain knowledge to provide a problem-specific genetic operator that improves both the speed of convergence and the quality of the results. The strategy is evaluated by means of a case study aimed at improving the performance of the well-known Apache web server. Experimental results show that an overall improvement of 7.5% can be achieved. Furthermore, the adaptive approach has shown an ability to markedly speed up the convergence of the original strategy.
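
The flavour of such a strategy can be sketched in a few lines. Everything below is hypothetical (the option space, the fitness function standing in for a real benchmark run, and the "neighbouring value" mutation); it only illustrates the idea of a problem-specific genetic operator that exploits the ordered nature of most compiler options:

```python
import random

# Hypothetical option space; a real study would benchmark compiled binaries.
OPTIONS = {
    "inline-limit": [100, 200, 400, 800],
    "unroll-loops": [False, True],
    "opt-level": [1, 2, 3],
}

def random_config(rng):
    """Pick one value per compiler option."""
    return {opt: rng.choice(vals) for opt, vals in OPTIONS.items()}

def domain_mutation(config, rng):
    """Problem-specific operator: nudge one option to a *neighbouring*
    value instead of an arbitrary one, exploiting value ordering."""
    child = dict(config)
    opt = rng.choice(sorted(OPTIONS))
    vals = OPTIONS[opt]
    i = vals.index(child[opt])
    j = max(0, min(len(vals) - 1, i + rng.choice([-1, 1])))
    child[opt] = vals[j]
    return child

def fitness(config):
    """Stand-in for measuring the compiled application's performance."""
    score = config["opt-level"] * 10
    score += 5 if config["unroll-loops"] else 0
    score -= abs(config["inline-limit"] - 400) / 100
    return score

def evolve(generations=30, pop_size=8, seed=1):
    """Tiny elitist loop: keep the best half, mutate it to refill."""
    rng = random.Random(seed)
    pop = [random_config(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [domain_mutation(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

print(evolve())
```

The neighbouring-value mutation is what makes the operator "problem-specific": a random reassignment would discard the monotone effect most numeric compiler options have on performance.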

Relevance: 30.00%

Abstract:

Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed, most of them are only useful for standard machining. Moreover, the algorithms used for tool path computation demand high computational performance, which makes their implementation on many existing systems very slow or even impractical. Hardware acceleration is an incremental solution that can be cleanly added to such systems while keeping everything else intact: it is completely transparent to the user, and its cost and development time are much lower than replacing the computers with faster ones. This paper presents an optimisation that exploits the power of multi-core Graphics Processing Units (GPUs) to accelerate tool path computation. The improvement is applied to a highly accurate and robust tool path generation algorithm. As a case study, the paper presents a fully implemented algorithm used for turning-lathe machining of shoe lasts. A comparative study shows the gain achieved in terms of total computing time: execution is almost two orders of magnitude faster than on modern PCs.

Relevance: 30.00%

Abstract:

The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is commanded by a genetic algorithm that can simultaneously fulfill many design goals thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
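
The "best trade-offs" the exploration searches for are the non-dominated points of the objective space. A minimal sketch of that filtering step, with hypothetical (fault coverage %, runtime overhead %) pairs for hardened variants:

```python
def dominates(a, b):
    """a dominates b if it is no worse in both objectives and strictly
    better in at least one. Coverage is maximised, overhead minimised."""
    cov_a, over_a = a
    cov_b, over_b = b
    return (cov_a >= cov_b and over_a <= over_b) and \
           (cov_a > cov_b or over_a < over_b)

def pareto_front(points):
    """Keep only the non-dominated trade-offs."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical hardened variants: (fault coverage %, overhead %).
variants = [(95, 80), (70, 20), (90, 50), (60, 40), (85, 25)]
print(pareto_front(variants))  # → [(95, 80), (70, 20), (90, 50), (85, 25)]
```

The point (60, 40) is dropped because (70, 20) covers more faults at less overhead; every surviving point is a defensible design choice.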

Relevance: 30.00%

Abstract:

Modern compilers offer a great and ever-increasing number of options that can modify the features and behavior of a compiled program. Many of these options go unused because exploiting them requires comprehensive knowledge of both the underlying architecture and the internal processes of the compiler. In this context, there is usually not a single design goal but a more complex set of objectives, and the dependencies between different goals are difficult to infer a priori. This paper proposes a strategy for tuning the compilation of any given application, accomplished by automatically varying the compilation options through multi-objective optimization and evolutionary computation driven by the NSGA-II algorithm. This makes it possible to find compilation options that simultaneously optimize different objectives. The advantages of our proposal are illustrated by means of a case study based on the well-known Apache web server. Our strategy has demonstrated an ability to find improvements of up to 7.5% in context switches and up to 27% in L2 cache misses, and it also uncovers the most important bottlenecks in the application's performance.
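
The ranking at the heart of NSGA-II can be sketched as repeated Pareto filtering: front 0 is the non-dominated set, front 1 is the non-dominated set of what remains, and so on. A minimal illustration with hypothetical objective vectors (e.g. context switches, L2 cache misses; both minimised):

```python
def non_dominated_sort(points):
    """Peel off successive Pareto fronts of minimisation objectives,
    in the spirit of NSGA-II's ranking step."""
    def dominates(a, b):
        # No worse in every objective and different in at least one.
        return all(x <= y for x, y in zip(a, b)) and a != b

    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

configs = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]
print(non_dominated_sort(configs))
```

A full NSGA-II adds crowding-distance selection and genetic variation on top of this ranking; only the ranking is shown here.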

Relevance: 30.00%

Abstract:

Feature selection is an important and active issue in clustering and classification problems. By choosing an adequate feature subset, a dataset's dimensionality can be reduced, thus decreasing the computational complexity of classification and improving classifier performance by avoiding redundant or irrelevant features. Although feature selection can be formally defined as an optimisation problem with only one objective, namely the classification accuracy obtained with the selected feature subset, some multi-objective approaches to this problem have been proposed in recent years. These either select features that improve not only the classification accuracy but also the generalisation capability of supervised classifiers, or counterbalance the bias toward lower or higher numbers of features exhibited by some of the methods used to validate the clustering/classification in the unsupervised case. The main contribution of this paper is a multi-objective approach for feature selection and its application to an unsupervised clustering procedure based on Growing Hierarchical Self-Organising Maps (GHSOMs) that includes a new method for unit labelling and efficient determination of the winning unit. In the network anomaly detection problem considered here, this multi-objective approach makes it possible not only to differentiate between normal and anomalous traffic but also to distinguish among different anomalies. The efficiency of our proposals has been evaluated using the well-known DARPA/NSL-KDD datasets, which contain extracted features and labelled attacks from around 2 million connections. The feature sets selected in our experiments provide detection rates of up to 99.8% for normal traffic and up to 99.6% for anomalous traffic, as well as accuracy values of up to 99.12%.
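
The "winning unit" of a SOM layer is simply the unit whose weight vector lies closest to the input sample. A minimal sketch of that step (hypothetical 2-D weight vectors; a real GHSOM would use the full feature space and grow the map hierarchically):

```python
import math

def winning_unit(sample, units):
    """Best-matching unit: index of the weight vector with the smallest
    Euclidean distance to the input sample."""
    return min(range(len(units)),
               key=lambda i: math.dist(sample, units[i]))

# Hypothetical 2-D unit weights of one map layer.
units = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
print(winning_unit((0.9, 1.2), units))  # → 1
```

Efficient variants avoid the exhaustive scan (e.g. by pruning with partial distances), but the definition of the winner is the one above.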

Relevance: 30.00%

Abstract:

In recent years, a plethora of approaches has been proposed to deal with the increasingly challenging task of multi-output regression. This paper provides a survey of state-of-the-art multi-output regression methods, which are categorized as either problem transformation or algorithm adaptation methods. In addition, we present the most commonly used performance evaluation measures, publicly available datasets for real-world multi-output regression problems, and open-source software frameworks.
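
The simplest problem-transformation method fits one independent regressor per output dimension (often called single-target decomposition). A toy sketch with one input feature and per-output least-squares lines (all data hypothetical):

```python
def fit_single_target(xs, ys_multi):
    """Problem transformation: fit an independent least-squares line
    y = a*x + b for each output dimension (single-feature toy case)."""
    n = len(xs)
    models = []
    for d in range(len(ys_multi[0])):
        ys = [row[d] for row in ys_multi]
        mx, my = sum(xs) / n, sum(ys) / n
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
        models.append((a, my - a * mx))
    return models

def predict(models, x):
    """One prediction per output dimension."""
    return [a * x + b for a, b in models]

xs = [0, 1, 2, 3]
ys = [[0, 1], [2, 3], [4, 5], [6, 7]]  # output0 = 2x, output1 = 2x + 1
models = fit_single_target(xs, ys)
print(predict(models, 10))  # → [20.0, 21.0]
```

Algorithm adaptation methods instead modify a single model to predict all outputs jointly, which lets them exploit correlations between outputs that this decomposition ignores.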

Relevance: 20.00%

Abstract:

In Brazil, the consumption of extra-virgin olive oil (EVOO) is increasing annually, but there are no experimental studies concerning the phenolic compound contents of commercial EVOO. The aim of this work was to optimise the separation of 17 phenolic compounds already detected in EVOO. A Doehlert matrix experimental design was used to evaluate the effects of pH and electrolyte concentration. Resolution, run time and the relative standard deviation of migration times were evaluated. Derringer's desirability function was used to optimise all 37 responses simultaneously. The 17 peaks were separated in 19 min using a fused-silica capillary (50 µm internal diameter, 72 cm effective length) with an extended light path and a 101.3 mmol L⁻¹ boric acid electrolyte (pH 9.15, 30 kV). The method was validated and applied to 15 EVOO samples found in Brazilian supermarkets.
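
Derringer's approach maps each response onto a [0, 1] desirability scale and combines the scales by geometric mean, so one unacceptable response rejects the whole condition. A minimal sketch of the larger-is-better case (the limits, targets and response values below are hypothetical, not the paper's):

```python
def desirability(y, low, target, weight=1.0):
    """Derringer larger-is-better desirability: 0 at/below `low`,
    1 at/above `target`, a power curve in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** weight

def overall(ds):
    """Geometric mean of individual desirabilities."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical: resolutions of three peak pairs, target resolution 1.5.
ds = [desirability(r, 0.0, 1.5) for r in (1.2, 1.5, 0.75)]
print(round(overall(ds), 3))  # → 0.737
```

With 37 responses the same composite score is computed per design point of the Doehlert matrix, and the point maximising it is chosen.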

Relevance: 20.00%

Abstract:

The formation of mono-species biofilms (Listeria monocytogenes) and multi-species biofilms (Enterococcus faecium, Enterococcus faecalis, and L. monocytogenes) was evaluated, together with the effectiveness of sanitation procedures for the control of the multi-species biofilms. The biofilms were grown on stainless steel coupons at various incubation temperatures (7, 25 and 39 °C) and contact times (0, 1, 2, 4, 6 and 8 days). In all tests at 7 °C, the microbial counts were below 0.4 log CFU/cm² and not characteristic of biofilms. In mono-species biofilms, the counts of L. monocytogenes after 8 days of contact were 4.1 and 2.8 log CFU/cm² at 25 and 39 °C, respectively. In the multi-species biofilms, Enterococcus spp. were present at counts of 8 log CFU/cm² at 25 and 39 °C after 8 days of contact. However, L. monocytogenes in multi-species biofilms was significantly affected by the presence of Enterococcus spp. and by temperature. At 25 °C, the growth of L. monocytogenes biofilms was favored in multi-species cultures, with counts above 6 log CFU/cm² after 8 days of contact. In contrast, at 39 °C, a negative effect on L. monocytogenes biofilm growth was observed in mixed cultures, with a significant reduction in counts over time and values below 0.4 log CFU/cm² starting at day 4. Anionic tensioactive cleaning complemented with another procedure (acid cleaning, disinfection, or acid cleaning plus disinfection) eliminated the multi-species biofilms under all conditions tested (counts of all microorganisms below 0.4 log CFU/cm²). Peracetic acid was the most effective disinfectant, eliminating the multi-species biofilms under all tested conditions, whereas biguanide was the least effective, failing to eliminate the biofilms under all test conditions.

Relevance: 20.00%

Abstract:

Extracts obtained from 57 marine-derived fungal strains were analyzed by HPLC-PDA, TLC and ¹H NMR. The analyses showed that the growth conditions affected the chemical profile of the crude extracts. Furthermore, the majority of the fungal strains that produced either bioactive or chemically distinctive crude extracts had been isolated from sediments or marine algae. Chemical investigation of the antimycobacterial and cytotoxic crude extract obtained from two strains of the fungus Beauveria felina yielded cyclodepsipeptides related to the destruxins. The present approach constitutes a valuable tool for the selection of fungal strains that produce chemically interesting or biologically active secondary metabolites.

Relevance: 20.00%

Abstract:

This paper addresses the capacitated lot sizing problem (CLSP) with a single stage composed of multiple plants, items and periods, with setup carry-over between periods. The CLSP is well studied and many heuristics have been proposed to solve it. Nevertheless, little research has explored the multi-plant capacitated lot sizing problem (MPCLSP), so few solution methods have been proposed for it. Furthermore, to our knowledge, no study of the MPCLSP with setup carry-over has appeared in the literature. This paper presents a mathematical model and a GRASP (Greedy Randomized Adaptive Search Procedure) with path relinking for the MPCLSP with setup carry-over. This solution method extends and adapts a previously adopted methodology that lacked the setup carry-over. Computational tests showed that the improvement from the setup carry-over is significant in terms of solution value, with only a small increase in computational time.
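
The GRASP metaheuristic alternates a greedy-randomized construction (sampling from a restricted candidate list) with a local search. The sketch below illustrates that loop on a toy knapsack instance, not on the paper's MPCLSP, and omits path relinking; all data and parameters are hypothetical:

```python
import random

def grasp_knapsack(weights, values, capacity, alpha=0.3, iters=50, seed=0):
    """Minimal GRASP sketch on a toy knapsack (illustration only):
    restricted-candidate-list construction + first-improvement swaps."""
    rng = random.Random(seed)
    n = len(weights)
    best_val = 0
    for _ in range(iters):
        # Construction: pick randomly among the best-ratio feasible items.
        sol, load = set(), 0
        while True:
            cand = [i for i in range(n)
                    if i not in sol and load + weights[i] <= capacity]
            if not cand:
                break
            cand.sort(key=lambda i: values[i] / weights[i], reverse=True)
            rcl = cand[: max(1, int(alpha * len(cand)))]
            pick = rng.choice(rcl)
            sol.add(pick)
            load += weights[pick]
        # Local search: swap one item out for a more valuable feasible one.
        improved = True
        while improved:
            improved = False
            for out in list(sol):
                for inn in range(n):
                    if inn not in sol and \
                       load - weights[out] + weights[inn] <= capacity and \
                       values[inn] > values[out]:
                        sol.remove(out)
                        sol.add(inn)
                        load += weights[inn] - weights[out]
                        improved = True
                        break
                if improved:
                    break
        best_val = max(best_val, sum(values[i] for i in sol))
    return best_val

print(grasp_knapsack([3, 4, 5, 2], [30, 50, 60, 10], capacity=7))  # → 80
```

Path relinking would additionally walk between elite solutions, re-evaluating intermediate ones; the construction/local-search pair above is the core shared by all GRASP variants.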

Relevance: 20.00%

Abstract:

This paper proposes a new design methodology for discrete multi-pumped Raman amplifiers. In a multi-objective optimization scenario, the whole solution space is first inspected by a CW analytical formulation; the most promising solutions are then fully investigated by a rigorous numerical treatment, so that the Raman amplification performance is determined by the combination of analytical and numerical approaches. As an application of our methodology, we designed a photonic crystal fiber Raman amplifier configuration that provides low ripple, high gain, a clear eye opening and a low power penalty. The amplifier configuration also makes it possible to fully compensate the dispersion introduced by a 70-km single-mode fiber in a 10 Gbit/s system. We successfully obtained a configuration with 8.5 dB average gain over the C-band and 0.71 dB ripple, with almost zero eye penalty, using only two pump lasers with relatively low pump power.