861 results for "New cutting tool"
Abstract:
Ceramic parts are increasingly replacing metal parts because of their excellent physical, chemical, and mechanical properties; however, these same properties also make ceramics difficult to manufacture by traditional machining methods. The developments carried out in this work are used to estimate tool wear during the grinding of advanced ceramics. The learning process was fed with data collected from a surface grinding machine equipped with a tangential diamond wheel and alumina ceramic test specimens, in three cutting configurations with depths of cut of 120 μm, 70 μm, and 20 μm. The grinding wheel speed was 35 m/s and the table speed 2.3 m/s. Four neural models were evaluated: the Multilayer Perceptron, Radial Basis Function, Generalized Regression Neural Network, and the Adaptive Neuro-Fuzzy Inference System. The models' performance evaluation routines were executed automatically, testing all possible combinations of inputs, numbers of neurons, numbers of layers, and spread parameters. The computational results show that the neural models estimated tool wear successfully, with errors below 4%.
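The abstract above describes an automated search over model configurations; the minimal sketch below illustrates that idea for one of the four models (a multilayer perceptron) by exhaustively testing input combinations and hidden-layer sizes and keeping the configuration with the lowest relative error. The synthetic data, the feature choices, and the scikit-learn usage are assumptions for illustration and are not taken from the cited study.

```python
# Minimal sketch (not the authors' code): exhaustive evaluation of MLP
# configurations for tool-wear estimation, assuming scikit-learn and
# synthetic stand-in monitoring data.
from itertools import combinations
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical monitoring features (e.g., acoustic emission RMS, power, vibration).
X = rng.normal(size=(300, 3))
wear = 0.5 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.1 * rng.normal(size=300)  # synthetic target

best = None
for n_inputs in range(1, X.shape[1] + 1):
    for cols in combinations(range(X.shape[1]), n_inputs):        # input combinations
        for layers in [(5,), (10,), (10, 5)]:                     # neuron/layer combinations
            Xtr, Xte, ytr, yte = train_test_split(X[:, cols], wear, random_state=0)
            model = MLPRegressor(hidden_layer_sizes=layers, max_iter=2000,
                                 random_state=0).fit(Xtr, ytr)
            rel_err = np.mean(np.abs(model.predict(Xte) - yte) / (np.abs(yte) + 1e-9))
            if best is None or rel_err < best[0]:
                best = (rel_err, cols, layers)

print(f"best relative error {best[0]:.2%} with inputs {best[1]} and layers {best[2]}")
```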
Abstract:
The grinding operation gives workpieces their final finish, minimizing surface roughness through the interaction between the abrasive grains of a tool (grinding wheel) and the workpiece. However, excessive grinding wheel wear due to friction renders the tool unsuitable for further use, thus requiring the dressing operation to remove and/or sharpen the cutting edges of the worn grains and render them reusable. The purpose of this study was to monitor the dressing operation using the acoustic emission (AE) signal and statistics derived from this signal, classifying the grinding wheel as sharp or dull by means of artificial neural networks. An aluminum oxide wheel installed on a surface grinding machine, a signal acquisition system, and a single-point dresser were used in the experiments. Tests were performed varying overlap ratios and dressing depths. The root mean square values and two additional statistics were calculated from the raw AE data. A multilayer perceptron neural network trained with the Levenberg-Marquardt learning algorithm was used, with the aforementioned statistics as inputs. The results indicate that this method was successful in classifying the condition of the grinding wheel in the dressing process, identifying the tool as "sharp" (with cutting capacity) or "dull" (with loss of cutting capacity), thus reducing the time and cost of the operation and minimizing excessive removal of abrasive material from the grinding wheel.
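As an illustration of the signal-statistics-plus-classifier pipeline described above, the sketch below computes the RMS and two further statistics over windows of a raw AE signal and feeds them to an MLP classifier. The window length, the choice of variance and kurtosis as the additional statistics, the synthetic signals, and the scikit-learn optimizer (which is not Levenberg-Marquardt) are all assumptions for illustration, not the instrumentation or training setup of the study.

```python
# Illustrative sketch only: windowed statistics from a raw AE signal feeding
# an MLP classifier that labels the wheel "sharp" (0) or "dull" (1).
import numpy as np
from scipy.stats import kurtosis
from sklearn.neural_network import MLPClassifier

def ae_features(signal, window=1024):
    """RMS plus two additional (assumed) statistics per window of the raw AE signal."""
    n = len(signal) // window
    feats = []
    for w in signal[:n * window].reshape(n, window):
        feats.append([np.sqrt(np.mean(w ** 2)),   # RMS
                      np.var(w),                  # second statistic (assumed)
                      kurtosis(w)])               # third statistic (assumed)
    return np.array(feats)

rng = np.random.default_rng(1)
sharp = ae_features(rng.normal(0.0, 1.0, 100_000))   # synthetic "sharp" AE signal
dull = ae_features(rng.normal(0.0, 2.5, 100_000))    # synthetic "dull" AE signal
X = np.vstack([sharp, dull])
y = np.r_[np.zeros(len(sharp)), np.ones(len(dull))]

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```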
Abstract:
This paper deals with transient stability analysis based on time-domain simulation using vector processing. This approach requires the solution of a set of differential equations in conjunction with a set of algebraic equations. The solution of the algebraic equations has traditionally been treated as a scalar, sequential set of tasks, and solving these equations on vector computers has required considerable further investigation to speed up the simulations. The main objective of this paper is therefore to present methods for solving the algebraic equations using vector processing. The results, obtained on a CRAY computer, show that on-line transient stability assessment is feasible.
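The abstract does not detail the solution scheme itself; as a rough illustration of the alternating differential/algebraic structure it refers to, the sketch below steps simple machine swing equations explicitly and solves the network algebraic equations with a single vectorized (array-at-a-time) linear solve per time step. The network data, machine model, step size, and NumPy usage are assumptions for illustration, not the paper's method.

```python
# Rough illustration (not the paper's method): time-domain simulation that
# alternates an explicit step of the swing (differential) equations with a
# vectorized solution of the network (algebraic) equations.
import numpy as np

n = 4                                        # hypothetical number of machine buses
xd = 0.3                                     # transient reactance (per unit, assumed)
Ynet = np.eye(n) * (2.0 - 6.0j)              # hypothetical network + load admittances
Yaug = Ynet + np.eye(n) / (1j * xd)          # augmented with machine admittances
E = np.full(n, 1.05)                         # internal EMF magnitudes
M, D, Pm = 0.1, 0.02, 0.8                    # inertia, damping, mechanical power
delta = np.zeros(n)                          # rotor angles
omega = np.zeros(n)                          # rotor speed deviations
dt = 0.01

for _ in range(500):
    Eph = E * np.exp(1j * delta)             # machine internal phasors
    Iinj = Eph / (1j * xd)                   # Norton current injections
    V = np.linalg.solve(Yaug, Iinj)          # algebraic equations, all buses at once
    Pe = (Eph * np.conj((Eph - V) / (1j * xd))).real
    omega += dt * (Pm - Pe - D * omega) / M  # swing (differential) equations
    delta += dt * omega
```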
Abstract:
It is marvelously fitting that we gather here in Nebraska City on this lovely fall afternoon to officially celebrate this wonderful new center. Fall is a traditional time of harvest in Nebraska, and for many of us there is a deep and abiding satisfaction in bringing a good crop to fruition. Apple harvests at Nebraska City orchards have long brought visitors here each year for fresh fruit and cider.
Abstract:
The 3PL model is a flexible and widely used tool in assessment. However, it suffers from limitations due to its need for large sample sizes. This study introduces and evaluates the efficacy of a new sample size augmentation technique called Duplicate, Erase, and Replace (DupER) Augmentation through a simulation study. Data are augmented using several variations of DupER Augmentation (based on different imputation methodologies, deletion rates, and duplication rates), analyzed in BILOG-MG 3, and the results are compared to those obtained from analyzing the raw data. Additional manipulated variables include test length and sample size. Estimates are compared using seven different evaluative criteria. Results are mixed and inconclusive. DupER-augmented data tend to result in larger root mean squared errors (RMSEs) and lower correlations between estimates and parameters for both item and ability parameters. However, some DupER variations produce estimates that are much less biased than those obtained from the raw data alone. One DupER variation produced better results for low-ability simulees and worse results for those with high abilities. Findings, limitations, and recommendations for future studies are discussed. Specific recommendations for future studies include applying DupER Augmentation (1) to empirical data and (2) with additional IRT models, and (3) analyzing the efficacy of the procedure for different item and ability parameter distributions.
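The abstract names the Duplicate, Erase, and Replace steps but not their implementation; the sketch below shows one plausible reading of them on a dichotomous item-response matrix. The duplication rate, deletion rate, and the rounded item-mean imputation are illustrative assumptions, not the variations actually studied.

```python
# Illustrative sketch of a DupER-style augmentation (one plausible reading,
# not the authors' implementation): duplicate the response matrix, erase a
# fraction of entries at random, then replace them by imputation.
import numpy as np

def duper_augment(responses, dup_rate=1.0, erase_rate=0.2, seed=0):
    """responses: (examinees x items) 0/1 matrix; returns the augmented matrix."""
    rng = np.random.default_rng(seed)
    n_dup = int(len(responses) * dup_rate)
    dup = responses[rng.integers(0, len(responses), n_dup)].astype(float)  # Duplicate
    mask = rng.random(dup.shape) < erase_rate
    dup[mask] = np.nan                                                     # Erase
    item_means = np.nanmean(dup, axis=0)
    for j in range(dup.shape[1]):                                          # Replace
        col = dup[:, j]
        col[np.isnan(col)] = np.round(item_means[j])
    return np.vstack([responses, dup.astype(int)])

raw = (np.random.default_rng(1).random((500, 20)) > 0.5).astype(int)  # synthetic responses
augmented = duper_augment(raw)
print(raw.shape, "->", augmented.shape)
```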
Abstract:
In the first paper presented to you today by Dr. Spencer, an expert in the Animal Biology field and an official authority at the same time, you heard about the requirements imposed on a chemical in order to pass the different official hurdles before it will ever be accepted as a proven tool in wildlife management. Many characteristics have to be known and highly sophisticated tests have to be run. In many instances the governmental agency maintains its own screening, testing or analytical programs according to standard procedures. It would be impossible, however, for economic and time reasons, to work out all the necessary data themselves. They therefore depend largely on the information furnished by the individual industry, which naturally has to be established as conscientiously as possible. This, among other things, Dr. Spencer has made very clear; and this is also what causes quite a few headaches for the individual industry. But I am certainly not speaking only for myself in saying that Industry fully realizes its important role in developing materials for vertebrate control and the responsibilities lying therein. This type of work - better described as cooperative work with the official institutions - is, however, only one part, and for the most part the smallest part, of the effort that Industry devotes to the development of compounds for pest control. It actually refers only to those very few compounds which are already known to be effective. But how do we get to know about their properties in the first place? How does Industry make the selection from the many thousands of compounds synthesized each year? This, by far, creates the biggest problems, at least from the scientific and technical standpoint. Let us pause here for a short while and think about the possible ways of screening and selecting effective compounds. Basically there are two different ways. One is the empirical way of screening as large a number of compounds as possible, under the supposition that with the number of attempts the chances for a "hit" increase too. You can also call this approach the statistical or analytical one: the mass screening of new, mostly unknown candidate materials. This type of testing can only be performed by a producer of many new materials, that is, by big industries. It requires a tremendous investment in personnel, time and equipment, and is based on highly simplified but indicative test methods, the results of which have to be reliable and representative for practical purposes. The other extreme is the intellectual way of theorizing about effective chemical configurations. Defenders of this method claim that, now or later, they will be able to predict biological effectiveness on the basis of the chemical structure or certain groups within it. Certain prior experience is necessary, that is, knowledge of the importance of certain molecular requirements; the detection of new and effective complete molecules is then a matter of coordination, to be performed by smart people or computers. You can also call this method the synthetic or coordinative method.
Abstract:
Chimpanzees have been the traditional referential models for investigating human evolution and stone tool use by hominins. We enlarge this comparative scenario by describing normative use of hammer stones and anvils in two wild groups of bearded capuchin monkeys (Cebus libidinosus) over one year. We found that most of the individuals habitually use stones and anvils to crack nuts and other encased food items. Further, we found that in adults (1) males use stone tools more frequently than females, (2) males crack high-resistance nuts more frequently than females, (3) efficiency at opening a food by percussive tool use varies according to the resistance of the encased food, (4) heavier individuals are more efficient at cracking high-resistance nuts than smaller individuals, and (5) to crack open encased foods, both sexes select hammer stones on the basis of material and weight. These findings confirm and extend previous experimental evidence concerning tool selectivity in wild capuchin monkeys (Visalberghi et al., 2009b; Fragaszy et al., 2010b). Male capuchins use tools more frequently than females and body mass is the best predictor of efficiency, but the sexes do not differ in terms of efficiency. We argue that the contrasting pattern of sex differences in capuchins compared with chimpanzees, in which females use tools more frequently and more skillfully than males, may have arisen from the degree of sexual dimorphism in body size of the two species, which is larger in capuchins than in chimpanzees. Our findings show the importance of taking sex and body mass into account as separate variables to assess their role in tool use. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
The enzyme purine nucleoside phosphorylase (PNP) is a target for the discovery of new lead compounds employed in the treatment of severe T-cell-mediated disorders. Within this context, the development of new, direct, and reliable methods for ligand screening is an important task. This paper describes the preparation of a fused-silica capillary human PNP (HsPNP) immobilized enzyme reactor (IMER). The activity of the obtained IMER is monitored online in a multidimensional liquid chromatography system by quantifying the product formed in the enzymatic reaction. The Km value for the immobilized enzyme was about twofold higher than that measured for the enzyme in solution (255 ± 29.2 μM and 133 ± 114.9 μM, respectively). A new fourth-generation immucillin derivative (DI4G; IC50 = 40.6 ± 0.36 nM), previously identified and characterized in HsPNP free-enzyme assays, was used to validate the IMER as a screening method for HsPNP ligands. The validated method was also used for mechanistic studies with this inhibitor. This new approach is a valuable tool for PNP ligand screening, since it directly measures the hypoxanthine released by inosine phosphorolysis, thus furnishing more reliable results than those obtained with a coupled enzymatic spectrophotometric assay. (C) 2011 Elsevier B.V. All rights reserved.
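For reference, the Km values reported above are the Michaelis constants of the standard Michaelis-Menten rate law shown below; the abstract does not state the fitting procedure used to estimate them, so none is assumed here.

$$v = \frac{V_{\max}\,[S]}{K_m + [S]}$$

where v is the initial reaction rate, V_max the maximum rate, [S] the substrate (here inosine) concentration, and Km the substrate concentration at which v = V_max/2.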
Abstract:
This paper reports an experimental method to estimate the convective heat transfer of cutting fluids in a laminar flow regime applied to a thin steel plate. The heat source provided by metal cutting was simulated by electrical heating of the plate. Three cooling conditions were evaluated: a dry cooling system, a flooded cooling system, and a minimum quantity of lubrication (MQL) cooling system, with two different cutting fluids for the last two systems. The results showed a considerable enhancement of convective heat transfer with the flooded system. For the dry and MQL systems, heat conduction inside the body was much faster than heat convection away from its surface. In addition, the Biot number was used to identify the applicable heat conduction model for each experimental condition tested.
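As context for the Biot-number analysis mentioned above, the lumped-capacitance criterion it relies on is shown below; the characteristic-length definition is the usual one for such analyses, and no experimental values from the study are reproduced here.

$$\mathrm{Bi} = \frac{h\,L_c}{k}, \qquad L_c = \frac{V}{A_s}$$

where h is the convective heat transfer coefficient of the cooling condition, k the thermal conductivity of the plate, V its volume, and A_s its exposed surface area. A small Biot number (commonly Bi < 0.1) indicates that conduction inside the body is much faster than convection from its surface, so a lumped (spatially uniform temperature) model is adequate; larger values call for a distributed conduction model.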