908 results for Machine Tools


Relevance:

20.00%

Publisher:

Abstract:

This paper presents the application of wavelet processing to handwritten character recognition. To attain a high recognition rate, robust feature extractors and powerful classifiers that are invariant to the variability of human handwriting are needed. The proposed scheme consists of two stages: a feature extraction stage based on the Haar wavelet transform, and a classification stage that uses a support vector machine classifier. Experimental results show that the proposed method is effective.
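
As an illustration only (not the authors' implementation), the sketch below shows the two-stage idea with a single-level Haar transform as feature extractor and an SVM as classifier; it assumes the PyWavelets and scikit-learn libraries, and the images and labels are random placeholders.

    # Sketch of a Haar-wavelet + SVM pipeline; data are synthetic stand-ins.
    import numpy as np
    import pywt
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    def haar_features(image):
        # Single-level 2D Haar transform; concatenate the four sub-bands as a feature vector.
        cA, (cH, cV, cD) = pywt.dwt2(image, 'haar')
        return np.concatenate([cA.ravel(), cH.ravel(), cV.ravel(), cD.ravel()])

    # Placeholder data: 200 random 32x32 "character" images with 10 class labels.
    rng = np.random.default_rng(0)
    images = rng.random((200, 32, 32))
    labels = rng.integers(0, 10, size=200)

    X = np.array([haar_features(img) for img in images])
    X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

    clf = SVC(kernel='rbf').fit(X_train, y_train)
    print("test accuracy:", clf.score(X_test, y_test))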

Relevance:

20.00%

Publisher:

Abstract:

In our study we use a kernel-based classification technique, support vector machine regression, for predicting the melting point of drug-like compounds in terms of topological descriptors, topological charge indices, connectivity indices and 2D autocorrelations. The machine learning model was designed, trained and tested using a dataset of 100 compounds, and it was found that an SVM regression model with an RBF kernel could predict the melting point with a mean absolute error of 15.5854 and a root mean squared error of 19.7576.
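
A minimal sketch of this kind of model, assuming scikit-learn; the descriptor matrix and melting points below are synthetic stand-ins, and the hyperparameters are illustrative rather than those of the study.

    # SVM regression with an RBF kernel, reporting MAE and RMSE on held-out compounds.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    rng = np.random.default_rng(1)
    X = rng.random((100, 20))                          # placeholder: 100 compounds x 20 descriptors
    y = 300 + 100 * X[:, 0] + rng.normal(0, 15, 100)   # placeholder melting points

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
    model = SVR(kernel='rbf', C=10.0, epsilon=1.0).fit(X_tr, y_tr)

    pred = model.predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = np.sqrt(mean_squared_error(y_te, pred))
    print(f"MAE={mae:.3f}  RMSE={rmse:.3f}")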

Relevance:

20.00%

Publisher:

Abstract:

The development of conceptual knowledge systems specifically requires knowledge acquisition tools within the framework of formal concept analysis. In this paper, the existing tools are presented and further developments are discussed.
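
For readers unfamiliar with formal concept analysis, the toy sketch below (ours, not one of the tools discussed) enumerates the formal concepts of a tiny object-attribute context by brute force; the objects, attributes and incidence relation are invented.

    # Brute-force enumeration of the formal concepts of a small context.
    from itertools import combinations

    objects = ['duck', 'frog', 'dog']
    attributes = ['swims', 'flies', 'barks']
    incidence = {('duck', 'swims'), ('duck', 'flies'), ('frog', 'swims'), ('dog', 'barks')}

    def intent(objs):
        return {a for a in attributes if all((o, a) in incidence for o in objs)}

    def extent(attrs):
        return {o for o in objects if all((o, a) in incidence for a in attrs)}

    concepts = set()
    for r in range(len(objects) + 1):
        for objs in combinations(objects, r):
            A = extent(intent(set(objs)))        # closure of the object set
            concepts.add((frozenset(A), frozenset(intent(A))))

    for ext, inte in sorted(concepts, key=lambda c: len(c[0])):
        print(sorted(ext), sorted(inte))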

Relevance:

20.00%

Publisher:

Abstract:

This thesis on Thermal Flow Drilling and Flowtap in thin sheet metal and pipes of copper and copper alloys had the following objectives: to understand the behaviour of copper and copper-alloy sheet metal during Thermal Flow Drilling with standard tools, to determine the speed and feed settings that give the best bushing quality, to determine the best speed for the form tapping process, and to find the best bushing length in pure copper pipes for solar water heat exchangers. Thermal Flow Drilling (TFD) and Form Tapping (FT) form one of the research lines of the Institute of Production and Logistics (IPL) at the University of Kassel. In December 1995 a working meeting of members of the IPL, Santa Catarina University (Brazil), Buenos Aires University (Argentina) and Tarapacá University (UTA, Chile) and the CEO of Flowdrill B.V. was held in Brazil. The group decided that the Manufacturing Laboratory (ML) of UTA would work with pure copper and brass-alloy sheet metal and pure copper pipes in order to develop a solar water heat exchanger. Flowdrill B.V. sent tools to Tarapacá University in 1996, and in 1999 the IPL and the ML carried out the ALECHILE research project, promoted by the DAAD and CONICyT, on copper sheet metal, copper pipes and α-brass-alloy sheet metal.

The standard tools are lobed, conical tungsten carbide tools. When rotated at high speed and pressed with high axial force into sheet metal or a thin-walled tube, the generated heat softens the metal and allows the drill to feed forward, producing a hole and simultaneously forming a bushing from the displaced material. Many variants are available on the market; in this thesis short and long standard TFD tools are used. To evaluate the results, four quality classes of the frayed-end bushing were taken as reference, the best being quality class I. Pure copper and α-brass-alloy sheet metals of different thicknesses were used, with different TFD drill diameters for four thread types from M5 to M10. As in earlier studies on aluminium sheet, a predrilling step with HSS drills of about 30% of the TFD diameter (1.5 to 3.0 mm) was used. In a further step only 2.0 mm thick sheet and a 9.2 mm TFD drill for M10 threads were used. For commercial pure copper pipes, ¾" pipe and a 12.8 mm TFD drill were used to produce the holes for the 3/8" pipes, with various standard HSS drills for predrilling. The chemical composition of the sheet metal was taken as the reference for material behaviour: the Chilean pure copper contains 99.35% Cu and 0.163% Zn, and the Chilean α-brass alloy contains 75.6% Cu and 24.0% Zn. Two German α-brass alloys were also used: No. 1 with 61.6% Cu, 36.03% Zn and 2.2% Pb, and No. 2 with 63.1% Cu, 36.7% Zn and no lead. The equipment used comprised a HAAS CNC milling centre, a Kistler dynamometer, a Pentium II PC with an acquisition card and the TESTPOINT and XAct software, a 3D coordinate measuring machine, a micro-hardness tester, a universal testing machine and a metallographic microscope.

During the tests, feed force and torque curves were recorded that show the material behaviour during the TFD process; in general three phases can be distinguished. The best machining data for class I bushings were obtained for the various thicknesses of the Chilean copper and α-brass sheets. For the α-brass alloys, the chemical composition and the TFD process temperature have a strong influence. The temperature reaches about 400 °C during the TFD process, and when the α-brass alloy contains zinc but no lead the bushing quality is class I. However, when the α-brass alloy contains a few percent of lead, which melts well below the process temperature, no usable bushing can be obtained, because the lead liquefies and the metallographic structure breaks apart. During the TFD process, recrystallized structures form around the copper and α-brass bushings, which increases the hardness in these zones. When the threads were produced by form tapping with Flowtap tools, this additional hardness gave the threads a high load limit when they were tested in a special fixture developed for this purpose. To eliminate the predrilling step with standard HSS drills, a compound tool was developed; with this new tool it was also possible to obtain the best machining data for class I bushings. For the copper pipes, bushings made without predrilling reached only quality class IV, whereas with predrilling class I bushings were obtained. Using different HSS drill diameters, bushings of different lengths were produced and soldered, with four types of solder, between the 3/8" pipes and the larger ¾" pipe. These soldered joints were tested in tension: all the 3/8" pipes broke, while the soldered zones showed no failure. Finally, several solar water heat exchangers were built and tested.

In conclusion, this thesis shows that Thermal Flow Drilling in thin sheets of copper and copper alloys needs a predrilling step to obtain frayed-end quality class I bushings, as with thin aluminium sheets. The compound tool that was developed can produce quality class I bushings while eliminating the predrilling step. The recrystallization of the bushing, a product of the friction between tool and material, increases the hardness, which is advantageous for form tapping. The methodology developed for commercial copper pipes makes it possible to build solar water heat exchangers.

Relevance:

20.00%

Publisher:

Abstract:

The increasing interconnection of information and communication systems leads to further growth in complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long since ceased to provide adequate protection against intrusions into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and in the development of an adaptive detection model that operates in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of the normal network state (NNB), and an update model. In OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analyzed and transformed into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially further developed. Different approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined in order to uncover unknown malicious connections; the stability of the growth topology is increased by novel approaches for initializing the weight vectors and by strengthening the winner neurons; and a self-adaptive procedure is introduced so that the model can be updated continuously. Furthermore, the main task of the NNB model is to further examine the unknown connections detected by the EGHSOM and to verify whether they are normal. However, due to the concept drift phenomenon, network traffic changes constantly, which in real time leads to non-stationary network data. This phenomenon is better controlled by the update model. The EGHSOM model can effectively detect new anomalies, and the NNB model adapts well to changes in the network data. In the experimental evaluation the framework showed promising results. In the first experiment the framework was evaluated in offline mode: OptiFilter was evaluated with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully transformed the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely. A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all of them. This can be attributed to the following key points: the processing of the collected network data, the achievement of the best performance (e.g. overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
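
As a rough illustration of the anomaly-detection idea only (a plain self-organizing map rather than the EGHSOM developed in the dissertation), the sketch below trains a small map on synthetic "normal" connection vectors and flags connections whose quantization error exceeds a percentile threshold, a crude stand-in for the confidence-margin idea; all data and parameters are placeholders.

    # Flat SOM trained on "normal" vectors; large quantization error => flagged as anomalous.
    import numpy as np

    rng = np.random.default_rng(42)
    normal = rng.normal(0.0, 1.0, size=(1000, 8))       # placeholder connection vectors

    rows, cols, dim = 5, 5, normal.shape[1]
    weights = rng.normal(0.0, 1.0, size=(rows * cols, dim))
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)

    for epoch in range(20):
        lr = 0.5 * (1 - epoch / 20)                      # decaying learning rate
        sigma = 2.0 * (1 - epoch / 20) + 0.5             # decaying neighbourhood radius
        for x in normal:
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            dist = np.linalg.norm(grid - grid[bmu], axis=1)
            h = np.exp(-(dist ** 2) / (2 * sigma ** 2))  # neighbourhood function
            weights += lr * h[:, None] * (x - weights)

    # Threshold taken from the training data's own quantization errors (99th percentile).
    qe = np.array([np.min(np.linalg.norm(weights - x, axis=1)) for x in normal])
    threshold = np.percentile(qe, 99)

    suspicious = rng.normal(4.0, 1.0, size=(5, 8))       # placeholder "unknown" connections
    for x in suspicious:
        err = np.min(np.linalg.norm(weights - x, axis=1))
        print("anomalous" if err > threshold else "normal", round(float(err), 3))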

Relevance:

20.00%

Publisher:

Abstract:

Machine translation has been a particularly difficult problem in the area of Natural Language Processing for over two decades. Early approaches to translation failed in part because the interaction effects of complex linguistic phenomena made translation appear unmanageable. Later approaches to the problem have succeeded (although only bilingually), but are based on many language-specific rules of a context-free nature. This report presents an alternative approach to natural language translation that relies on principle-based descriptions of grammar rather than rule-oriented descriptions. The model that has been constructed is based on abstract principles as developed by Chomsky (1981) and several other researchers working within the "Government and Binding" (GB) framework. Thus, the grammar is viewed as a modular system of principles rather than a large set of ad hoc language-specific rules.

Relevance:

20.00%

Publisher:

Abstract:

The dataflow model of computation exposes and exploits parallelism in programs without requiring programmer annotation; however, instruction-level dataflow is too fine-grained to be efficient on general-purpose processors. A popular solution is to develop a "hybrid" model of computation where regions of dataflow graphs are combined into sequential blocks of code. I have implemented such a system to allow the J-Machine to run Id programs, leaving a large amount of parallelism exposed, such as among loop iterations. I describe this system and provide an analysis of its strengths and weaknesses and those of the J-Machine, along with ideas for improvement.
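
A toy sketch (not the system described here) of the hybrid idea: a small dataflow graph whose nodes are grouped into coarser blocks that each execute sequentially, while blocks with no mutual dependencies remain candidates for parallel dispatch; the graph and the grouping are invented.

    # Tiny dataflow graph executed as hand-grouped sequential blocks.
    import operator

    # node -> (function, list of input nodes); leaf nodes carry constants.
    graph = {
        'a': (lambda: 2, []),
        'b': (lambda: 3, []),
        'c': (operator.add, ['a', 'b']),
        'd': (operator.mul, ['c', 'c']),
    }

    # Each block runs sequentially; independent blocks could be dispatched in parallel
    # (e.g. across loop iterations).
    blocks = [['a'], ['b'], ['c', 'd']]

    values = {}
    for block in blocks:
        for node in block:
            fn, deps = graph[node]
            values[node] = fn(*(values[d] for d in deps))

    print(values['d'])   # (2 + 3) * (2 + 3) = 25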

Relevance:

20.00%

Publisher:

Abstract:

In early stages of architectural design, as in other design domains, the language used is often very abstract. In architectural design, for example, architects and their clients use experiential terms such as "private" or "open" to describe spaces. If we are to build programs that can help designers during this early-stage design, we must give those programs the capability to deal with concepts on the level of such abstractions. The work reported in this thesis sought to do that, focusing on two key questions: How are abstract terms such as "private" and "open" translated into physical form? How might one build a tool to assist designers with this process? The Architect's Collaborator (TAC) was built to explore these issues. It is a design assistant that supports iterative design refinement, and that represents and reasons about how experiential qualities are manifested in physical form. Given a starting design and a set of design goals, TAC explores the space of possible designs in search of solutions that satisfy the goals. It employs a strategy we've called dependency-directed redesign: it evaluates a design with respect to a set of goals, then uses an explanation of the evaluation to guide proposal and refinement of repair suggestions; it then carries out the repair suggestions to create new designs. A series of experiments was run to study TAC's behavior. Issues of control structure, goal set size, goal order, and modification operator capabilities were explored. In addition, TAC's use as a design assistant was studied in an experiment using a house in the process of being redesigned. TAC's use as an analysis tool was studied in an experiment using Frank Lloyd Wright's Prairie houses.
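
A minimal sketch of the dependency-directed redesign loop as we read it (not TAC's actual representation): evaluation identifies the failed goals, and the failure is used to pick a repair operator that produces a new candidate design; the design dictionary, goals and operators below are invented for illustration.

    # Generate-evaluate-repair loop driven by failed goals.
    design = {'living_room_openness': 0.3, 'bedroom_privacy': 0.9}

    goals = {
        'living_room_openness': lambda d: d['living_room_openness'] >= 0.6,
        'bedroom_privacy': lambda d: d['bedroom_privacy'] >= 0.7,
    }

    # Repair operators indexed by the goal whose failure they address.
    repairs = {
        'living_room_openness': lambda d: {**d, 'living_room_openness': d['living_room_openness'] + 0.2},
        'bedroom_privacy': lambda d: {**d, 'bedroom_privacy': d['bedroom_privacy'] + 0.1},
    }

    for iteration in range(10):
        failed = [g for g, ok in goals.items() if not ok(design)]   # evaluation / "explanation"
        if not failed:
            break
        design = repairs[failed[0]](design)                         # repair guided by the failure
        print(iteration, failed[0], design)

    print("final design:", design)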

Relevance:

20.00%

Publisher:

Abstract:

In this thesis, I designed and implemented a virtual machine (VM) for a monomorphic variant of Athena, a type-omega denotational proof language (DPL). This machine attempts to maintain the minimum state required to evaluate Athena phrases. This thesis also includes the design and implementation of a compiler for monomorphic Athena that compiles to the VM. Finally, it includes details on my implementation of a read-eval-print loop that glues together the VM core and the compiler to provide a full, user-accessible interface to monomorphic Athena. The Athena VM provides the same basis for DPLs that the SECD machine does for pure, functional programming and the Warren Abstract Machine does for Prolog.
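
To make the compile-to-VM-then-REPL arrangement concrete, here is a toy sketch in the same spirit; it is ours, bears no relation to Athena's actual instruction set, and handles only postfix arithmetic.

    # Tiny compiler to a stack machine plus a stand-in read-eval-print loop.
    def compile_expr(tokens):
        # Compile a postfix expression such as "3 4 +" into VM instructions.
        code = []
        for tok in tokens.split():
            if tok in {'+', '*'}:
                code.append(('OP', tok))
            else:
                code.append(('PUSH', float(tok)))
        return code

    def run(code):
        stack = []
        for instr, arg in code:
            if instr == 'PUSH':
                stack.append(arg)
            else:  # 'OP'
                b, a = stack.pop(), stack.pop()
                stack.append(a + b if arg == '+' else a * b)
        return stack.pop()

    for line in ['3 4 +', '2 3 4 * +']:     # stand-in for an interactive REPL
        print(line, '=>', run(compile_expr(line)))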

Relevance:

20.00%

Publisher:

Abstract:

We compare Naive Bayes and Support Vector Machines on the task of multiclass text classification. Using a variety of approaches to combine the underlying binary classifiers, we find that SVMs substantially outperform Naive Bayes. We present full multiclass results on two well-known text data sets, including the lowest error to date on both data sets. We develop a new indicator of binary performance to show that the SVM's lower multiclass error is a result of its improved binary performance. Furthermore, we demonstrate and explore the surprising result that one-vs-all classification performs favorably compared to other approaches even though it has no error-correcting properties.
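
A minimal sketch of such a comparison, assuming scikit-learn (not the paper's original implementation or data splits): multinomial Naive Bayes against a linear SVM combined one-vs-rest, on a standard text data set (20 Newsgroups is used here purely as an example).

    # Naive Bayes vs. one-vs-rest linear SVM on a multiclass text classification task.
    from sklearn.datasets import fetch_20newsgroups
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.svm import LinearSVC
    from sklearn.multiclass import OneVsRestClassifier

    train = fetch_20newsgroups(subset='train')
    test = fetch_20newsgroups(subset='test')

    vec = TfidfVectorizer()
    X_tr, X_te = vec.fit_transform(train.data), vec.transform(test.data)

    nb = MultinomialNB().fit(X_tr, train.target)
    svm = OneVsRestClassifier(LinearSVC()).fit(X_tr, train.target)

    print("Naive Bayes accuracy:", nb.score(X_te, test.target))
    print("one-vs-rest SVM accuracy:", svm.score(X_te, test.target))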

Relevance:

20.00%

Publisher:

Abstract:

Support Vector Machine Regression (SVMR) is a regression technique recently introduced by V. Vapnik and his collaborators (Vapnik, 1995; Vapnik, Golowich and Smola, 1996). In SVMR the goodness of fit is measured not by the usual quadratic loss function (the mean square error), but by a different loss function called Vapnik's $\epsilon$-insensitive loss function, which is similar to the "robust" loss functions introduced by Huber (Huber, 1981). The quadratic loss function is well justified under the assumption of Gaussian additive noise. However, the noise model underlying the choice of Vapnik's loss function is less clear. In this paper the use of Vapnik's loss function is shown to be equivalent to a model of additive Gaussian noise in which the variance and mean of the Gaussian are random variables; the probability distributions of the variance and mean are stated explicitly. While this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori approach. It applies not only to Vapnik's loss function, but to a much broader class of loss functions.
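
For reference, the $\epsilon$-insensitive loss discussed here is usually written as follows: residuals smaller than $\epsilon$ incur no penalty, and larger ones are penalized linearly.

    L_\epsilon(y, f(x)) = \max(0, \, |y - f(x)| - \epsilon)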

Relevance:

20.00%

Publisher:

Abstract:

The log-ratio methodology provides powerful tools for analyzing compositional data. Nevertheless, this methodology can only be applied to data sets without null values; consequently, data sets in which zeros are present require a prior treatment. Recent advances in the treatment of compositional zeros have centered especially on zeros of a structural nature and on rounded zeros. These tools do not cover the particular case of count compositional data sets with null values. In this work we deal with "count zeros" and introduce a treatment based on mixed Bayesian-multiplicative estimation: we use the Dirichlet probability distribution as a prior and estimate the posterior probabilities, and we then apply a multiplicative modification to the non-zero values. We present a case study where this new methodology is applied. Key words: count data, multiplicative replacement, composition, log-ratio analysis
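
A minimal sketch of the Bayesian-multiplicative idea as we understand it (not necessarily the paper's exact estimator): a uniform Dirichlet prior of strength s yields posterior estimates that replace the count zeros, and the non-zero parts are then rescaled multiplicatively so the composition still sums to one; the prior choice and parameter names are ours.

    # Bayesian-multiplicative replacement of count zeros in a composition.
    import numpy as np

    def bayes_mult_replace(counts, s=0.5):
        counts = np.asarray(counts, dtype=float)
        n, D = counts.sum(), counts.size
        t = np.full(D, 1.0 / D)                      # uniform prior expectation
        posterior = (counts + s * t) / (n + s)       # Dirichlet posterior mean
        zeros = counts == 0
        comp = np.where(zeros, posterior, counts / n)
        # Multiplicative adjustment: shrink the non-zero parts to keep the total equal to 1.
        comp[~zeros] *= (1.0 - comp[zeros].sum()) / (counts[~zeros] / n).sum()
        return comp

    x = bayes_mult_replace([12, 0, 5, 0, 83])
    print(x, x.sum())   # replaced composition, sums to 1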

Relevance:

20.00%

Publisher:

Abstract:

Blogging has become one of the key ingredients of so-called social networks, and this phenomenon has also reached the world of education. Connections between people, comments on each other's posts, and assessment of innovation are typically interesting characteristics of blogs run by students and scholars. Blogs have become a kind of new form of authority, bringing about (divergent) discussions which lead to the creation of knowledge. The use of blogs as an innovative educational tool is not at all new; however, their use in universities is not very widespread yet. Blogging for personal affairs is rather commonplace, but blogging for professional affairs (teaching, research and service) is scarce, despite the availability of ready-to-use, free tools. Unfortunately, the Information Society has not yet sufficiently reached some universities: not only are (student) blogs scarcely used as an educational tool, but it is quite rare to find a blog written by university professors. The Institute of Computational Chemistry of the University of Girona and the Department of Chemistry of the Universitat Autònoma de Barcelona have joined forces to create "InnoCiència", a new Group on Digital Science Communication. This group, formed by around ten researchers, has promoted the use of blogs, Twitter, wikis and other Web 2.0 tools in activities in Catalonia concerning the dissemination of science, such as Science Week, Open Day or Researchers' Night. Likewise, its members promote the use of social networking tools in chemistry- and communication-related courses. This communication explains the outcome of social-network experiences in teaching undergraduate students and organizing research communication events. We provide live, hands-on examples and interactive material to show how blogs and Twitter can be used to enhance the yield of teaching and research. The impact of blogging and other social networking tools on the outcome of the learning process depends strongly on the target audience and the environmental conditions. A few examples are provided, and some proposals for using these techniques efficiently to help students are outlined.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a tool for the analysis and regeneration of Web content, implemented with XML and Java. At present, Web content delivery from server to clients is carried out without taking the clients' characteristics into account. Heterogeneous and diverse characteristics, such as users' preferences, the varying capabilities of client devices, different types of access, the state of the network and the current load on the server, directly affect the behavior of Web services. Moreover, the growing use of multimedia objects in the design of Web content takes no account of this diversity and heterogeneity, which hinders appropriate content delivery even further. The objective of the presented tool is therefore the treatment of Web pages taking this heterogeneity into account and adapting content in order to improve performance on the Web.
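
Purely for illustration (the tool itself is implemented with XML and Java), the sketch below shows one way content might be adapted to a client profile, choosing an image variant from screen width and connection speed; the profile fields and thresholds are invented.

    # Pick the image variant that best matches a client's capabilities.
    def adapt_image(variants, client):
        # variants: list of dicts with 'width' and 'bytes'; choose the largest one that fits
        # the client's screen and, on a slow connection, stays under a size budget.
        budget = 50_000 if client['connection'] == 'slow' else float('inf')
        fitting = [v for v in variants
                   if v['width'] <= client['screen_width'] and v['bytes'] <= budget]
        return max(fitting, key=lambda v: v['width']) if fitting else min(variants, key=lambda v: v['bytes'])

    variants = [{'width': 320, 'bytes': 20_000}, {'width': 1280, 'bytes': 300_000}]
    print(adapt_image(variants, {'screen_width': 1440, 'connection': 'slow'}))
    print(adapt_image(variants, {'screen_width': 1440, 'connection': 'fast'}))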