977 results for Burroughs D-machine (Computer)
Abstract:
Different methods of cutting fluid application were used in the turning of a difficult-to-machine steel (SAE EV-8). A semi-synthetic cutting fluid was applied using a conventional method, minimum quantity of cutting fluid (MQCF), and pulverization. A vegetable-oil lubricant was also applied by the minimum quantity method (MQL). Thereafter, a high-pressure cutting fluid jet (3.0 MPa) was applied individually to the following regions: the chip-tool interface, the top surface of the chip, and the tool-workpiece contact. Two further methods were used: a combination of conventional application with the chip-tool interface jet and, finally, three jets applied simultaneously. In order to carry out these tests, it was necessary to set up a high-pressure system using a piston pump for generating the cutting fluid jet, a Venturi for fluid application (MQCF and MQL), and a nozzle for cutting fluid pulverization. The output variables analyzed included tool life, surface roughness, cutting tool temperature, cutting force, chip form, chip compression rate and machined specimen microstructure. Tool life increases and cutting force decreases with the application of the cutting fluid jet, mainly when it is directed at the chip-tool interface. Excluding the jet-based methods, the conventional method appears to be more efficient than the other low-pressure methods. © (2013) Trans Tech Publications, Switzerland.
Abstract:
Graduate Program in Forest Science - FCA
Abstract:
Modeling is a necessary step in performing a finite element analysis. Different methods of model construction are reported in the literature, such as Bio-CAD modeling. The purpose of this study was to evaluate and apply, in finite element analysis, models of a human edentulous hemi-mandible built with two Bio-CAD modeling methods. A stereolithographic model was reconstructed from CT scans of a dried human skull. Two modeling methods were performed: an STL conversion approach associated with STL simplification (Model 1) and a reverse engineering approach (Model 2). For the finite element analysis, the action of the lateral pterygoid muscle was used as the loading condition to assess total displacement (D), equivalent von Mises stress (VM) and maximum principal stress (MP). The two models differed in geometry regarding the number of surfaces (1834 for Model 1; 282 for Model 2). Differences were also observed in the finite element mesh regarding the number of nodes and elements (30,428 nodes/16,683 elements for Model 1; 15,801 nodes/8,410 elements for Model 2). D, VM and MP stress areas presented similar distributions in the two models. The maximum and minimum values differed: D ranged from 0 to 0.511 mm (Model 1) and from 0 to 0.544 mm (Model 2), VM stress from 6.36E-04 to 11.4 MPa (Model 1) and from 2.15E-04 to 14.7 MPa (Model 2), and MP stress from -1.43 to 9.14 MPa (Model 1) and from -1.2 to 11.6 MPa (Model 2). Of the two Bio-CAD modeling methods, reverse engineering presented better anatomical representation than the STL conversion approach. The models presented differences in the finite element mesh, total displacement and stress distribution.
Abstract:
This article describes the use of Artificial Intelligence (AI) techniques applied in the cells of a manufacturing system. Machine vision was used to identify the pieces, and their positions, of two different products to be assembled on the same production line. This information is given as input to an AI planner embedded in the manufacturing system, so that initial and final states are sent automatically to the planner, which generates assembly plans for a robotic cell in real time.
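As a rough illustration of this pipeline, the sketch below encodes vision detections as an initial state and the product recipe as a goal state, and lets a toy planner emit pick-and-place actions. All names, the data, and the planner itself are hypothetical; the article's actual vision system and AI planner are not reproduced here.

```python
# Hypothetical sketch: vision detections become the initial state, the
# product recipe becomes the goal state, and a trivial "planner" emits one
# pick-and-place action per misplaced piece for the robotic cell.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Detection:          # output of the machine-vision stage (assumed format)
    piece: str
    x: float
    y: float


def make_initial_state(detections: List[Detection]) -> Dict[str, Tuple[float, float]]:
    return {d.piece: (d.x, d.y) for d in detections}


def plan_assembly(initial: Dict[str, Tuple[float, float]],
                  goal: Dict[str, Tuple[float, float]]) -> List[str]:
    """Very small planner stand-in: one action per piece not yet at its target."""
    actions = []
    for piece, target in goal.items():
        current = initial.get(piece)
        if current is None:
            raise ValueError(f"piece {piece!r} not found by vision")
        if current != target:
            actions.append(f"pick {piece} at {current}; place at {target}")
    return actions


if __name__ == "__main__":
    detections = [Detection("base", 10.0, 5.0), Detection("cover", 30.0, 12.0)]
    goal = {"base": (50.0, 50.0), "cover": (50.0, 55.0)}
    for step in plan_assembly(make_initial_state(detections), goal):
        print(step)
```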
Abstract:
Active machine learning algorithms are used when large numbers of unlabeled examples are available and obtaining labels for them is costly (e.g., requiring consultation with a human expert). Many conventional active learning algorithms focus on refining the decision boundary at the expense of exploring new regions that the current hypothesis misclassifies. We propose a new active learning algorithm that balances such exploration with refinement of the decision boundary by dynamically adjusting the probability of exploring at each step. Our experimental results demonstrate improved performance on data sets that require extensive exploration while remaining competitive on data sets that do not. Our algorithm also shows significant tolerance to noise.
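A minimal sketch of the exploration/refinement balance described above, assuming a simple rule that raises the exploration probability when recent queries stop improving accuracy and lowers it otherwise. The update rule, constants, and helper names are illustrative assumptions, not the algorithm proposed in the paper.

```python
# Illustrative only: p_explore is adjusted dynamically; exploitation queries
# the point closest to the decision boundary, exploration queries at random.

import numpy as np
from sklearn.linear_model import LogisticRegression


def active_learn(X_pool, y_oracle, n_queries=50, seed=0):
    """Toy active learner; y_oracle stands in for the costly labeling oracle."""
    rng = np.random.default_rng(seed)
    # seed the labeled set with one example per class so the model can be fit
    labeled = [int(rng.choice(np.where(y_oracle == c)[0])) for c in np.unique(y_oracle)]
    p_explore, prev_score = 0.5, 0.0
    clf = LogisticRegression(max_iter=1000)

    for _ in range(n_queries):
        clf.fit(X_pool[labeled], y_oracle[labeled])
        unlabeled = [i for i in range(len(X_pool)) if i not in labeled]
        if rng.random() < p_explore:
            # exploration: sample a region the current hypothesis may misclassify
            query = int(rng.choice(unlabeled))
        else:
            # exploitation: refine the boundary at the smallest decision margin
            margins = np.abs(clf.decision_function(X_pool[unlabeled]))
            query = unlabeled[int(np.argmin(margins))]
        labeled.append(query)  # the oracle's label is y_oracle[query]

        # dynamically adjust the exploration probability (assumed rule):
        # explore more when accuracy on the labeled set stops improving
        score = clf.score(X_pool[labeled], y_oracle[labeled])
        p_explore = (min(0.9, p_explore + 0.05) if score <= prev_score
                     else max(0.1, p_explore - 0.05))
        prev_score = score
    return clf, labeled
```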
Abstract:
One problem with using a component-based software development approach is that once software modules are reused over generations of products, they form legacy structures that can be challenging to understand, making it difficult to validate these systems. Therefore, tools and methodologies that enable engineers to see the interactions of these software modules will enhance their ability to make these software systems more dependable. To address this need, we propose SimSight, a framework that captures dynamic call graphs in Simics, a widely adopted commercial full-system simulator. Simics is a software system that simulates complete computer systems; it performs nearly identical tasks to a real system, but at a much lower speed, while providing greater execution observability. We have implemented SimSight to generate dynamic call graphs of statically and dynamically linked functions in an x86/Linux environment. A case study illustrates how SimSight can be used to identify the sources of software errors. We then evaluate its performance using 12 integer programs from the SPEC CPU2006 benchmark suite.
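To illustrate the data structure behind a dynamic call graph, the sketch below folds a stream of call/return events into caller-callee edges with invocation counts. The event format and the code are assumptions made purely for illustration; they do not reflect SimSight's actual Simics integration or API.

```python
# Sketch: fold a trace of ('call', name) / ('return', name) events, such as a
# full-system simulator could expose, into caller -> callee edge counts.

from collections import defaultdict


def build_call_graph(events):
    """events: iterable of ('call', function_name) or ('return', function_name)."""
    edges = defaultdict(int)          # (caller, callee) -> number of invocations
    stack = ["<root>"]                # current call stack, seeded with a root
    for kind, fn in events:
        if kind == "call":
            edges[(stack[-1], fn)] += 1
            stack.append(fn)
        elif kind == "return" and len(stack) > 1:
            stack.pop()
    return dict(edges)


trace = [("call", "main"), ("call", "parse"), ("return", "parse"),
         ("call", "dlopen"), ("call", "plugin_init"), ("return", "plugin_init"),
         ("return", "dlopen"), ("return", "main")]
print(build_call_graph(trace))
```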
Abstract:
"How large a sample is needed to survey the bird damage to corn in a county in Ohio or New Jersey or South Dakota?" Like those in the Bureau of Sport Fisheries and Wildlife and the U.S.D.A. who have been faced with a question of this sort we found only meager information on which to base an answer, whether the problem related to a county in Ohio or to one in New Jersey, or elsewhere. Many sampling methods and rates of sampling did yield reliable estimates but the judgment was often intuitive or based on the reasonableness of the resulting data. Later, when planning the next study or survey, little additional information was available on whether 40 samples of 5 ears each or 5 samples of 200 ears should be examined, i.e., examination of a large number of small samples or a small number of large samples. What information is needed to make a reliable decision? Those of us involved with the Agricultural Experiment Station regional project concerned with the problems of bird damage to crops, known as NE-49, thought we might supply an ans¬wer if we had a corn field in which all the damage was measured. If all the damage were known, we could then sample this field in various ways and see how the estimates from these samplings compared to the actual damage and pin-point the best and most accurate sampling procedure. Eventually the investigators in four states became involved in this work1 and instead of one field we were able to broaden the geographical base by examining all the corn ears in 2 half-acre sections of fields in each state, 8 sections in all. When the corn had matured well past the dough stage, damage on each corn ear was assessed, without removing the ear from the stalk, by visually estimating the percent of the kernel surface which had been destroyed and rating it in one of 5 damage categories. Measurements (by row-centimeters) of the rows of kernels pecked by birds also were made on selected ears representing all categories and all parts of each field section. These measurements provided conversion factors that, when fed into a computer, were applied to the more than 72,000 visually assessed ears. The machine now had in its memory and could supply on demand a map showing each ear, its location and the intensity of the damage.
Abstract:
This work describes a methodology to simulate free surface incompressible multiphase flows. This novel methodology allows the simulation of multiphase flows with an arbitrary number of phases, each of them having different densities and viscosities. Surface and interfacial tension effects are also included. The numerical technique is based on the GENSMAC front-tracking method. The velocity field is computed using a finite-difference discretization of a modification of the Navier-Stokes equations. These equations, together with the continuity equation, are solved for two-dimensional multiphase flows with different densities and viscosities in the different phases. The governing equations are solved on a regular Eulerian grid, and a Lagrangian mesh is employed to track free surfaces and interfaces. The method is validated by comparing numerical with analytic results for a number of simple problems; it was also employed to simulate complex problems for which no analytic solutions are available. The method presented in this paper has been shown to be robust and computationally efficient. Copyright (c) 2012 John Wiley & Sons, Ltd.
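For reference, the standard incompressible Navier-Stokes momentum and continuity equations, with phase-dependent density ρ and viscosity μ, are the starting point for the modified formulation mentioned above (the surface and interfacial tension contributions are not shown here):

```latex
% Standard incompressible Navier-Stokes and continuity equations with
% variable density \rho and viscosity \mu; surface and interfacial tension
% terms of the paper's modified formulation are omitted.
\begin{align}
  \frac{\partial \mathbf{u}}{\partial t}
    + \nabla \cdot (\mathbf{u}\,\mathbf{u})
    &= -\frac{1}{\rho}\,\nabla p
       + \frac{1}{\rho}\,\nabla \cdot
         \bigl[\mu\bigl(\nabla \mathbf{u} + (\nabla \mathbf{u})^{T}\bigr)\bigr]
       + \mathbf{g}, \\
  \nabla \cdot \mathbf{u} &= 0 .
\end{align}
```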
Abstract:
Surveillance Levels (SLs) are categories for medical patients (used in Brazil) that represent different types of medical recommendations. SLs are defined according to risk factors and the medical and developmental history of patients. Each SL is associated with specific educational and clinical measures. The present paper proposes and verifies a computer-aided approach for the automatic recommendation of SLs. The approach is based on the classification of information from patient electronic records. For this purpose, a software architecture composed of three layers was developed. The architecture includes a classification layer comprising a linguistic module and machine learning classification modules. The classification layer allows for the use of different classification methods, including the use of preprocessed, normalized language data drawn from the linguistic module. We report the verification and validation of the software architecture in a Brazilian pediatric healthcare institution. The results indicate that the selection of attributes can have a great effect on the performance of the system. Nonetheless, our automatic recommendation of surveillance levels can still benefit from improvements in processing procedures when the linguistic module is applied prior to classification. Results from our efforts can be applied to different types of medical systems, and systems supported by the framework presented in this paper may be used by healthcare and governmental institutions to improve healthcare services in terms of establishing preventive measures and alerting authorities about the possibility of an epidemic.
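As one possible, purely illustrative concretization of such a classification layer, the sketch below chains a stand-in for the linguistic module (text normalization) with a TF-IDF representation and a linear classifier. The records, surveillance level labels, and preprocessing are fabricated and do not reflect the system's actual modules or data.

```python
# Illustrative pipeline: normalized record text -> TF-IDF features -> classifier
# mapping patient records to surveillance levels. All data below are fabricated.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline


def normalize(text: str) -> str:
    """Stand-in for the linguistic module: lowercasing and whitespace cleanup."""
    return " ".join(text.lower().split())


records = ["preterm birth low weight needs follow up",
           "healthy full term routine visit",
           "congenital heart disease specialist referral",
           "routine vaccination no risk factors"]
levels = ["SL3", "SL1", "SL3", "SL1"]          # toy surveillance levels

pipeline = Pipeline([
    ("tfidf", TfidfVectorizer(preprocessor=normalize)),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipeline.fit(records, levels)
print(pipeline.predict(["low birth weight follow up recommended"]))
```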
Abstract:
Re-identification is commonly accomplished using appearance features based on salient points and color information. In this paper, we present a study of the use of different features obtained exclusively from depth images captured with RGB-D cameras. The results achieved, using simple geometric features extracted in a top-view setup, seem to provide useful descriptors for the re-identification task.
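A minimal sketch of the kind of geometric descriptors that can be read off a top-view depth image: estimated person height from the minimum depth inside the foreground mask, footprint area, and bounding-box aspect ratio. The thresholds and feature set are assumptions for illustration only, not the features used in the paper.

```python
# Sketch: simple geometric features from a top-view (ceiling-mounted) depth
# camera. With the camera looking straight down, the smallest depth inside the
# person mask corresponds to the head, and the mask area to the body footprint.

import numpy as np


def geometric_features(depth, floor_depth, person_threshold=0.2):
    """depth: HxW array of distances (metres); floor_depth: distance to the floor."""
    mask = depth < (floor_depth - person_threshold)      # pixels above the floor
    if not mask.any():
        return None
    height = floor_depth - depth[mask].min()             # person height estimate
    area = int(mask.sum())                                # footprint in pixels
    ys, xs = np.nonzero(mask)
    extent_y = ys.max() - ys.min() + 1                    # bounding-box extents
    extent_x = xs.max() - xs.min() + 1
    return {"height_m": float(height), "area_px": area,
            "bbox_aspect": extent_x / extent_y}


depth = np.full((120, 160), 3.0)                          # empty room, 3 m floor
depth[40:80, 60:90] = 1.3                                  # a person-shaped blob
print(geometric_features(depth, floor_depth=3.0))
```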
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the computer science department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving accessibility through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the Four Colour Theorem; • our collaboration with the D.A.M.A. Project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficulty of encoding mathematical structures with a notion of inheritance in a type theory without subtyping, like CIC.
In the second part of the manuscript many of these issues are analysed from the point of view of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user-driven.
Abstract:
Ultrasound is the diagnostic technique most widely used for screening and follow-up of patients with liver disease, with or without focal lesions, thanks to its distinctive characteristics: it is real-time, easy to handle, free of ionizing radiation and low-cost. However, compared with CT or MRI, this technique may have important limitations, such as the inability to visualize small lesions located in anatomically "difficult" areas or in obese patients, even when these lesions have already been identified with other techniques such as CT or MRI. To overcome these limitations, "image fusion" systems have been introduced that allow real-time synchronization of a real-time technique with low spatial resolution, such as ultrasound, with a static high-resolution technique such as CT or MRI. This is achieved by creating a small electromagnetic field around the patient, consisting of a generator and a sensor attached to the ultrasound transducer, and by loading into a computer coupled to the ultrasound scanner the volume rendering of the patient's abdomen obtained by multidetector CT or MRI. Precise spatial co-registration of the two modalities is obtained by identifying in both the same axial reference plane and at least 3-4 internal anatomical landmarks. Such an image fusion system could be very useful in hepatology for the non-invasive diagnosis of small hepatocellular carcinoma, which, according to the latest guidelines, requires for nodules between 1 and 2 cm a concordant contrast-enhancement behaviour of the lesion in at least two imaging techniques. The aim of our work was therefore to evaluate, in patients with liver disease, the contribution that this system can make to the identification and characterization of lesions smaller than 20 mm that had already been identified on CT or MRI as nodules suspicious for HCC but had not been visualized on conventional ultrasound. Re-identification with conventional ultrasound of nodules suspected of being HCC may make it possible, in light of the non-invasive diagnostic criteria, to avoid a further imaging technique and possibly biopsy. Patients and Methods: 17 cirrhotic patients (12 male; 5 female), with a mean age of 68.9 +/- 6.2 (SD) years, in whom contrast-enhanced CT or MRI had identified 20 new focal liver lesions smaller than 20 mm (13.6 +/- 3.6 mm), suspected of being hepatocellular carcinomas (HCC) but not identified on baseline ultrasound (performed blind to the CT or MRI findings), underwent unenhanced and contrast-enhanced ultrasound focused on a target area identified by the image fusion system, which simultaneously displays the CT or MRI images reconstructed in two-dimensional (2D), three-dimensional (3D) and real-time mode. The final diagnosis was established by diagnostic concordance according to international guidelines, or by follow-up in cases of discordance. Results: A non-invasive diagnosis of HCC was reached in 15/20 lesions initially suspected of being HCC. The fusion system identified and showed typical contrast-enhancement behaviour in 12/15 HCC nodules (80%), while 3/15 HCCs (20%) were not identified with the image fusion system.
The remaining 5/20 lesions were not visualized with the image fusion system and were finally judged to be false positives of CT and MRI, since they disappeared during the following months of follow-up, after three, six, nine, twelve and fifteen months respectively. Conclusions: Our preliminary results show that combining the image fusion system with unenhanced and contrast-enhanced ultrasound (CEUS) improves the potential of ultrasound in the identification and characterization of HCC in the cirrhotic liver, allowing a diagnosis to be reached according to non-invasive criteria and unmasking false-positive cases of CT and MRI.
Abstract:
In this work, the phase transitions of a single polymer chain were studied using the Monte Carlo method. The bond fluctuation model was used for the simulation, with an attractive square-well potential acting between all monomers of the polymer chain. Three types of moves were introduced to relax the polymer chain properly: the local hop move, the reptation move and the pivot move. A hierarchical search algorithm was introduced to check the excluded-volume interaction and to determine the number of neighbours of each monomer. The density of states of the model was determined with the Wang-Landau algorithm. From it, thermodynamic quantities were computed in order to study the phase transitions of the single polymer chain. We first studied a free polymer chain. The coil-globule transition appears as a continuous transition in which the coil collapses to a globule. The globule-globule transition at lower temperatures is a first-order phase transition, with coexistence of the liquid globule and the solid globule, which has a crystalline structure. In the thermodynamic limit the transition temperatures are identical, which corresponds to a vanishing of the liquid phase. In two dimensions the model shows a continuous coil-globule transition with a locally ordered structure. We further studied a polymer mushroom, i.e. a grafted polymer chain, between two repulsive walls at distance D. The phase behaviour of the polymer chain shows a dimensional crossover. Both grafting and confinement promote the coil-globule transition, with a symmetry breaking: the extension of the polymer chain parallel to the walls shrinks faster than the extension perpendicular to the walls. Confinement hinders the globule-globule transition, whereas grafting appears to have no influence. The transition temperatures in the thermodynamic limit are again identical within the error. The specific heat of the same model, but with a repulsive square-well potential, shows a Schottky anomaly, typical of a two-level system.
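The core loop of the Wang-Landau algorithm mentioned above can be sketched on a much simpler toy system than the bond fluctuation model: the sketch below estimates the density of states of a small one-dimensional Ising ring. The system, sweep length, flatness criterion and stopping threshold are chosen only for illustration and do not correspond to the thesis implementation.

```python
# Wang-Landau sketch on a toy 1D Ising ring: ln g(E) is built up by penalizing
# already-visited energies until the energy histogram is flat, after which the
# modification factor ln_f is halved.

import numpy as np

rng = np.random.default_rng(0)
N = 20                                         # spins on a ring
spins = rng.choice([-1, 1], size=N)


def energy(s):
    return -int(np.sum(s * np.roll(s, 1)))     # nearest-neighbour coupling J=1


energies = np.arange(-N, N + 1, 4)             # allowed energies of the ring
index = {int(e): i for i, e in enumerate(energies)}
ln_g = np.zeros(len(energies))                 # ln of the density of states
hist = np.zeros(len(energies))
ln_f, e_cur = 1.0, energy(spins)

while ln_f > 1e-3:
    for _ in range(10_000):
        k = int(rng.integers(N))
        e_new = e_cur + 2 * int(spins[k]) * int(spins[k - 1] + spins[(k + 1) % N])
        d = ln_g[index[e_cur]] - ln_g[index[e_new]]
        if d >= 0 or rng.random() < np.exp(d):  # accept with prob min(1, g_old/g_new)
            spins[k] *= -1
            e_cur = e_new
        ln_g[index[e_cur]] += ln_f
        hist[index[e_cur]] += 1
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():   # flatness check
        hist[:] = 0
        ln_f /= 2                               # refine the modification factor

print(dict(zip(energies.tolist(), np.round(ln_g - ln_g.min(), 2).tolist())))
```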