18 results for Tool path computing
Abstract:
The milling of thin parts is a high-added-value operation in which the machinist has to face the chatter problem. Studying the stability of these operations is a complex task due to the changing modal parameters as the part loses mass during machining and the complex shape of the tools used. The present work proposes a methodology for chatter avoidance in the milling of flexible thin floors with a bull-nose end mill. First, a stability model for the milling of systems that are compliant in the tool axis direction with bull-nose end mills is presented. The key contribution is an averaging method that makes it possible to use a linear model to predict the stability of the operation. Then, the procedure for calculating stability diagrams for the milling of thin floors is presented. The method is based on estimating the modal parameters of the part and the corresponding stability lobes during machining. Since in thin floor milling the depth of cut is already defined by the floor thickness prior to milling, the use of stability diagrams that relate the tool position along the tool path to the spindle speed is proposed. Hence, the sequence of spindle speeds that the tool must follow during milling can be selected. Finally, this methodology has been validated by means of experimental tests.
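The speed-selection step described above can be sketched as follows. The lobe shape, the `freq_at` model and all parameters are hypothetical placeholders for illustration, not the paper's averaged bull-nose stability model:

```python
import math

def stability_limit(spindle_speed, natural_freq, base_limit=0.4):
    """Toy stability-lobe shape: the depth-of-cut limit rises when the
    spindle speed lines up with the part's natural frequency (illustrative)."""
    ratio = natural_freq / spindle_speed
    return base_limit * (1.0 + 2.0 * abs(math.cos(math.pi * ratio)))

def select_speeds(positions, floor_thickness, candidate_speeds, freq_at):
    """For each tool position along the path, pick the first candidate
    spindle speed whose predicted limit exceeds the fixed floor thickness."""
    sequence = []
    for x in positions:
        fn = freq_at(x)  # modal parameters change as the part loses mass
        speed = next((s for s in candidate_speeds
                      if stability_limit(s, fn) > floor_thickness), None)
        sequence.append((x, speed))
    return sequence
```

A position where no candidate speed is stable is reported as `None`, signalling that the diagram offers no chatter-free speed there.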
Abstract:
Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, small and medium enterprises are supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if its characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap by presenting a real tool, already implemented and tested, which can be used as a cloud computing adoption decision tool. The tool uses a diagnosis based on specific questions to gather the required information and subsequently provides the user with valuable information for deploying the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest in and low knowledge of this subject, and the tool presented aims to redress this mismatch insofar as possible. Copyright: © 2015 Bildosola et al. This is an open access article distributed under the terms of the Creative Commons Attribution License.
Abstract:
A number of European countries, among them the UK and Spain, have opened up their Directory Enquiry Services (DQs, or 118AB) market to competition. We analyse the Spanish case, where both local and foreign firms challenged the incumbent as of April 2003. We argue that the incumbent had the ability to abuse its dominant position, and that doing so was a perfectly rational strategy. In short, the incumbent raised its rivals' costs directly by providing an inferior-quality version of the (essential) input, namely the incumbent's subscriber database. We illustrate how it is possible to quantify the effect of abuse in a situation where the entrant has no previous history in the market. To do this, we use the UK experience to construct the relevant counterfactual, that is, the "but for abuse" scenario. After controlling for relative prices and advertising intensity, we find that one of the foreign entrants achieved a Spanish market share of only half of what it would have been in the absence of abuse.
Abstract:
10 p.
Abstract:
The purpose of this paper is to present a theoretical overview of innovation management and the tools that can aid in this endeavour. The paper adopts a user-oriented description, aiming at making SMEs familiar with the possibilities opened up by innovation management tools in general and technology outlook in particular.
Abstract:
This paper proposes a new method for local key and chord estimation from audio signals. The method relies primarily on principles from music theory and does not require any training on a corpus of labelled audio files. The harmonic content of the musical piece is first extracted by computing a set of chroma vectors. A set of chord/key pairs is selected for every frame by correlation with fixed chord and key templates. An acyclic harmonic graph is constructed with these pairs as vertices, using a musical distance to weight its edges. Finally, the sequences of chords and keys are obtained by finding the best path in the graph using dynamic programming. The proposed method allows mutual chord and key estimation. It is evaluated on a corpus of Beatles songs for both the local key estimation and chord recognition tasks, as well as on a larger corpus of songs taken from the Billboard dataset.
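The final dynamic-programming step can be sketched roughly as follows; the per-frame candidates, their correlation scores and the distance function are placeholder inputs standing in for the chroma and template stages described above:

```python
def best_path(candidates, distance):
    """candidates: one list per frame of (label, score) pairs.
    Returns the label sequence minimising summed transition distance
    minus per-frame correlation scores (a Viterbi-style search)."""
    cost = [[-score for _, score in candidates[0]]]
    back = []  # back[i][j]: best predecessor of candidate j in frame i + 1
    for i in range(1, len(candidates)):
        prev = cost[-1]
        row, ptr = [], []
        for label, score in candidates[i]:
            steps = [prev[k] + distance(candidates[i - 1][k][0], label)
                     for k in range(len(prev))]
            k_best = min(range(len(steps)), key=steps.__getitem__)
            row.append(steps[k_best] - score)
            ptr.append(k_best)
        cost.append(row)
        back.append(ptr)
    # backtrack from the cheapest final candidate
    j = min(range(len(cost[-1])), key=cost[-1].__getitem__)
    path = [candidates[-1][j][0]]
    for i in range(len(back) - 1, -1, -1):
        j = back[i][j]
        path.append(candidates[i][j][0])
    return path[::-1]
```

In the paper the vertices are chord/key pairs and the edge weights come from a musical distance; here any labels and any pairwise distance work.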
Abstract:
Background: Protein inference from peptide identifications in shotgun proteomics must deal with ambiguities that arise due to the presence of peptides shared between different proteins, which is common in higher eukaryotes. Recently, data independent acquisition (DIA) approaches have emerged as an alternative to the traditional data dependent acquisition (DDA) in shotgun proteomics experiments. MSE is the term used to name one of the DIA approaches used in QTOF instruments. MSE data require specialized software to process the acquired spectra and to perform peptide and protein identifications. However, the software available at the moment does not group the identified proteins in a transparent way by taking into account peptide evidence categories. Furthermore, the inspection, comparison and reporting of the obtained results require tedious manual intervention. Here we report a software tool that addresses these limitations for MSE data.
Results: We present PAnalyzer, a software tool focused on the protein inference process in shotgun proteomics. Our approach considers all the identified proteins and groups them when necessary, indicating their confidence using different evidence categories. PAnalyzer can read protein identification files in the XML output format of the ProteinLynx Global Server (PLGS) software provided by Waters Corporation for MSE data, and also in the mzIdentML format recently standardized by HUPO-PSI. Multiple files can also be read simultaneously and are treated as technical replicates. Results are saved to CSV, HTML and mzIdentML (in the case of a single mzIdentML input file) files. An MSE analysis of a real sample is presented to compare the results of PAnalyzer and ProteinLynx Global Server.
Conclusions: We present a software tool to deal with the ambiguities that arise in the protein inference process. Key contributions are support for MSE data analysis by ProteinLynx Global Server and the integration of technical replicates. PAnalyzer is an easy-to-use, multiplatform, free software tool.
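A toy illustration of grouping by peptide evidence; the category names and rules below are simplified stand-ins, not PAnalyzer's exact evidence categories:

```python
def group_proteins(evidence):
    """evidence: dict protein -> set of identified peptides.
    Returns a dict protein -> evidence category."""
    categories = {}
    for prot, peps in evidence.items():
        # peptides also claimed by any other protein
        others = set().union(*(p for q, p in evidence.items() if q != prot))
        unique = peps - others
        if unique == peps:
            categories[prot] = "conclusive"        # all its peptides are unique
        elif unique:
            categories[prot] = "discriminated"     # at least one unique peptide
        elif any(p == peps for q, p in evidence.items() if q != prot):
            categories[prot] = "indistinguishable" # same evidence as another protein
        else:
            categories[prot] = "ambiguous"         # only shared peptides
    return categories
```

Proteins tagged "indistinguishable" would be reported as a single group, which is the kind of transparent grouping the abstract argues for.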
Abstract:
This paper analyzes the use of artificial neural networks (ANNs) for predicting the received power/path loss in both outdoor and indoor links. The approach followed combines ANNs with ray tracing, the latter allowing the identification and parameterization of the so-called dominant path. A complete description of the process for creating and training an ANN-based model is presented, with special emphasis on the training process. More specifically, we discuss various techniques to arrive at valid predictions, focusing on an optimum selection of the training set. A quantitative analysis based on results from two narrowband measurement campaigns, one outdoors and the other indoors, is also presented.
Abstract:
There is an increasing number of Ambient Intelligence (AmI) systems that are time-sensitive and resource-aware. From healthcare to building and even home/office automation, it is now common to find systems combining interactive and sensing multimedia traffic with relatively simple sensors and actuators (door locks, presence detectors, RFIDs, HVAC, information panels, etc.). Many of these are today known as Cyber-Physical Systems (CPS). Quite frequently, these systems must be capable of (1) prioritizing different traffic flows (process data, alarms, non-critical data, etc.), (2) synchronizing actions in several distributed devices and, to a certain degree, (3) easing resource management (e.g., detecting faulty nodes, managing battery levels, handling overloads, etc.). This work presents FTT-MA, a high-level middleware architecture aimed at easing the design, deployment and operation of such AmI systems. FTT-MA ensures that both functional and non-functional aspects of the applications are met, even during reconfiguration stages. The paper also proposes a methodology, together with a design tool, to create this kind of system. Finally, a sample case study illustrates the use of the middleware and the methodology proposed in the paper.
Abstract:
We study quantum state tomography, entanglement detection and channel noise reconstruction of propagating quantum microwaves via dual-path methods. The presented schemes make use of the following key elements: propagation channels, beam splitters, linear amplifiers and field quadrature detectors. Remarkably, our methods are tolerant to the ubiquitous noise added to the signals by phase-insensitive microwave amplifiers. Furthermore, we analyse our techniques with numerical examples and experimental data, and compare them with the scheme developed in Eichler et al (2011 Phys. Rev. Lett. 106 220503; 2011 Phys. Rev. Lett. 107 113601), based on a single path. Our methods provide key toolbox components that may pave the way towards quantum microwave teleportation and communication protocols.
Abstract:
142 p.
Abstract:
The purpose of this review article is to illustrate synthetic aspects of functionalized phosphorus derivatives containing an oximo moiety at the beta position. The first section focuses on the synthesis of phosphine oxides, phosphonates or phosphonium salts containing an oxime group. The synthesis of these derivatives comprises carbon–phosphorus single bond construction by reaction of haloximes with phosphorus derivatives, nucleophilic addition of phosphorus reagents to carbonyl compounds, or nucleophilic addition of phosphorus reagents to nitro olefins. This section also covers the most practical routes to the target compounds through carbon–nitrogen double bond formation, namely condensation of carbonyl compounds with hydroxylamine derivatives or addition of hydroxylamines to allenes or alkynes. The preparative use of beta-oximo phosphorus derivatives as synthetic intermediates is discussed in a second section, comprising olefination reactions, oxidation of oximes to nitrile oxides by reaction at the C–N double bond of the oxime moiety, oxidation of these substrates to nitrosoalkenes, reduction to the corresponding hydroxylamines, and some reactions at the hydroxyl group of the hydroxyimino moiety.
Abstract:
167 p.
Abstract:
This work analyzes the evolution of both the concept of Sustainable Development and the Local Agenda 21 tool, verifying the growing concern spreading among citizens, institutions and companies, as well as the irreversibility of most of the results achieved. Within this framework, the specific cases of Local Agenda 21 implementation in Europe, Spain, the Basque Country and Getxo have been studied, the latter being the author's municipality of residence. This study confirms the hypothesis that, although a notable effort is evident in the implementation of Local Agenda 21, there is still a long way to go on the path towards sustainability.
Abstract:
One of the most challenging problems in mobile broadband networks is how to assign the available radio resources among the different mobile users. Traditionally, research proposals are either specific to some type of traffic or deal with computationally intensive algorithms aimed at optimizing the delivery of general-purpose traffic. Consequently, commercial networks do not incorporate these mechanisms due to the limited hardware resources at the mobile edge. Emerging 5G architectures introduce cloud computing principles to add flexible computational resources to Radio Access Networks. This paper uses Mobile Edge Computing concepts to introduce a new element, denoted the Mobile Edge Scheduler, aimed at minimizing the mean delay of general traffic flows in the LTE downlink. This element runs close to the eNodeB and implements a novel flow-aware and channel-aware scheduling policy to accommodate transmissions to the available channel quality of the end users.