14 results for Offline programing
in the Repositório Institucional UNESP - Universidade Estadual Paulista "Julio de Mesquita Filho"
Abstract:
Grinding is a finishing process in machining operations, and the topology of the grinding tool is responsible for producing the desired result on the surface of the machined material. The tool topology is modeled in the dressing process, and precision is therefore extremely important. This study presents a solution for monitoring the dressing process, using a digital signal processor (DSP) operating in real time to detect the optimal dressing moment. To confirm the efficiency of the DSP monitoring, the results were compared with those of a data acquisition (DAQ) system with offline processing. The method employed here consisted of analyzing the acoustic emission and electrical power signals by applying the DPO and DPKS parameters. The analysis of the results allowed us to conclude that the application of the DPO and DPKS parameters can be replaced by processing of the mean acoustic emission signal, thus reducing the computational effort.
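As a rough illustration of the abstract's conclusion, the mean acoustic emission (AE) signal can be averaged per dressing pass and the optimal dressing moment flagged once it stabilises. This is a minimal sketch only: the pass segmentation, tolerance value, and function names are illustrative assumptions, not the authors' actual DPO/DPKS implementation.

```python
import numpy as np

def mean_ae_per_pass(ae_signal, samples_per_pass):
    """Average the raw acoustic-emission signal over each dressing pass."""
    n_passes = len(ae_signal) // samples_per_pass
    trimmed = ae_signal[: n_passes * samples_per_pass]
    return trimmed.reshape(n_passes, samples_per_pass).mean(axis=1)

def optimal_dressing_pass(mean_ae, tolerance=0.05):
    """Flag the first pass whose mean AE differs from the previous pass by
    less than `tolerance` (relative), i.e. the signal has stabilised."""
    for i in range(1, len(mean_ae)):
        if abs(mean_ae[i] - mean_ae[i - 1]) <= tolerance * mean_ae[i - 1]:
            return i
    return None
```

In this simplified view, a stabilised mean AE level stands in for the richer DPO/DPKS statistics, mirroring the reduction in computational effort the abstract reports.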
Abstract:
The authors present an offline switching power supply with multiple isolated outputs and unity power factor using only one power processing stage, based on the DC-DC SEPIC (single-ended primary inductance converter) modulated by variable-hysteresis current control. The principle of operation, the theoretical analysis, the design procedure, an example, and simulation results are presented. A laboratory prototype, rated at 160 W, operating at a maximum switching frequency of 100 kHz, with isolated outputs rated at +5 V/15 A, -5 V/1 A, +12 V/6 A, and -12 V/1 A, has been built, yielding an input power factor near unity.
Abstract:
Individual data on wood basic density and volume, pulp kappa number, soluble lignin, pulping process cost, and gross gravimetric pulping yield were obtained from 64 trees of Eucalyptus grandis W. Hill ex Maiden from a commercial population at Lençóis Paulista, SP. The Eucalyptus grandis seeds originally came from a Seed Production Area (SPA) of Duratex S/A at Botucatu, SP. The data were quantified with the objective of maximizing unbleached pulp production under restrictions on wood volume and mass, residual and soluble lignin mass, planted area, and pulping process cost. The study also aimed to develop a selection method for matrix trees through mathematical programming techniques. The resulting strategy maximized the economic result, selected the matrix trees, and respected all of the technological and organizational productivity limits imposed by the company, while targeting unbleached pulp production within the planned time.
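Selecting matrix trees under resource restrictions is, in essence, a constrained combinatorial optimization. The sketch below illustrates the idea with a brute-force 0/1 selection over a handful of hypothetical candidate trees and a single volume cap; the data, the single constraint, and the function name are illustrative assumptions, not the paper's actual mathematical programming model.

```python
from itertools import combinations

def select_matrix_trees(trees, max_volume):
    """Exhaustively pick the subset of candidate matrix trees that maximises
    total economic return while keeping total wood volume within the cap.
    `trees` is a list of (name, volume_m3, economic_return) tuples."""
    best_value, best_subset = 0.0, ()
    for r in range(1, len(trees) + 1):
        for subset in combinations(trees, r):
            volume = sum(t[1] for t in subset)
            value = sum(t[2] for t in subset)
            if volume <= max_volume and value > best_value:
                best_value, best_subset = value, subset
    return best_value, [t[0] for t in best_subset]
```

Exhaustive search is only viable for small candidate pools; a production model with dozens of constraints, as described in the abstract, would use an integer-programming solver instead.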
Abstract:
The CMS High-Level Trigger (HLT) is responsible for ensuring that data samples with potentially interesting events are recorded with high efficiency and good quality. This paper gives an overview of the HLT and focuses on its commissioning using cosmic rays. The selection of triggers that were deployed is presented and the online grouping of triggered events into streams and primary datasets is discussed. Tools for online and offline data quality monitoring for the HLT are described, and the operational performance of the muon HLT algorithms is reviewed. The average time taken for the HLT selection and its dependence on detector and operating conditions are presented. The HLT performed reliably and helped provide a large dataset. This dataset has proven to be invaluable for understanding the performance of the trigger and the CMS experiment as a whole. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
Commissioning studies of the CMS hadron calorimeter have identified sporadic uncharacteristic noise and a small number of malfunctioning calorimeter channels. Algorithms have been developed to identify and address these problems in the data. The methods have been tested on cosmic ray muon data, calorimeter noise data, and single beam data collected with CMS in 2008. The noise rejection algorithms can be applied to LHC collision data at the trigger level or in the offline analysis. The application of the algorithms at the trigger level is shown to remove 90% of noise events with fake missing transverse energy above 100 GeV, which is sufficient for the CMS physics trigger operation. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
In this paper we shed light on the problem of efficiency and effectiveness of image classification in large datasets. As the amount of data to be processed and classified has increased in recent years, faster and more precise pattern recognition algorithms are needed to perform online and offline training and classification procedures. We deal here with the problem of fast classification of moist areas in radar images. Experimental results using the Optimum-Path Forest classifier and its training set pruning algorithm are also provided and discussed. © 2011 IEEE.
Abstract:
Objective: To investigate the influence of the convergence angle of tooth preparation on the fracture load of a Y-TZP-based ceramic (YZ-Vita YZ) substructure (SB) veneered with a feldspathic porcelain (VM9-Vita VM9). Methods: Finite element stress analysis (FEA) was performed to examine the stress distribution of the system. Eighty YZ SB were fabricated using a CAD-CAM system and divided into four groups (n = 20), according to the total occlusal convergence (TOC) angle: G6-6° TOC; G12-12° TOC; G20-20° TOC; and G20MOD-20° TOC with modified SB. All SB were veneered with VM9, cemented on a fiber-reinforced epoxy resin die, and loaded to failure. Half of the specimens from each group (n = 10) were cyclically fatigued (10⁶ cycles) before testing. Failure analysis was performed to determine the fracture origin. Data were statistically analyzed using ANOVA and Tukey's tests (α = 0.05). Results: The greatest mean load-to-fracture value was found for G20MOD, as predicted by the FEA. Cyclic fatigue did not significantly affect the load to fracture. Catastrophic failure originating from the internal occlusal surface of the SB was the predominant failure mode, except for G20MOD. Significance: The YZ-VM9 restorations resisted compression loads greater than the usual physiological occlusal load, regardless of the TOC angle of the preparations. Yet, the G20MOD design produced the best performance among the experimental conditions evaluated. © 2013 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Introduction: Organizations are increasingly using electronic/digital spaces (Internet/intranet/extranet) as a way to efficiently manage information and knowledge in the organizational environment. The management of informational inputs and intellectual assets ranges from the strategic level to the operational level; the results demonstrate the strength of the socialization of organizational strategies. Objective: To reflect on the role of information architecture in the development of electronic/digital spaces in organizational environments. Methodology: Analytical study supported by the specialized literature, based on three aspects emphasized by Morville and Rosenfeld (2006) as applied to information architecture (context, content, and user studies), in addition to the search and use of information in Choo (2006), which also highlights three aspects: situational dimensions, cognitive needs, and emotional reactions. Results: In the Web environment, organizations maintain a large number of sites for brands/products that, for the most part, share no organizational structure or navigation. The results show that when one department needs to contact another, it must do so offline. Conclusion: Information architecture has become essential for the development of management information systems that make it possible to easily find and access data and information, and it helps in developing distinct hierarchies to structure the distribution of content, promoting quality and effectiveness in the management systems.
Structural analysis of space trusses in Excel using the finite element method
Abstract:
This paper develops a program for the structural analysis of space trusses. The program was based on the concepts of the finite element method and implemented with the Visual Basic for Applications (VBA) programming resources of the Excel® software. Since Excel® is inexpensive, widely available, capable of matrix calculations, and equipped with the advanced resources of VBA programming, it is possible to develop an economical, efficient, and precise solution for the structural analysis of space trusses. First, the finite element method and the space truss element are presented. Then, the main algorithms used in the development of the program are derived, together with the VBA resources employed. Finally, to validate the quality, efficiency, and precision of the results, they are compared with those of the established commercial software Ansys.
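The core of such a program is the assembly and solution of the global stiffness system. A minimal sketch of that step is shown below, in Python with NumPy rather than VBA for brevity; the function names, element properties, and boundary-condition handling are illustrative, not the paper's actual implementation.

```python
import numpy as np

def truss_stiffness(nodes, elements, E, A):
    """Assemble the global stiffness matrix of a 3D truss (3 DOF per node).
    `nodes` is an (n, 3) coordinate array; `elements` lists (i, j) node pairs."""
    n_dof = 3 * len(nodes)
    K = np.zeros((n_dof, n_dof))
    for i, j in elements:
        d = nodes[j] - nodes[i]
        L = np.linalg.norm(d)
        c = d / L                                  # direction cosines of the bar
        k_local = (E * A / L) * np.outer(c, c)     # axial stiffness projected to 3D
        dofs = [3 * i, 3 * i + 1, 3 * i + 2, 3 * j, 3 * j + 1, 3 * j + 2]
        block = np.block([[k_local, -k_local], [-k_local, k_local]])
        K[np.ix_(dofs, dofs)] += block
    return K

def solve_truss(K, loads, fixed_dofs):
    """Solve K u = f for nodal displacements after eliminating fixed DOFs."""
    free = [d for d in range(K.shape[0]) if d not in fixed_dofs]
    u = np.zeros(K.shape[0])
    u[free] = np.linalg.solve(K[np.ix_(free, free)], loads[free])
    return u
```

For a single bar of length L loaded axially by F, the free-end displacement reduces to the textbook FL/(EA), which is a convenient sanity check before comparing against a commercial solver such as Ansys.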
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Corporate communication and public relations: building relationships in the context of Facebook
Abstract:
In the current context of the Internet, especially on online social networks, people and companies are establishing relationships both online and offline. The new digital culture and the competitiveness of the market are making companies change the way they communicate with their customers, providing broader and more participative communication with their stakeholders. This research aims to understand the relation between corporate communication and Public Relations and, from the conceptualization of these fields, to identify the relationship processes between companies and their publics on online social networks. Understanding this type of relationship matters because it is the field of the Public Relations professional, an area that grows in importance in today's market, since Facebook is being used as a channel of communication and relationship with consumers. The methodology is exploratory research, which aims to build familiarity with the subject and to analyze three examples of relationships and engagement in the corporate communications field: the posts of the top three companies, Coca-Cola, Guaraná Antarctica, and McDonald's, in the Top Facebook Posts Brazil ranking of January 2015. It is important to have a Public Relations professional able to create an open channel of communication on social networks in order to detect the characteristics of the target public and to promote its participation in the building of content and in the organization's innovation process, thereby producing an engaging and lasting relationship.