990 results for technical applications
Abstract:
In this paper we describe the development of a low-cost, high-accuracy Galileo code receiver, user application software and positioning algorithms for land management applications, implemented using a dedicated FPGA board and a dual-frequency Galileo E5/L1 radio-frequency front-end. The current situation of rural property surveying in Brazil is described, and the use of code measurements from the new Galileo signals, E5 AltBOC combined with E1 MBOC, for land management applications is explored. We explain how such an approach is expected to deliver an absolute positioning solution that could bridge the gap between high-cost, high-complexity, high-accuracy receivers based on carrier phase and lower-cost, lower-accuracy receivers based on pseudorange observables. The system is presented together with a detailed description of its main components: the code receiver and the application software. The work presented is part of an ongoing European-Brazilian consortium effort to explore the use of the new Galileo signals for land management applications in Brazil, sponsored by the GNSS Supervisory Authority (GSA).
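As a hedged illustration of the kind of single-epoch, code-based solution such a receiver would compute, the sketch below implements the standard iterated least-squares position fix from pseudoranges; the function name, inputs and convergence threshold are our assumptions, not details from the paper.

```python
import numpy as np

def pseudorange_fix(sat_pos, pr, x0=np.zeros(3), iters=10):
    """Single-epoch iterated least-squares fix from code pseudoranges.

    sat_pos : (n, 3) satellite ECEF positions [m]
    pr      : (n,)   measured pseudoranges [m]
    Returns the estimated receiver ECEF position [m] and clock bias [m].
    """
    x = np.append(x0.astype(float), 0.0)              # state: [x, y, z, c*dt]
    for _ in range(iters):
        rho = np.linalg.norm(sat_pos - x[:3], axis=1)  # geometric ranges
        resid = pr - (rho + x[3])                      # prefit residuals
        H = np.hstack([-(sat_pos - x[:3]) / rho[:, None],
                       np.ones((len(pr), 1))])         # linearized design matrix
        dx, *_ = np.linalg.lstsq(H, resid, rcond=None)
        x += dx
        if np.linalg.norm(dx[:3]) < 1e-4:              # sub-mm update: stop
            break
    return x[:3], x[3]
```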
Abstract:
This work aims to implement an intelligent computational tool to identify non-technical losses and to select their most relevant features, using information from a database of industrial consumer profiles belonging to a power company. The solution to this problem is neither trivial nor of merely regional character: minimizing non-technical losses helps guarantee investments in product quality and in the maintenance of power systems, in the competitive environment introduced after the privatization period on the national scene. This work applies the WEKA software to the proposed objective, comparing various classification techniques and optimizing them through intelligent algorithms; in this way it becomes possible to automate such applications on Smart Grids. © 2012 IEEE.
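As a minimal sketch of this kind of classifier comparison, the snippet below uses scikit-learn as a stand-in for WEKA; the synthetic consumer-profile features, labels and choice of classifiers are our assumptions, not the paper's actual setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for industrial consumer profiles:
# X holds per-consumer features (e.g. demand, load factor),
# y marks irregular consumers (1 = suspected non-technical loss).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 500) > 0).astype(int)

classifiers = {
    "decision tree": DecisionTreeClassifier(max_depth=5),
    "k-NN": KNeighborsClassifier(n_neighbors=7),
    "MLP": MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000),
}
for name, clf in classifiers.items():
    acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5)
    print(f"{name}: accuracy {acc.mean():.3f} +/- {acc.std():.3f}")
```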
Abstract:
Non-technical loss is not a problem with a trivial solution, nor one of merely regional character, and its minimization helps guarantee investments in product quality and in the maintenance of power systems, in the competitive environment introduced after the privatization period on the national scene. In this paper, we show how to improve the training phase of a neural-network-based classifier using a recently proposed meta-heuristic called Charged System Search, which is based on the interactions between electrically charged particles. The experiments were carried out in the context of non-technical losses in power distribution systems, on a dataset obtained from a Brazilian electrical power company, and demonstrated the robustness of the proposed technique against several other nature-inspired optimization techniques for training neural networks. It is thus possible to improve some applications on Smart Grids. © 2013 IEEE.
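To make the meta-heuristic concrete, here is a deliberately simplified, hedged sketch of Charged System Search applied to minimizing a generic loss (for example, the training loss of a small neural network flattened into a weight vector); the particle count, charge law and velocity update are simplified relative to the published method.

```python
import numpy as np

def css_minimize(loss, dim, n=20, iters=200, a=1.0, seed=0):
    """Very simplified Charged System Search: candidate solutions act as
    charged particles; fitter candidates carry larger charge and attract
    the rest with Coulomb-like forces."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1.0, 1.0, (n, dim))
    vel = np.zeros((n, dim))
    for _ in range(iters):
        fit = np.array([loss(p) for p in pos])
        q = (fit.max() - fit) / (fit.max() - fit.min() + 1e-12)  # charges in [0, 1]
        force = np.zeros_like(pos)
        for j in range(n):
            d = pos - pos[j]
            r = np.linalg.norm(d, axis=1)
            mag = np.where(r < a, q * r / a**3,            # inside charge sphere
                           q / np.maximum(r, 1e-12)**2)    # outside: inverse square
            mag[j] = 0.0                                   # no self-interaction
            force[j] = (mag[:, None] * d / np.maximum(r, 1e-12)[:, None]).sum(0)
        vel = 0.5 * rng.random((n, 1)) * vel + 0.5 * rng.random((n, 1)) * force
        pos = pos + vel
    return pos[np.argmin([loss(p) for p in pos])]

# Toy usage: minimize a quadratic bowl in 5 dimensions.
print(css_minimize(lambda w: float(np.sum(w**2)), dim=5))
```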
Abstract:
The objective of this work was to evaluate the effect of acaricide applications and of pruning symptomatic branches on citrus leprosis management in Brazil. The trial was conducted in an orange plantation of the 'Pera' variety, grafted onto 'Cleopatra' tangerine, over two seasons (2006-2007 and 2007-2008). The experimental design was randomized blocks in a factorial scheme with the following factors: (A) acaricide, at three levels: spirodiclofen and cyhexatin applied in rotation; lime sulphur; no acaricide; (B) pruning to remove branches showing leprosis symptoms, at two levels: with and without pruning. We carried out periodic assessments of populations of Brevipalpus phoenicis (Geijskes), the vector of the leprosis virus, of leprosis incidence and severity, of fruit yield, and of the economic feasibility of the applied strategies. Based on the results, we concluded that spirodiclofen and cyhexatin were more effective than lime sulphur in controlling B. phoenicis. Control with lime sulphur required more applications than spirodiclofen and cyhexatin in rotation, making it more expensive. Pruning of symptomatic branches used in isolation was not sufficiently effective to control leprosis and significantly increased control costs. Profits were higher when control involved sprayings of spirodiclofen and cyhexatin in alternation, with or without pruning.
Abstract:
Preparation of Coated Microtools for Electrochemical Machining Applications. Ajaya K. Swain, M.S., University of Nebraska, 2010. Advisor: K. P. Rajurkar.
Coated tools have improved the performance of both traditional and nontraditional machining processes, yielding higher material removal, better surface finish, and increased wear resistance. However, the performance of coated tools in micromachining has not yet been adequately studied. One possible reason is the difficulty of preparing coated microtools. Besides the technical requirements, economic and environmental aspects of the material and of the coating technique also play a significant role in coating microtools; this, in fact, restricts the range of coating materials and the type of coating process. Handling is another major issue for microtools, purely because of their miniature size. This research focuses on the preparation of coated microtools for pulse electrochemical machining (ECM) by electrodeposition. It is motivated by the fact that, although improved machining has been reported with insulating coatings on ECM tools, particularly in ECM drilling operations, little literature addresses the use of metallic coating materials in other ECM process types. An ideal ECM tool should be a good thermal and electrical conductor, corrosion resistant, electrochemically stable, and stiff enough to withstand the electrolyte pressure. Tungsten has almost all the properties desired in an ECM tool material except electrochemical stability: it can be oxidized during machining, resulting in poor machining quality. The electrochemical stability of a tungsten ECM tool can be improved by electroplating it with nickel, which has superior electrochemical resistance. Moreover, a tungsten tool can be coated in situ, reducing tool handling and breakage frequency. The tungsten microtool was electroplated with nickel using direct and pulse current. The effect of the various input parameters on the coating characteristics was studied, and the performance of the coated microtool was evaluated in pulse ECM. The coated tool removed about 28% more material than the uncoated tool under similar conditions and was more electrochemically stable. It was concluded that a nickel-coated tungsten microtool can improve pulse ECM performance.
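As a back-of-envelope companion to the electrodeposition step, the sketch below estimates nickel coating thickness from Faraday's law of electrolysis; the plating current, time, area and current efficiency are hypothetical values of ours, not parameters from the thesis.

```python
F, M_NI, Z, RHO_NI = 96485.0, 58.69, 2, 8.908   # C/mol, g/mol, electrons, g/cm^3

def ni_thickness_um(current_a, seconds, area_cm2, efficiency=0.95):
    """Nickel coating thickness [um] deposited on a microtool,
    via Faraday's law: mass = eta * I * t * M / (z * F)."""
    mass_g = efficiency * current_a * seconds * M_NI / (Z * F)
    return mass_g / (RHO_NI * area_cm2) * 1e4    # cm -> um

# Hypothetical microtool plating run: 5 mA for 10 min over 0.05 cm^2.
print(f"{ni_thickness_um(0.005, 600, 0.05):.1f} um of Ni")
```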
Abstract:
Technical evaluation of analytical data is extremely relevant, since the data are used for comparison with environmental quality standards and for decision-making related to the management and disposal of dredged sediments and to the evaluation of salt and brackish water quality under CONAMA Resolution 357/05. It is therefore essential that the project manager discuss the environmental agency's technical requirements with the contracted laboratory, both to follow up the analyses under way and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below the environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) a chain of custody should be provided to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis should be undertaken; and (6) chromatograms should be included in the analytical report. Within this context, and with a view to helping environmental managers evaluate analytical reports, this work discusses the limitations of applying SW-846 US EPA methods to marine samples and the consequences of reporting data based on method detection limits (MDLs) rather than sample quantitation limits (SQLs), and presents possible modifications of the principal method applied by laboratories in order to comply with environmental quality standards.
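To make requirement (1) and the MDL/QL distinction concrete, here is a hedged sketch of the usual EPA-style MDL computation from replicate low-level spikes (Student-t at 99% confidence times the replicate standard deviation); the spike values, the QL multiplier and the quality standard below are hypothetical.

```python
import numpy as np
from scipy import stats

def method_detection_limit(replicates):
    """EPA-style MDL: t(99%, n-1 df) times the replicate standard deviation."""
    reps = np.asarray(replicates, dtype=float)
    return stats.t.ppf(0.99, df=len(reps) - 1) * reps.std(ddof=1)

spikes = [0.52, 0.48, 0.61, 0.55, 0.49, 0.58, 0.50]   # hypothetical, ug/L
mdl = method_detection_limit(spikes)
ql = 3.18 * mdl        # one common convention for a quantitation limit
standard = 1.0         # hypothetical environmental standard, ug/L
print(f"MDL={mdl:.3f} ug/L, QL={ql:.3f} ug/L, QL below standard: {ql < standard}")
```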
Abstract:
In my PhD thesis I propose a Bayesian nonparametric estimation method for structural econometric models in which the functional parameter of interest describes the economic agent's behavior. The structural parameter is characterized as the solution of a functional equation or, in more technical words, as the solution of an inverse problem that can be either ill-posed or well-posed. From a Bayesian point of view, the parameter of interest is a random function and the solution to the inference problem is the posterior distribution of this parameter. A regular version of the posterior distribution on functional spaces is characterized. However, the infinite dimension of the spaces considered causes a problem of non-continuity of the solution and hence, from a frequentist point of view, a problem of inconsistency of the posterior distribution (i.e. ill-posedness). The contribution of this essay is to propose new methods to deal with this ill-posedness. The first consists in adopting a Tikhonov regularization scheme in the construction of the posterior distribution, yielding a new object that I call the regularized posterior distribution and that I take as the solution of the inverse problem. The second approach consists in specifying a prior distribution of the g-prior type on the parameter of interest; I then identify a class of models for which this prior distribution is able to correct the ill-posedness even in infinite-dimensional problems. I study the asymptotic properties of these proposed solutions and prove that, under regularity conditions satisfied by the true value of the parameter of interest, they are consistent in a "frequentist" sense. Having set out the general theory, I apply this Bayesian nonparametric methodology to different estimation problems. First, I apply the estimator to deconvolution and to hazard rate, density and regression estimation. I then consider the estimation of an instrumental regression, which is useful in micro-econometrics when dealing with endogeneity. Finally, I develop an application in finance: I obtain the Bayesian estimator of the equilibrium asset pricing functional by using the Euler equation defined in Lucas' (1978) tree-type models.
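As a schematic rendering of the first approach (notation ours, not the thesis'), Tikhonov regularization replaces the unbounded inverse in the functional equation r = K phi by a damped one, and the regularized posterior applies the same damping inside the posterior mean:

```latex
% Classical Tikhonov scheme for r = K\varphi:
\hat\varphi_\alpha
  = \arg\min_{\varphi}\ \|r - K\varphi\|^2 + \alpha\|\varphi\|^2
  = (\alpha I + K^{*}K)^{-1}K^{*}\,r, \qquad \alpha \downarrow 0.

% Regularized posterior mean (schematic): with Gaussian prior mean
% \varphi_0, prior covariance \Omega_0 and noise covariance \Sigma,
% the ill-conditioned operator inversion is damped by \alpha I:
\mathbb{E}_{\alpha}[\varphi \mid r]
  = \varphi_0 + \Omega_0 K^{*}
    \big(\alpha I + K\Omega_0 K^{*} + \Sigma\big)^{-1}(r - K\varphi_0).
```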
Abstract:
A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work we gathered the industrial needs relevant to ESA and to the main European space stakeholders, and we consolidated a set of high-level technical requirements for their fulfillment. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for fulfilling the high-level requirements. The software reference architecture we set out to build rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms with the semantics, the assumptions and the constraints of the computational model; and (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility and fitness of constituents (ii), (iii) and (iv) had already been proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) rigorous separation of concerns, achieved through support for design views and by careful allocation of concerns to dedicated software entities; (ii) support for the specification and model-based analysis of extra-functional properties; and (iii) the inclusion of space-specific concerns.
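Purely to illustrate the separation of concerns the component model aims at, here is a hypothetical, much-simplified sketch in which a component declares its functional interfaces and its extra-functional properties in separate, individually analyzable slots; none of the names below come from the actual ESA architecture.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Interface:
    """A named set of operations a component provides or requires."""
    name: str
    operations: tuple[str, ...]

@dataclass
class Component:
    """Design-level unit: the functional view (provides/requires) is kept
    apart from extra-functional annotations consumed by static analysis."""
    name: str
    provides: list[Interface] = field(default_factory=list)
    requires: list[Interface] = field(default_factory=list)
    extra_functional: dict[str, float] = field(default_factory=dict)

# Hypothetical on-board component with a period and a WCET budget that a
# schedulability analysis (the computational-model side) could consume.
telemetry = Component(
    name="TelemetryManager",
    provides=[Interface("TmFrame", ("send_frame",))],
    requires=[Interface("Clock", ("now",))],
    extra_functional={"period_ms": 250.0, "wcet_budget_ms": 3.0},
)
print(telemetry)
```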
Abstract:
The thesis presents a probabilistic approach to the theory of semigroups of operators, with particular attention to Markov and Feller semigroups. The first goal of this work is the proof of the fundamental Feynman-Kac formula, which gives the solution of certain parabolic Cauchy problems in terms of the expected value of the initial condition evaluated along the associated stochastic diffusion process. The second goal is the characterization of the principal eigenvalue of the generator of a semigroup with Markov transition probability function, and of second-order elliptic operators with real coefficients that are not necessarily self-adjoint. The thesis is divided into three chapters. In the first chapter we study Brownian motion and some of its main properties, stochastic processes, the stochastic integral and the Itô formula, in order to arrive, in the last section, at the proof of the Feynman-Kac formula. The second chapter is devoted to the probabilistic approach to semigroup theory, and it is here that we introduce Markov and Feller semigroups. Special emphasis is given to the Feller semigroup associated with Brownian motion. The third and last chapter is divided into two sections. In the first we present the abstract characterization of the principal eigenvalue of the infinitesimal generator of a semigroup of operators acting on continuous functions over a compact metric space. In the second section this approach is used to study the principal eigenvalue of elliptic partial differential operators with real coefficients. Finally, in the appendix, we gather some of the technical results used in the thesis in more detail. Appendix A is devoted to the Sion minimax theorem, while in Appendix B we prove the Chernoff product formula for not necessarily self-adjoint operators.
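For orientation, one standard statement of the formula proved in the first chapter (written here in our notation, for a diffusion with generator L and a potential V) is:

```latex
% Parabolic Cauchy problem with potential V and initial datum f:
\partial_t u(t,x) \;=\; L\,u(t,x) - V(x)\,u(t,x), \qquad u(0,x) = f(x).

% Feynman-Kac: with X the diffusion generated by
% L = \tfrac{1}{2}\sum_{i,j}(\sigma\sigma^{\top})_{ij}\,\partial_{ij}
%     + \sum_i b_i\,\partial_i, started at x, the solution is
u(t,x) \;=\; \mathbb{E}_x\!\Big[\exp\Big(-\int_0^t V(X_s)\,ds\Big)\, f(X_t)\Big].
```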
Abstract:
Extracranial application of diffusion-weighted magnetic resonance imaging (MRI) has gained increasing importance in recent years. As a result of technical advances, this non-invasive functional technique is now also applied in head and neck radiology for several clinical indications. In cancer imaging, diffusion-weighted MRI can be performed for tumour detection and characterization, for monitoring of treatment response, and for differentiating recurrence from post-therapeutic changes after radiotherapy. Promising results have recently been reported even for lymph node staging. This review article provides an overview of potential applications of diffusion-weighted MRI in the head and neck, with the main focus on its applications in oncology.
Abstract:
Diffusion-weighted MRI (DW-MRI) has become more and more popular over the last couple of years. It is already an accepted diagnostic tool for patients with acute stroke, but is more difficult to use for extracranial applications owing to technical challenges mostly related to motion sensitivity and susceptibility variations (e.g., respiration and air-tissue boundaries). Thanks to newer technical developments, however, applications of body DW-MRI are starting to emerge. In this review we aim to provide an overview of the current status of the published data on DW-MRI in extracranial applications. A short introduction to the physical background of this promising technique is provided, followed by the current status, subdivided into three main topics: functional evaluation, tissue characterization and therapy monitoring.
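For readers new to the physical background, the quantitative core of DW-MRI is the standard monoexponential signal model (a general formula, not specific to this review): the signal at diffusion weighting b decays with the apparent diffusion coefficient (ADC), which can be recovered from two b-values:

```latex
S(b) \;=\; S_0\, e^{-b\,\mathrm{ADC}}
\qquad\Longrightarrow\qquad
\mathrm{ADC} \;=\; \frac{1}{b_2 - b_1}\,\ln\!\frac{S(b_1)}{S(b_2)}.
```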
Abstract:
Proteomics is fulfilling its potential and beginning to impact the diagnosis and therapy of cardiovascular disease. As de novo proteomic analysis becomes more streamlined and robust high-throughput methods are developed, more and more attention is being directed toward cardiovascular serum and plasma biomarker discovery. To take cardiovascular proteomics from bench to bedside, great care must be taken to achieve reproducible results. Despite technical advances, however, the absolute number of clinical biomarkers discovered so far by a proteomics approach is small. Although several factors contribute to this shortfall, one remedy is to build "translation teams" involving close collaboration between researchers and clinicians.
Abstract:
Two technical solutions for dual-energy subtraction, using either a single or a dual shot, offer different advantages and disadvantages. Their principles are explained, and the main clinical applications are demonstrated together with results. Eliminating overlying bone and proving or excluding calcification are the primary aims of energy-subtraction chest radiography, which offers unique information in various clinical situations.
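As a hedged illustration of the underlying principle, the sketch below performs the standard weighted log subtraction of a low-/high-kVp image pair; the pixel values and the weight are invented for the example, and real systems add scatter correction and noise suppression on top of this.

```python
import numpy as np

def dual_energy_subtract(low_kvp, high_kvp, w):
    """Weighted log subtraction of a low-/high-kVp image pair.
    Tuning w to cancel bone yields a soft-tissue image; tuning it to
    cancel soft tissue yields a bone (calcification) image."""
    eps = 1e-6                                   # guard against log(0)
    return np.log(high_kvp + eps) - w * np.log(low_kvp + eps)

# Hypothetical two-pixel example: transmission behind bone vs. soft tissue.
low = np.array([0.30, 0.70])
high = np.array([0.55, 0.80])
print(dual_energy_subtract(low, high, w=0.5))
```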
Abstract:
A growing number of articles in popular magazines and journals is bringing the direct fabrication of parts and figures to the attention of a broad public. Unfortunately, these rarely convey a reasonably complete picture of how, and in which areas of life, these techniques will change our everyday lives. One reason is that most articles are highly technical in character and rely only on isolated examples. This contribution instead starts from human needs, as structured for example in Maslow's hierarchy of needs, and thereby underlines that 3D printing (or additive manufacturing, or rapid prototyping) has already reached all areas of life and is about to revolutionize many of them.
Abstract:
Sport-motor tests play an important role in football talent selection. However, single tests capture only parts of complex game performance; the best game performance therefore does not necessarily go hand in hand with the best results in all tests of a test battery. To account appropriately for the complexity of game performance, a holistic perspective together with a person-oriented approach is applied, whereby systems consisting of several variables are identified and analysed in a longitudinal study. Following this idea, six sport-motor tests were aggregated into a subsystem. 106 young male elite football players were tested three times (2011, 2012, 2013; mean age at t2011 = 12.26 years, SD = 0.29). One year later (2014) their performance level was recorded. Data were analysed using the LICUR method, a cluster-analytic pattern approach. Four patterns were identified, which remained stable across all measuring points; players frequently showed intraindividually and structurally similar patterns over time. At the third measuring point, a pattern emerged from which players were significantly more likely to advance to the highest performance level one year later. This pattern is consistently above average, but does not always show the best test performances. The significantly frequent development along structurally stable patterns suggests predictive validity of the sport-motor-test subsystem between the ages of 12 and 15. Above-average, but not necessarily outstanding, performances in both the motor abilities and the football-specific tests appear to be particularly promising. This finding emphasizes the need for a holistic perspective in talent selection.
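As a rough, hedged sketch of the pattern-finding step, the snippet below standardizes six test scores and clusters the player profiles with k-means, used here only as a stand-in for LICUR (which additionally tracks pattern membership across measuring points); the data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in: 106 players x 6 sport-motor tests at one time point.
rng = np.random.default_rng(1)
scores = rng.normal(size=(106, 6))

z = StandardScaler().fit_transform(scores)        # put all tests on one scale
km = KMeans(n_clusters=4, n_init=10, random_state=1).fit(z)
for k in range(4):
    members = (km.labels_ == k).sum()
    profile = np.round(km.cluster_centers_[k], 2)
    print(f"pattern {k}: {members} players, mean z-profile {profile}")
```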