978 results for Must -- Analysis


Relevance:

30.00%

Publisher:

Abstract:

Direct torque control (DTC) is a new control method for rotating-field electrical machines. DTC controls the motor stator flux linkage directly with the stator voltage, and no stator current controllers are used. Very good torque dynamics can be achieved with the DTC method. Until now, DTC has been applied to asynchronous motor drives. The purpose of this work is to analyse the applicability of DTC to electrically excited synchronous motor drives. Compared with asynchronous motor drives, electrically excited synchronous motor drives require an additional control for the rotor field current. This field current control is called excitation control in this study. The dependence of the static and dynamic performance of DTC synchronous motor drives on the excitation control has been analysed, and a straightforward excitation control method has been developed and tested. In the field-weakening range the stator flux linkage modulus must be reduced in order to keep the electromotive force of the synchronous motor smaller than the stator voltage and to maintain a sufficient voltage reserve. The dynamic performance of the DTC synchronous motor drive depends on the stator flux linkage modulus. Another important factor for the dynamic performance in the field-weakening range is the excitation control. The field-weakening analysis considers both dependencies. A modified excitation control method, which maximises the dynamic performance in the field-weakening range, has been developed. In synchronous motor drives the load angle must be kept within a stable working area in order to avoid loss of synchronism. Traditional vector control methods allow the load angle of the synchronous motor to be adjusted directly through stator current control. In the DTC synchronous motor drive the load angle is not a directly controllable variable; it forms freely according to the motor's electromagnetic state and load. The load angle can be limited indirectly by limiting the torque reference. This method is, however, parameter-sensitive and requires a safety margin between the theoretical torque maximum and the actual torque limit. The DTC modulation principle, however, allows direct load angle adjustment without any current control. In this work a direct load angle control method has been developed. The method keeps the drive stable and allows maximal utilisation of the drive without a safety margin in the torque limitation.
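
As an illustration of the DTC modulation principle described above, the sketch below implements the classic hysteresis-band voltage-vector selection (a Takahashi-style switching table) in Python. It is a generic textbook example under assumed comparator conventions, not the controller developed in the thesis, which additionally includes excitation and load angle control.

```python
import numpy as np

# Active inverter voltage vectors V1..V6 (normalized alpha-beta components);
# the zero vectors V0/V7 are represented as (0, 0).
ACTIVE_VECTORS = {k: (np.cos((k - 1) * np.pi / 3), np.sin((k - 1) * np.pi / 3))
                  for k in range(1, 7)}

def flux_sector(psi_alpha, psi_beta):
    """Return the stator flux sector 1..6 (60 degrees each)."""
    angle = np.arctan2(psi_beta, psi_alpha) % (2 * np.pi)
    return int(angle // (np.pi / 3)) + 1

def select_vector(d_flux, d_torque, sector):
    """Classic DTC switching table.

    d_flux:   1 -> increase flux modulus, 0 -> decrease
    d_torque: 1 -> increase torque, 0 -> hold (zero vector), -1 -> decrease
    """
    if d_torque == 0:
        return (0.0, 0.0)                  # zero vector V0/V7
    if d_flux == 1:
        step = 1 if d_torque == 1 else -1  # V(k+1) or V(k-1)
    else:
        step = 2 if d_torque == 1 else -2  # V(k+2) or V(k-2)
    k = (sector - 1 + step) % 6 + 1
    return ACTIVE_VECTORS[k]

# Example: flux in sector 1, both flux and torque need to increase -> V2
print(select_vector(1, 1, flux_sector(1.0, 0.1)))
```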

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT Climate change, the quest for sustainability and strong environmental pressure for alternatives to traditional fossil fuels have increased interest in the search for and use of renewable energy sources. Among them, charcoal biomass from renewable forests stands out; it is widely used as a thermal reductant in the steel industry in place of mineral coal coke. This study aimed to compare different operating procedures for the proximate (immediate) chemical analysis of charcoal. Seven proximate analysis procedures were compared, including procedures performed by Brazilian companies and laboratories, the test described in NBR 8112, and one carried out with a thermogravimetric analyzer (TGA) using the NBR 8112 parameters. There were significant differences in the volatile matter contents and, consequently, in the fixed carbon contents found. The differences between the procedures and NBR 8112 were caused by excessive burning time, a sample mass above or below the standard, or an inappropriate container used for burning. It was observed that the TGA determination of the volatile matter content must be carried out with a burning time of 2 minutes to obtain results similar to those of the NBR 8112 standard. Moreover, the ash content values were statistically identical, and particle size did not influence the differences between means.
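
The proximate analysis quantities discussed above are related by a simple dry-basis mass balance: fixed carbon is obtained by difference from the volatile matter and ash fractions. The sketch below shows that calculation with hypothetical masses; it is not the NBR 8112 procedure itself.

```python
def proximate_analysis(m_dry, m_after_volatiles, m_ash):
    """Proximate analysis on a dry basis (illustrative; hypothetical masses in g).

    m_dry             -- oven-dry sample mass
    m_after_volatiles -- mass remaining after the volatile-matter burn
    m_ash             -- residue mass after complete combustion
    """
    volatile_matter = 100.0 * (m_dry - m_after_volatiles) / m_dry
    ash = 100.0 * m_ash / m_dry
    fixed_carbon = 100.0 - volatile_matter - ash   # by difference
    return volatile_matter, ash, fixed_carbon

# Hypothetical charcoal sample: 1.000 g dry, 0.745 g after devolatilization, 0.012 g ash
vm, ash, fc = proximate_analysis(1.000, 0.745, 0.012)
print(f"VM {vm:.1f}%  ash {ash:.1f}%  fixed carbon {fc:.1f}%")
```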

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, dropping out of B.Sc. programs occurs in practically all universities of the contemporary world. Undergraduate withdrawal can mean several losses: to the student, who does not graduate; to the teacher, who does not accomplish his goal as an educator; to the university, which does not fulfil its mission; to society, through economic and social losses; and to the family, whose dreams go unfulfilled. The objective of this research is to present a quantitative study of the dropout rate in the Agricultural Engineering B.Sc. program (BSAGENG) at the State University of Campinas (UNICAMP), seeking to contribute to the understanding of this issue. The dropout rate from 1995 to 2006 was determined from official university data by employing four different methods of calculation. Three of the methods revealed that the dropout rate is very close to the graduation rate, i.e., close to 50%. Regardless of the method adopted for estimating the dropout rate, and although the statistics demonstrate that the figures for the agricultural engineering undergraduate course at UNICAMP fall within the normal range for similar courses in Brazil, it should be recognized that a public educational institution should be concerned about presenting such figures. A detailed and deep analysis must be outlined in further studies, seeking specific actions aimed at reducing the dropout process.
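
As a simple illustration of how such a rate can be computed, the sketch below applies one basic closed-cohort formula with hypothetical numbers; the four calculation methods actually compared in the study are not reproduced here.

```python
def cohort_dropout_rate(entrants, graduates, still_enrolled):
    """One simple closed-cohort dropout rate, in percent (illustrative only;
    not one of the four formulas compared in the study)."""
    dropouts = entrants - graduates - still_enrolled
    return 100.0 * dropouts / entrants

# Hypothetical cohort: 60 entrants, 28 graduated, 3 still enrolled
print(f"{cohort_dropout_rate(60, 28, 3):.1f}% dropout")
```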

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to verify the influence of partial dehydration of "Niagara Rosada" grape clusters on the physicochemical quality of the pre-fermentation must. In Brazil, during winemaking it is often necessary to adjust the grape must when the physicochemical characteristics of the raw material are insufficient to produce wines in accordance with the Brazilian legislation for the classification of beverages, which establishes a minimum alcohol content of 8.6% for a beverage to be considered wine. Given that reducing the water content of grape berries concentrates the chemical compounds present in their composition, especially the total soluble solids, treatments were formed by the combination of two temperatures (T1: 37.1 ºC and T2: 22.9 ºC) and two air speeds (S1: 1.79 m s-1 and S2: 3.21 m s-1), plus a control (T0) that did not undergo dehydration. Analyses were performed of pH, total titratable acidity (TTA, in mEq L-1), total soluble solids (TSS, in ºBrix), water content on a dry basis, and concentration of phenolic compounds (CPC, in mg of gallic acid per 100 g of must). The mean comparison test identified statistically significant changes relevant to adapting the must for winemaking purposes: the treatment at 22.9 ºC with an air speed of 1.79 m s-1 showed the largest increase in the concentration of total soluble solids, along with the second-best result for the concentration of phenolic compounds.
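
The concentration effect that dehydration has on the soluble solids can be reasoned about with a simple mass balance: the solutes are conserved while water is removed. The sketch below illustrates this with hypothetical figures; it is not a calculation from the study's data.

```python
def brix_after_dehydration(mass_initial, brix_initial, water_removed):
    """Estimate total soluble solids (°Brix, % w/w) after removing water from
    grape must/berries, assuming the solutes are conserved (simple mass balance;
    hypothetical values, not data from the study)."""
    solids = mass_initial * brix_initial / 100.0   # kg of soluble solids
    mass_final = mass_initial - water_removed      # kg of must remaining
    return 100.0 * solids / mass_final

# Hypothetical: 10 kg of must at 14 °Brix, with 1.5 kg of water evaporated
print(f"{brix_after_dehydration(10.0, 14.0, 1.5):.1f} °Brix")
```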

Relevance:

30.00%

Publisher:

Abstract:

Local head losses must be considered to properly estimate the maximum length of drip irrigation laterals. The aim of this work was to develop a model based on dimensional analysis for calculating head loss along laterals, accounting for in-line drippers. Several measurements were performed with 12 emitter models to obtain the experimental data required for developing and assessing the model. Based on the Camargo & Sentelhas coefficient, the model presented excellent results in terms of precision and accuracy in estimating head loss. The deviation between estimated and observed head loss values increased with the head loss, and the maximum deviation reached 0.17 m. The maximum relative error was 33.75%, and only 15% of the data set presented relative errors higher than 20%. Neglecting local head losses led to the maximum lateral length being overestimated by 19.48% for pressure-compensating drippers and 16.48% for non-pressure-compensating drippers.
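
To make the role of the local losses concrete, the sketch below steps along a lateral emitter by emitter and adds a Darcy-Weisbach friction term plus a local loss term at each in-line dripper. It is a generic hydraulics illustration with hypothetical parameters (including the local loss coefficient), not the dimensional-analysis model developed in the paper.

```python
import math

def lateral_head_loss(n_emitters, q_emitter, spacing, diameter,
                      k_local=0.2, nu=1.0e-6, g=9.81):
    """Total head loss (m) along a drip lateral, stepping from the distal end
    toward the inlet. Darcy-Weisbach friction (Blasius f for smooth pipes in
    turbulent flow, 64/Re in laminar flow) plus a local loss k*V^2/2g at every
    in-line emitter. Illustrative only; k_local is a hypothetical coefficient.

    q_emitter in m3/s, spacing and diameter in m.
    """
    area = math.pi * diameter ** 2 / 4.0
    total = 0.0
    flow = 0.0
    for _ in range(n_emitters):
        flow += q_emitter                       # flow accumulates toward the inlet
        v = flow / area
        re = v * diameter / nu
        f = 0.316 * re ** -0.25 if re > 2300 else 64.0 / max(re, 1.0)
        total += f * spacing / diameter * v ** 2 / (2 * g)   # pipe friction
        total += k_local * v ** 2 / (2 * g)                  # local (emitter) loss
    return total

# Hypothetical lateral: 100 emitters of 4 L/h, spaced 0.5 m apart, 16 mm pipe
print(f"{lateral_head_loss(100, 4 / 3.6e6, 0.5, 0.016):.2f} m")
```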

Relevance:

30.00%

Publisher:

Abstract:

This thesis aims to find an effective way of conducting a target audience analysis (TAA) in the cyber domain. Two main focal points are addressed: the nature of the cyber domain and the TAA method. For the cyber domain, the objective is to identify the opportunities, restrictions and caveats that result from its digital and temporal nature. This is the environment in which the TAA method is examined in this study. As the TAA is an important step of any psychological operation and critical to its success, the method used must cover all the main aspects affecting the choice of a proper target audience. The first part of the research was done by sending an open-ended questionnaire to operators in the field of information warfare, both in Finland and abroad. As the results were inconclusive, the research was completed by assessing the applicability of the United States Army field manual FM 3-05.301 in the cyber domain via a theory-based content analysis. FM 3-05.301 was chosen because it presents a complete method for the TAA process. The findings were tested against the results of the questionnaire and recent scientific research in the field of psychology. The cyber domain was found to be "fast and vast", volatile and uncontrollable. Although governed by laws to some extent, the cyber domain is unpredictable by nature and cannot be reliably controlled. The anonymity and lack of verification often present in digital channels mean that anyone can have an opinion, and any message sent may change or even become counterproductive to the original purpose. The TAA method of FM 3-05.301 is applicable in the cyber domain, although some parts of the method are outdated and are therefore suggested to be updated if used in that environment. The target audience categories of step two of the process were replaced by new groups that exist in the digital environment. The accessibility assessment (step eight) was also redefined, as in digital media the mere existence of a written text is typically not enough to convey the intended message to the target audience. The scientific studies made in computer science, psychology and sociology on the behavior of people in social media (and in the cyber domain overall) call for a more extensive remake of the TAA process; this, however, falls outside the scope of this work. It is thus suggested that further research be carried out on computer-assisted methods and a more thorough TAA process, utilizing the latest findings on human behavior.

Relevance:

30.00%

Publisher:

Abstract:

Feature extraction is the part of pattern recognition in which the sensor data is transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the next stages of the system and to preserve the information essential for discriminating the data into different classes. For instance, in image analysis the raw image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used as a means of detecting features that are invariant to certain types of illumination change. Finally, classification tries to make decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors. Feature analysis is also carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play a main role in the analysis. In the embedded domain, the pattern recognition system must usually meet strict performance constraints, such as high speed, compact size and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely affected by the decisions made during the implementation phase. The implementation alternatives of LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated into this framework by means of a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed in particular for the embedded domain, is presented. Inspired by some of the principles observed through the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which the LBPs are seen as combinations of n-tuples is also presented.
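
For readers unfamiliar with the descriptor, the sketch below computes the basic 8-neighbour LBP code of a grayscale patch in Python. It is a minimal software illustration only; the thesis concerns hardware (focal-plane) implementations and LBP variants that are not shown here.

```python
import numpy as np

def lbp_3x3(image):
    """Basic 8-neighbour Local Binary Pattern on a 2-D grayscale array.
    Each interior pixel is encoded by thresholding its 3x3 neighbourhood
    against the centre value and packing the results into an 8-bit code."""
    img = np.asarray(image, dtype=np.int32)
    # Neighbour offsets in clockwise order starting from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros((img.shape[0] - 2, img.shape[1] - 2), dtype=np.uint8)
    centre = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy,
                        1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= centre).astype(np.uint8) << bit
    return codes

# Tiny example patch: one LBP code for the centre pixel
patch = np.array([[10, 20, 30],
                  [40, 25, 10],
                  [ 5, 30, 60]])
print(lbp_3x3(patch))
```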

Relevance:

30.00%

Publisher:

Abstract:

The shift towards a knowledge-based economy has inevitably prompted the evolution of patent exploitation. Nowadays, a patent is more than just a prevention tool for a company to block its competitors from developing rival technologies; it lies at the very heart of the company's strategy for value creation and is therefore strategically exploited for economic profit and competitive advantage. Along with the evolution of patent exploitation, the demand for reliable and systematic patent valuation has also reached an unprecedented level. However, most of the quantitative approaches in use to assess patents arguably fall into four categories and are based solely on conventional discounted cash flow analysis, whose usability and reliability in the context of patent valuation are greatly limited by five practical issues: market illiquidity, poor data availability, discriminatory cash-flow estimations, and its inability to account for changing risk and for managerial flexibility. This dissertation attempts to overcome these barriers by rationalizing the use of two techniques, namely fuzzy set theory (aimed at the first three issues) and real option analysis (aimed at the last two). It commences with an investigation into the nature of the uncertainties inherent in patent cash flow estimation and claims that two levels of uncertainty must be properly accounted for. Further investigation reveals that both levels of uncertainty fall under the category of subjective uncertainty, which differs from objective uncertainty originating from inherent randomness in that uncertainties labelled as subjective are highly related to the behavioural aspects of decision making and are usually witnessed whenever human judgement, evaluation or reasoning is crucial to the system under consideration and there is a lack of complete knowledge of its variables. Having clarified their nature, the application of fuzzy set theory to modelling patent-related uncertain quantities is readily justified. The application of real option analysis to patent valuation is prompted by the fact that both the patent application process and the subsequent patent exploitation (or commercialization) are subject to a wide range of decisions at multiple successive stages. In other words, both patent applicants and patentees are faced with a large variety of courses of action as to how their patent applications and granted patents can be managed. Since they have the right to run their projects actively, this flexibility has value and thus must be properly accounted for. Accordingly, this dissertation provides an explicit identification of the types of managerial flexibility inherent in patent-related decision-making problems and in patent valuation, and a discussion of how they can be interpreted in terms of real options. Additionally, the use of the proposed techniques in practical applications is demonstrated with three models based on fuzzy real option analysis. In particular, the pay-off method and the extended fuzzy Black-Scholes model are employed to investigate the profitability of a patent application project for a new process for the preparation of a gypsum-fibre composite and to justify the subsequent patent commercialization decision, respectively; a fuzzy binomial model is designed to reveal the economic potential of a patent licensing opportunity.
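
As a rough illustration of the kind of fuzzy real option calculation mentioned above, the sketch below evaluates the pay-off method numerically for a triangular fuzzy NPV: the real option value is taken as the positive-area fraction of the pay-off distribution times the possibilistic mean of its positive side. The figures are hypothetical and the implementation is a simplified numerical reading of the method, not one of the models built in the dissertation.

```python
import numpy as np

def triangular_membership(x, a, b, c):
    """Membership function of a triangular fuzzy number (a, b, c), peak at b."""
    return np.where(x <= b,
                    np.clip((x - a) / (b - a), 0.0, 1.0),
                    np.clip((c - x) / (c - b), 0.0, 1.0))

def fuzzy_payoff_value(a, b, c, n_alpha=2000):
    """Pay-off-method real option value for a triangular fuzzy NPV (a, b, c):
    (positive area / total area) * possibilistic mean of the positive side,
    evaluated numerically over alpha-cuts truncated at zero."""
    xs = np.linspace(a, c, 20000)
    mu = triangular_membership(xs, a, b, c)
    total_area = np.trapz(mu, xs)
    pos_area = np.trapz(np.where(xs >= 0, mu, 0.0), xs)

    # Possibilistic mean: integral over alpha of alpha * (lower + upper) d(alpha)
    alphas = np.linspace(0.0, 1.0, n_alpha)
    lower = np.maximum(a + alphas * (b - a), 0.0)
    upper = np.maximum(c - alphas * (c - b), 0.0)
    mean_pos = np.trapz(alphas * (lower + upper), alphas)

    return (pos_area / total_area) * mean_pos

# Hypothetical fuzzy NPV of a patent project, in MEUR: worst -2, best guess 3, best 10
print(f"ROV ~ {fuzzy_payoff_value(-2.0, 3.0, 10.0):.2f} MEUR")
```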

Relevance:

30.00%

Publisher:

Abstract:

In this article, we compare two strategies for atherosclerosis treatment: drugs and a healthy lifestyle. Statins are the principal drugs used for the treatment of atherosclerosis. Several secondary prevention studies have demonstrated that statins can significantly reduce cardiovascular events, including coronary death, the need for surgical revascularization, stroke and total mortality, as well as fatal and non-fatal myocardial infarction. These results were observed in both men and women, the elderly, smokers and non-smokers, diabetics and hypertensives. Primary prevention studies yielded similar results, although total mortality was not affected. Statins also induce atheroma regression and do not cause cancer. However, many unresolved issues remain, such as partial risk reduction, costs, several potential side effects, and long-term use by young patients. Statins act mainly as lipid-lowering drugs, but pleiotropic actions are also present. A healthy lifestyle, on the other hand, is effective and inexpensive and has no harmful effects. Five items are associated with lower cardiac risk: non-smoking, BMI ≤25, regular exercise (30 min/day), a healthy diet (fruits, vegetables, low saturated fat), and 5-30 g alcohol/day. Nevertheless, there are difficulties in implementing these measures at both the individual and population levels. Changes in behavior require multidisciplinary care, including medical, nutritional, and psychological counseling. Participation of the entire society is required for such implementation, i.e., universities, schools, media, government, and medical societies. Although these efforts represent a major challenge, such a task must be faced in order to halt the atherosclerosis epidemic that threatens the world.

Relevance:

30.00%

Publisher:

Abstract:

Gear rattle is a phenomenon that occurs when idling or lightly loaded gears collide due to the engine's torque fluctuations. This behaviour is related to the vibration behaviour of the transmission system. The aim of this master's thesis is to evaluate Adams and Adams/Machinery as simulation tools for modelling the rattle effect in a transmission system. A case study of a tractor's power take-off driveline, suspected to be prone to rattle, is performed in this work. The modelling methods used by Adams in this type of study are presented in the theory section, while the simulation model built with the software during this work is presented in the results. The Machinery toolbox is used to create the gears and bearings, while the other model components are created with the standard Adams tool set. Geometries and excitations are exported from other software. Results were obtained from multiple variations of a base model. These result sets and the literature review suggest that Adams/Machinery may not be the most suitable tool for rattle analysis. While the system behaviour was partially captured, accurate modelling requires user-written routines, which may be easier to implement with other tools. Further research on this topic is required.
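
To give an idea of the mechanism being modelled, the sketch below integrates an idealized two-inertia torsional system with gear backlash under a fluctuating drive torque and counts the mesh separations that give rise to rattle impacts. All parameter values are hypothetical, and this is a textbook-style illustration rather than the Adams/Machinery driveline model of the thesis.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Idealized two-inertia torsional model with gear backlash (unit gear ratio),
# driven by a fluctuating "engine" torque. Hypothetical parameter values.
J1, J2 = 0.05, 0.01          # inertias (kg m^2)
k, c = 5e4, 2.0              # mesh stiffness (N m/rad) and contact damping
b = 0.002                    # half of the backlash gap (rad)
T_mean, T_fluct, w = 2.0, 6.0, 2 * np.pi * 25   # torque fluctuation at 25 Hz
c_load = 0.05                # light drag on the driven gear (idling)

def mesh_torque(delta, ddelta):
    """Contact torque across the mesh; zero while inside the backlash gap."""
    if delta > b:
        return k * (delta - b) + c * ddelta
    if delta < -b:
        return k * (delta + b) + c * ddelta
    return 0.0

def rhs(t, y):
    th1, w1, th2, w2 = y
    tm = mesh_torque(th1 - th2, w1 - w2)
    engine = T_mean + T_fluct * np.sin(w * t)
    return [w1, (engine - tm) / J1,
            w2, (tm - c_load * w2) / J2]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0, 0.0, 0.0], max_step=1e-4)
gap = sol.y[0] - sol.y[2]
in_contact = (np.abs(gap) > b).astype(int)
print("contact/separation transitions:", int(np.count_nonzero(np.diff(in_contact))))
```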

Relevance:

30.00%

Publisher:

Abstract:

Currently, power generation is one of the most significant aspects of life for all of mankind. One can hardly imagine our life without electricity and thermal energy, so different technologies for producing these types of energy need to be used. Each of these technologies will always have its own advantages and disadvantages. Nevertheless, every technology must satisfy requirements such as efficiency, ecological safety and reliability. In the case of power generation using nuclear energy, these requirements need to be maintained to a high standard, especially since accidents at nuclear power plants may cause very long-term, deadly consequences. In order to prevent possible disasters related to accidents at nuclear power plants, strong and powerful algorithms have been developed in recent decades. Such algorithms are able to handle calculations of the different physical processes and phenomena of real facilities. However, the results acquired by the computations must be verified against experimental data.

Relevance:

30.00%

Publisher:

Abstract:

The Finnish securities markets are being harmonized to enable better, more reliable and timely settlement of securities. Omnibus accounts are common practice in the European securities markets, but Finland forbids their use by its domestic investors. There is a possibility that omnibus account usage will be allowed for Finnish investors in the future. This study aims to build a comprehensive picture for Finnish investors and account operators of the costs and benefits that the omnibus account structure would have for them. The study uses qualitative research methods. A literature review provides the framework for the study; various research articles, regulatory documents, studies performed by European organisations, and Finnish news reports are used to analyse the costs and benefits of omnibus accounts. The viewpoint is strictly that of account operators and investors, and the different effects on them are considered. The results of the analysis show that there are a number of costs and benefits that investors and account operators must take into consideration regarding omnibus accounts. The costs are related to the development of IT systems so that participants are able to adapt to the new structure and operate according to its needs. A decrease in the transparency of holdings is a disadvantage of the structure and needs to be assessed carefully to avoid the problems it might bring. The benefits are mostly related to increased competition in the securities markets as well as to possible cost reductions in securities settlement. The costs and benefits were analysed according to the study plan of this thesis, and as a result, the significance and impact of omnibus accounts for Finnish investors and account operators depend on the level of competition and on the decisions that all market participants make when determining whether the account structure is beneficial for their operations.

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT The present study aims to evaluate crop, pasture and forest land prices in Brazil, between 1994 and 2010, in the light of Post-Keynesian theory. The results provide evidence that land, more than just a simple factor of production, must be conceived of as an economic asset. In fact, the price of rural land is determined not only by the expected profitability deriving from agricultural activities but also by the agents' expectations about its future appreciation and liquidity in an economic environment permeated with uncertainty. In this context, as an object of speculation, land has been particularly important as a store of value.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION Theories of humour are traditionally divided into two classes: superiority or relief theories, and incongruity or ambiguity theories. As their names imply, the former tend to ascribe amusement primarily to a particular attitude of mind, while the latter account for it by describing its objects as having a particular quality. Enjoyment as an attitude is always a response to an object present to the mind or feelings. If, then, enjoyment in amusement is identical with feelings of superiority or relief, its objects must always display characteristics of inferiority or inhibition. But the enjoyment of humour seems to be distinguishable from a reaction to particular kinds of topic, and from any personal relation felt between the subject and the objects of his amusement. Incongruity theories do not explicitly ascribe the enjoyment of humour to a particular range of topics.

Relevance:

30.00%

Publisher:

Abstract:

Euclidean distance matrix analysis (EDMA) methods are used to determine whether or not a significant difference exists between conformational samples of antibody complementarity determining region (CDR) loops: the isolated L1 loop and L1 in a three-loop assembly (L1, L3 and H3), obtained from Monte Carlo simulation. After a significant difference is detected, the specific inter-Cα distances that contribute to the difference are identified using EDMA. The estimated and improved mean forms of the conformational samples of the isolated L1 loop and of L1 in the three-loop assembly, CDR loops of the antibody binding site, are described using EDMA and distance geometry (DGEOM). To the best of our knowledge, this is the first time EDMA methods have been used to analyze conformational samples of molecules obtained from Monte Carlo simulations. Therefore, validation of the EDMA methods using both positive-control and negative-control tests on the conformational samples of the isolated L1 loop and of L1 in the three-loop assembly must be done. The EDMA-I bootstrap null hypothesis tests showed false positive results for the comparisons of six samples of the isolated L1 loop and true positive results for the comparison of conformational samples of the isolated L1 loop and L1 in the three-loop assembly. The bootstrap confidence interval tests revealed true negative results for the comparisons of the six samples of the isolated L1 loop and false negative results for the conformational comparisons between the isolated L1 loop and L1 in the three-loop assembly. Different conformational sample sizes were further explored by combining the samples of the isolated L1 loop to increase the sample size, or by clustering the samples using a self-organizing map (SOM) to narrow the conformational distribution of the samples being compared. However, no improvement was obtained for either the bootstrap null hypothesis tests or the confidence interval tests. These results show that more work is required before EDMA methods can be used reliably for the comparison of samples obtained by Monte Carlo simulations.
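
The core object of EDMA, the matrix of all inter-landmark (here inter-Cα) distances, and the form-difference comparison of two samples can be sketched in a few lines, as below. This shows only the basic idea with random hypothetical coordinates; the bootstrap null hypothesis and confidence interval tests used in the study are not reproduced.

```python
import numpy as np

def distance_matrix(coords):
    """Pairwise Euclidean (inter-landmark) distance matrix of one conformation;
    coords has shape (n_landmarks, 3)."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def form_difference_matrix(sample_a, sample_b):
    """Element-wise ratio of the mean distance matrices of two conformational
    samples (arrays of shape (n_conformations, n_landmarks, 3)). Entries far
    from 1 point to the inter-landmark distances driving the difference."""
    mean_a = np.mean([distance_matrix(c) for c in sample_a], axis=0)
    mean_b = np.mean([distance_matrix(c) for c in sample_b], axis=0)
    # Avoid dividing by the zero diagonal; leave those entries at 1.
    return np.divide(mean_a, mean_b, out=np.ones_like(mean_a), where=mean_b != 0)

# Two small hypothetical samples of a 4-landmark loop (random coordinates)
rng = np.random.default_rng(0)
sample_a = rng.normal(size=(50, 4, 3))
sample_b = 1.2 * rng.normal(size=(50, 4, 3))   # slightly "larger" conformations
print(np.round(form_difference_matrix(sample_a, sample_b), 2))
```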