851 results for Approaching methodology
Abstract:
Fifty bursae of Fabricius (BF) were examined by conventional optical microscopy, and digital images were acquired and processed using Matlab® 6.5 software. An Artificial Neural Network (ANN) was generated using Neuroshell® Classifier software, and the optical and digital data were compared. The ANN produced a classification of digital scores comparable to the optical scores and correctly classified the majority of the follicles, reaching a sensitivity of 89% and a specificity of 96%. When the follicles were scored and grouped in a binary fashion, sensitivity increased to 90% and specificity reached its maximum value of 92%. These results demonstrate that digital image analysis combined with an ANN is a useful tool for the pathological classification of BF lymphoid depletion. In addition, it provides objective results that allow the magnitude of diagnostic and classification error to be measured, making comparisons between databases feasible.
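The sensitivity and specificity figures above follow from an ordinary binary confusion matrix. A minimal sketch (not the authors' Matlab/Neuroshell code; the labels are invented) of how such scores are computed:

```python
# Confusion-matrix sketch for binary follicle scores: sensitivity is the
# true-positive rate and specificity the true-negative rate. The optical
# (reference) and ANN labels below are invented for illustration.
def sensitivity_specificity(reference, predicted):
    """Binary labels: 1 = depleted follicle, 0 = normal."""
    tp = sum(1 for r, p in zip(reference, predicted) if r == 1 and p == 1)
    fn = sum(1 for r, p in zip(reference, predicted) if r == 1 and p == 0)
    tn = sum(1 for r, p in zip(reference, predicted) if r == 0 and p == 0)
    fp = sum(1 for r, p in zip(reference, predicted) if r == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

optical = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
ann     = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(optical, ann)
```

Reporting both rates separately, as the abstract does, distinguishes missed depleted follicles from falsely flagged normal ones.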
Abstract:
This work describes the methodology, basic procedures, and instrumentation employed by the Solar Energy Laboratory at Universidade Federal do Rio Grande do Sul for the determination of current-voltage (I-V) characteristic curves of photovoltaic modules. Following this methodology, I-V characteristic curves were acquired for several modules under diverse conditions. The main electrical parameters were determined, and the influence of temperature and irradiance on photovoltaic module performance was quantified. Most of the tested modules presented output power values considerably lower than those specified by the manufacturers. The described hardware allows the testing of modules with open-circuit voltage up to 50 V and short-circuit current up to 8 A.
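The main electrical parameters of an I-V sweep — open-circuit voltage, short-circuit current, maximum power, and fill factor — can be extracted as in this sketch; the sample points are invented, not measurements from the laboratory:

```python
# I-V parameter extraction sketch. Assumes the sweep runs from short
# circuit (V = 0) to open circuit (I = 0); the data are hypothetical.
def iv_parameters(voltages, currents):
    i_sc = currents[0]                 # short-circuit current at V = 0
    v_oc = voltages[-1]                # open-circuit voltage at I = 0
    p_max = max(v * i for v, i in zip(voltages, currents))  # max power point
    fill_factor = p_max / (v_oc * i_sc)
    return v_oc, i_sc, p_max, fill_factor

# Hypothetical sweep for a module within the 50 V / 8 A tester limits:
volts = [0.0, 5.0, 10.0, 15.0, 17.0, 19.0, 21.0]
amps  = [8.0, 7.9, 7.7, 7.0, 6.0, 3.5, 0.0]
v_oc, i_sc, p_max, ff = iv_parameters(volts, amps)
```

A measured maximum power well below the nameplate rating, at comparable irradiance and temperature, is exactly the kind of shortfall the abstract reports.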
Abstract:
This work presents recent results concerning a design methodology used to estimate the positioning deviation of a gantry (Cartesian) manipulator, related mainly to the structural elastic deformation of its components under operational conditions. The case-study manipulator has basic dimensions of 1.53 m × 0.97 m × 1.38 m, and the effective workspace for end-effector path displacement is 1 m × 0.5 m × 0.5 m. The manipulator is composed of four basic modules, defined as module X, module Y, module Z, and the terminal arm, to which the end-effector is connected. Each module's controlled axis performs a linear-parabolic positioning movement. The path-planning algorithm takes the maximum velocity and the total distance as input parameters for a given task; the acceleration and deceleration times are equal. The Denavit-Hartenberg parameterization method is used in the manipulator kinematic model. The gantry manipulator can be modeled as four rigid bodies with three translational degrees of freedom, connected as an open kinematic chain. Dynamic analyses were performed considering inertial parameters such as the mass, inertia, and center-of-gravity position of each module. These parameters are essential for a correct dynamic model of the manipulator, given the multiple possibilities of motion and the manipulation of objects with different masses. The dynamic analysis consists of a mathematical model of the static and dynamic interactions among the modules. The computation of the structural deformations uses the finite element method (FEM).
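The Denavit-Hartenberg parameterization mentioned above assigns each joint a homogeneous transform built from four parameters (θ, d, a, α). A sketch with hypothetical parameter values, not those of the case-study gantry:

```python
import math

# Denavit-Hartenberg sketch: homogeneous transform between successive
# link frames from the four DH parameters (standard convention).
def dh_matrix(theta, d, a, alpha):
    """DH transform from frame i-1 to frame i (4x4, row-major)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

# A prismatic (translational) axis, as in a gantry module: the rotation
# part stays fixed and the joint variable d carries a 0.5 m displacement.
T = dh_matrix(theta=0.0, d=0.5, a=0.0, alpha=0.0)
```

For a purely translational open chain like this gantry, chaining such matrices for modules X, Y, and Z yields the rigid-body end-effector pose to which the FEM deformation estimate is then added.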
Abstract:
This work presents a methodology for the development of teleoperated robotic systems controlled through the Internet. First, a bibliographical review of telerobotic systems that use the Internet as the control channel is presented. The methodology is then implemented and tested through the development of two systems. The first is a manipulator with two degrees of freedom commanded remotely through the Internet, called RobWebCam (http://www.graco.unb.br/robwebcam). The second is a system that teleoperates an ABB (Asea Brown Boveri) industrial robot with six degrees of freedom, called RobWebLink (http://webrobot.graco.unb.br). RobWebCam is composed of a two-degree-of-freedom manipulator, a video camera, the Internet, computers, and a communication driver between the manipulator and the Unix system; RobWebLink is composed of the same components plus the industrial robot. With this technology it is possible to manipulate objects at distant locations, minimizing the cost of transporting materials and people, while acting in real time on the process to be controlled. This work demonstrates that teleoperating robotic systems and other equipment via the Internet is viable, in spite of low-bandwidth data transmission rates. Possible applications include remote surveillance, control, and remote diagnosis and maintenance of machines and equipment.
Abstract:
Industrial applications demand that robots operate in accordance with the position and orientation of their end effector, which requires solving the inverse kinematics problem. Its solution determines the joint displacements of the manipulator needed to accomplish a given objective. Complete studies of the dynamic control of robotic joints are also necessary. Initially, this article focuses on the implementation of numerical algorithms for the solution of the inverse kinematics problem and on the modeling and simulation of dynamic systems, using a real-time implementation. The modeling and simulation of dynamic systems are performed with an emphasis on off-line programming. Next, a complete study of the control strategies is carried out through the analysis of the elements of a robotic joint, such as the DC motor, inertia, and gearbox. Finally, a trajectory generator, used as input for a generic group of joints, is developed, and a proposal for the implementation of the joint controllers using an EPLD development system is presented.
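A common numerical approach to the inverse kinematics problem is Newton iteration with the manipulator Jacobian. The following sketch applies it to a planar two-link arm with made-up link lengths and target; it illustrates the generic technique, not the article's implementation:

```python
import math

# Newton-iteration sketch for planar two-link inverse kinematics.
# Link lengths, starting guess, and target are hypothetical.
L1, L2 = 1.0, 1.0

def forward(q1, q2):
    """Forward kinematics: joint angles -> end-effector position."""
    return (L1 * math.cos(q1) + L2 * math.cos(q1 + q2),
            L1 * math.sin(q1) + L2 * math.sin(q1 + q2))

def solve_ik(xt, yt, q1=0.5, q2=0.5, iters=50):
    """Iterate q <- q + J(q)^-1 * (target - forward(q))."""
    for _ in range(iters):
        x, y = forward(q1, q2)
        ex, ey = xt - x, yt - y
        # analytic Jacobian of the forward map
        j11 = -L1 * math.sin(q1) - L2 * math.sin(q1 + q2)
        j12 = -L2 * math.sin(q1 + q2)
        j21 =  L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
        j22 =  L2 * math.cos(q1 + q2)
        det = j11 * j22 - j12 * j21
        if abs(det) < 1e-12:        # singular configuration, stop
            break
        q1 += ( j22 * ex - j12 * ey) / det
        q2 += (-j21 * ex + j11 * ey) / det
    return q1, q2

q1s, q2s = solve_ik(1.0, 1.0)
xs, ys = forward(q1s, q2s)
```

The same scheme extends to six-degree-of-freedom arms by replacing the 2×2 inverse with a pseudo-inverse of the full Jacobian.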
Abstract:
Nowadays, the upwind three-bladed horizontal-axis wind turbine is the leading player on the market, having proven to be the best industrial compromise among the different turbine constructions. Current wind-industry innovation is concentrated in the development of individual turbine components. The blade constitutes 20-25% of the overall turbine budget, so its optimal operation under particular local economic and wind conditions is worth investigating. The blade geometry, namely the chord, twist, and airfoil-type distributions along the span, determines the output measures of blade performance; therefore, an optimal wind blade geometry can improve the overall turbine performance. The objectives of the dissertation are focused on the development of a methodology and a specific tool for investigating possible adjustments to existing wind blade geometries. The novelty of the methodology presented in the thesis is the multiobjective perspective on wind blade geometry optimization, taking into account simultaneously the local wind conditions and the issue of aerodynamic noise emissions. This optimization approach has not previously been investigated for wind blade design. The possibilities of using different theories for the analysis and search procedures are investigated, and sufficient arguments are derived for the proposed theories. The tool is used for a test optimization of a particular wind turbine blade. The sensitivity analysis shows the dependence of the outputs on the provided inputs, as well as their relative and absolute divergences and instabilities. The pros and cons of the proposed technique emerge from the practical implementation, which is documented in the results, analysis, and conclusion sections.
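At the heart of any multiobjective search such as the one described is a Pareto-dominance test over the objective vectors. A sketch with invented (energy, noise) scores, both written as minimization objectives:

```python
# Pareto-dominance sketch for a two-objective blade search. Objectives:
# (negative annual energy production, noise level) -- both minimized.
# The candidate geometries and their scores are invented.
def dominates(a, b):
    """True if objective vector a Pareto-dominates b."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only the non-dominated objective vectors."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

designs = [(-7.1, 42.0), (-6.8, 40.5), (-7.0, 43.5), (-6.5, 41.0)]
front = pareto_front(designs)
```

The optimizer returns the whole front rather than a single blade, leaving the energy-versus-noise trade-off to the designer, which is the point of the multiobjective perspective claimed above.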
Abstract:
More discussion is required on how, and which types of, biomass should be used to achieve a significant short-term reduction in the carbon load released into the atmosphere. The energy sector is one of the largest greenhouse gas (GHG) emitters, and thus its role in climate change mitigation is important. Replacing fossil fuels with biomass has been a simple way to reduce carbon emissions because the carbon bound in biomass is considered carbon neutral. With this in mind, this thesis has the following objectives: (1) to study the significance of the different GHG emission sources related to energy production from peat and biomass, (2) to explore opportunities to develop more climate-friendly biomass energy options, and (3) to discuss the importance of the biogenic emissions of biomass systems. The discussion on biogenic carbon and other GHG emissions comprises four case studies, of which two consider peat utilization, one forest biomass, and one cultivated biomass. Various biomass types (peat, pine logs and forest residues, palm oil, rapeseed oil, and jatropha oil) are used as examples to demonstrate the importance of biogenic carbon to life cycle GHG emissions. The biogenic carbon emissions of biomass are defined as the difference in the carbon stock between the utilization and non-utilization scenarios of biomass. Forestry-drained peatlands were studied by using the high emission values of the peatland types in question to discuss the emission reduction potential of the peatlands. The results are presented in terms of global warming potential (GWP) values. Based on the results, the climate impact of peat production can be reduced by selecting high-emission-level peatlands for peat production. The comparison of two different types of forest biomass in integrated ethanol production at a pulp mill shows that the type of forest biomass affects the biogenic carbon emissions of biofuel production.
The assessment of cultivated biomasses demonstrates that several selections made in the production chain significantly affect the GHG emissions of biofuels. The emissions caused by biofuel can exceed the emissions from fossil-based fuels in the short term if biomass is in part consumed in the process itself and does not end up in the final product. Including biogenic carbon and other land use carbon emissions into the carbon footprint calculations of biofuel reveals the importance of the time frame and of the efficiency of biomass carbon content utilization. As regards the climate impact of biomass energy use, the net impact on carbon stocks (in organic matter of soils and biomass), compared to the impact of the replaced energy source, is the key issue. Promoting renewable biomass regardless of biogenic GHG emissions can increase GHG emissions in the short term and also possibly in the long term.
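The definition used above — biogenic emissions as the carbon-stock difference between the utilization and non-utilization scenarios — can be written out directly; the stock numbers below are hypothetical:

```python
# Stock-difference sketch of biogenic emissions: (carbon stock without
# utilization - carbon stock with utilization), converted from tonnes of
# carbon to tonnes of CO2. All stock values are invented.
C_TO_CO2 = 44.0 / 12.0   # molar-mass ratio of CO2 to C

def biogenic_co2(stock_without_use, stock_with_use):
    """Carbon stock forgone by utilization, expressed as tCO2."""
    return (stock_without_use - stock_with_use) * C_TO_CO2

e_co2 = biogenic_co2(stock_without_use=120.0, stock_with_use=85.0)
```

The sign of this difference over the chosen time frame is what decides whether a nominally "carbon-neutral" biomass chain actually reduces emissions in the short term.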
Abstract:
In this work, the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide was investigated using a process design methodology. First the separation task is defined; then phase equilibrium experiments are carried out, and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, Ki values, and separation factors αij are determined. Based on these data, possible operating conditions for further extraction experiments can be established. Separation analyses using graphical methods are performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Laboratory-scale extraction experiments are planned and carried out in order to determine HETP values, to validate the simulation results, and to provide new materials for additional phase equilibrium experiments, needed to determine the dependence of the separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (Palm Fatty Acid Distillate) was used as a case study.
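The Ki values and separation factors αij mentioned above follow directly from the measured phase compositions. A sketch with hypothetical mole fractions, not measured PFAD equilibrium data:

```python
# Distribution-coefficient sketch: Ki = yi/xi from the phase compositions,
# and the separation factor alpha_ij = Ki/Kj. All numbers are invented.
def k_value(y_solvent_phase, x_liquid_phase):
    """Ki = yi / xi (CO2-rich phase over liquid phase)."""
    return y_solvent_phase / x_liquid_phase

def separation_factor(ki, kj):
    """alpha_ij = Ki / Kj for components i and j."""
    return ki / kj

k_palmitic = k_value(0.012, 0.30)   # hypothetical Ki
k_oleic    = k_value(0.006, 0.60)   # hypothetical Kj
alpha_ij = separation_factor(k_palmitic, k_oleic)
```

An αij well above 1 at the chosen temperature and pressure is what makes a counter-current CO2 column a feasible route for the separation task.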
Abstract:
Graphite furnace atomic absorption spectrometry (GF AAS) was the technique chosen by the inorganic contamination laboratory (INCQ/FIOCRUZ) to be validated and applied in routine analysis for arsenic detection and quantification. Selectivity, linearity, sensitivity, the limits of detection and quantification, and the accuracy and precision parameters were studied and optimized under Stabilized Temperature Platform Furnace (STPF) conditions. The limit of detection obtained was 0.13 µg.L-1 and the limit of quantification was 1.04 µg.L-1, with an average precision for total arsenic below 15% and an accuracy of 96%. To quantify the chemical species As(III) and As(V), an ion-exchange resin (Dowex 1X8, Cl- form) was used, and the physical-chemical parameters were optimized, resulting in recoveries of 98% for As(III) and 90% for As(V). The method was applied to groundwater, mineral water, and hemodialysis purified water samples. All results obtained were lower than the maximum limit values established by the Brazilian regulations in effect: 50, 10, and 5 µg.L-1 for total As, As(III), and As(V), respectively. All results were statistically evaluated.
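Limits of detection and quantification are commonly estimated IUPAC-style as 3 and 10 times the blank standard deviation divided by the calibration slope; the readings below are invented, not the laboratory's data:

```python
import statistics

# IUPAC-style sketch: LOD = 3*s/m and LOQ = 10*s/m, with s the standard
# deviation of blank readings and m the calibration slope. All numbers
# below are hypothetical.
def detection_limits(blank_signals, slope):
    s = statistics.stdev(blank_signals)   # sample std. dev. of blanks
    return 3 * s / slope, 10 * s / slope

blanks = [0.010, 0.012, 0.011, 0.013, 0.009]      # hypothetical absorbances
lod, loq = detection_limits(blanks, slope=0.040)  # slope in absorbance/(ug/L)
```

The LOQ is always higher than the LOD, consistent with the reported 1.04 versus 0.13 µg.L-1 pair.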
Abstract:
Sodium alginate requires the presence of calcium ions to form a gel. For this reason, adding a calcium source to fish muscle mince containing sodium alginate makes gelation possible, resulting in a restructured fish product. Three different calcium sources were considered: calcium chloride (CC), calcium caseinate (CCa), and calcium lactate (CLa). Several physical properties were analyzed, including mechanical properties, colour, and cooking loss. Response Surface Methodology (RSM) was used to determine the contribution of the different calcium sources to the restructured fish muscle. The calcium source that modifies the system the most is CC. A combination of CC and sodium alginate weakened the mechanical properties, as reflected in the negative linear contribution of sodium alginate. Moreover, CC by itself increased lightness and cooking loss. The mechanical properties of the restructured fish muscle were enhanced by using CCa and sodium alginate, as reflected in the negative linear contribution of sodium alginate. CCa also increased cooking loss. The role of CLa combined with sodium alginate was less pronounced in the system discussed here.
Abstract:
During postharvest handling, lettuce is usually exposed to adverse conditions (e.g. low relative humidity) that reduce vegetable quality. Evaluating its shelf life requires analyzing a great number of quality attributes, which demands careful experimental design and is time consuming. In this study, the modified Global Stability Index method was applied to estimate the quality of butter lettuce stored at low relative humidity, discriminating three lettuce zones (internal, middle, and external). The results indicated that the most relevant attributes were: for the external zone, relative water content, water content, ascorbic acid, and total mesophilic counts; for the middle zone, relative water content, water content, total chlorophyll, and ascorbic acid; and for the internal zone, relative water content, bound water, water content, and total mesophilic counts. A mathematical model that takes into account the Global Stability Index and the overall visual quality of each lettuce zone was proposed. Moreover, the Weibull distribution was applied to estimate the maximum vegetable storage time, which was 5, 4, and 3 days for the internal, middle, and external zones, respectively. When analyzing the effect of storage time on each lettuce zone, all the indices evaluated in the external zone presented significant differences (p < 0.05). For both the internal and middle zones, the attributes presented significant differences (p < 0.05), except for water content and total chlorophyll.
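Estimating a maximum storage time from a fitted Weibull distribution amounts to inverting its cumulative distribution function at a chosen unacceptability probability. A sketch with hypothetical shape and scale parameters, not those fitted in the study:

```python
import math

# Weibull shelf-life sketch: invert F(t) = 1 - exp(-(t/scale)**shape)
# at a chosen failure probability. Shape and scale are hypothetical.
def weibull_storage_time(shape, scale, failure_prob):
    """Time t at which the unacceptability probability reaches failure_prob."""
    return scale * (-math.log(1.0 - failure_prob)) ** (1.0 / shape)

# Hypothetical fit: time by which half the samples are unacceptable.
t_max = weibull_storage_time(shape=2.0, scale=4.0, failure_prob=0.5)
```

Fitting separate parameters per lettuce zone is what yields distinct maximum storage times such as the 5, 4, and 3 days reported above.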
Abstract:
This study aimed to optimize an alternative method for the extraction of carrageenan, without previous alkaline treatment or ethanol precipitation, using Response Surface Methodology (RSM). As an innovation in the isolation step, atomization (spray) drying was used, reducing the time required to obtain dry carrageenan powder. The effects of extraction time and temperature on yield, gel strength, and viscosity were evaluated. Furthermore, the extracted material was submitted to structural analysis by infrared spectroscopy and nuclear magnetic resonance spectroscopy (¹H-NMR), as well as chemical composition analysis. The results showed that the generated regression models adequately explained the data variation. Carrageenan yield and gel viscosity were influenced only by the extraction temperature, whereas gel strength was influenced by both extraction time and extraction temperature. The optimal extraction conditions were 74 ºC and 4 hours. Under these conditions, the carrageenan extract properties predicted by the polynomial model were 31.17%, 158.27 g.cm-2, and 29.5 cP for yield, gel strength, and viscosity, respectively, while under the experimental conditions they were 35.8 ± 4.68%, 112.50 ± 4.96 g.cm-2, and 16.01 ± 1.03 cP, respectively. The chemical composition, nuclear magnetic resonance spectroscopy, and infrared spectroscopy analyses showed that the crude carrageenan extracted is composed mainly of κ-carrageenan.
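An RSM regression model of the kind fitted here is a second-order polynomial in extraction time t and temperature T. The sketch below evaluates such a model with invented coefficients, not those fitted in the study:

```python
# Second-order response-surface sketch:
#   y = b0 + b1*t + b2*T + b11*t^2 + b22*T^2 + b12*t*T
# The coefficients below are invented for illustration only.
def rsm_predict(t, T, coeffs):
    b0, b1, b2, b11, b22, b12 = coeffs
    return b0 + b1 * t + b2 * T + b11 * t**2 + b22 * T**2 + b12 * t * T

# Hypothetical coefficients, evaluated at the reported optimum (4 h, 74 degC):
b = (-50.0, 10.0, 1.8, -1.2, -0.012, 0.0)
yield_pred = rsm_predict(4.0, 74.0, b)
```

The quadratic terms are what let the fitted surface exhibit an interior optimum such as the 74 ºC / 4 h condition reported above.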
Abstract:
Assessing fish consumption is complex and involves several factors; however, the use of questionnaires in surveys and of the Internet as a data collection tool have been considered promising approaches. The objective of this research was therefore to design a data collection technique to assess fish consumption using a questionnaire made available on a specific home page on the Internet. A bibliographical review was carried out to identify the features of the instrument; pre-tests were then conducted with previous instruments, followed by the Focus Group technique. Specialists then performed an analysis and conducted an online pre-test. Multivariate data analysis was applied using the SmartPLS software. In total, 1,966 participants belonging to the University of São Paulo (USP) community took part in the test, and after the exclusion of some variables, statistically significant results were obtained. The final constructs comprised consumption, quality, and general characteristics. The instrument consisted of behavioral statements on a 5-point Likert scale and multiple-choice questions. The Cronbach's alpha reliability coefficient was 0.66 for general characteristics, 0.98 for quality, and 0.91 for consumption, indicating good reliability of the instrument. In conclusion, the results showed that Internet-based assessment is efficient. The instrument allowed a better understanding of the process of buying and consuming fish in the country, and it can be used as a basis for further research.
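Cronbach's alpha, the reliability coefficient reported above, compares the sum of the item variances with the variance of the total score. A sketch over an invented response matrix (rows = respondents, columns = Likert items):

```python
import statistics

# Cronbach's alpha sketch:
#   alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
# The 5-point Likert responses below are invented.
def cronbach_alpha(responses):
    k = len(responses[0])                              # number of items
    item_vars = [statistics.variance([row[j] for row in responses])
                 for j in range(k)]
    total_var = statistics.variance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

data = [
    [5, 4, 5],
    [4, 4, 4],
    [3, 3, 4],
    [2, 3, 2],
    [1, 2, 1],
]
alpha = cronbach_alpha(data)
```

Values approaching 1 indicate that the items of a construct vary together, which is how the 0.98 (quality) and 0.91 (consumption) coefficients support the instrument's reliability.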