1000 results for Software Modification
Resumo:
Over the last decades, calibration techniques have been widely used to improve the accuracy of robots and machine tools, since they involve only software modification rather than changes to the design and manufacture of the hardware. Traditionally, four steps are required for a calibration: error modeling, measurement, parameter identification, and compensation. The objective of this thesis is to propose a method for the kinematics analysis and error modeling of a newly developed hybrid redundant robot, the IWR (Intersector Welding Robot), which possesses ten degrees of freedom (DOF): 6 DOF in parallel and an additional 4 DOF in serial. The problems of kinematics modeling and error modeling of the proposed IWR robot are discussed. Based on the vector arithmetic method, the kinematics model and the sensitivity model of the end-effector with respect to the structural parameters are derived and analyzed. The relations between pose (position and orientation) accuracy and manufacturing tolerances, actuation errors, and connection errors are formulated. Computer simulation is performed to examine the validity and effectiveness of the proposed method.
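As a toy illustration of this kind of sensitivity analysis (not the IWR model or the thesis's vector arithmetic method), the pose sensitivity of a planar two-link arm to its structural parameters can be sketched with a finite-difference Jacobian; all names and numbers below are hypothetical:

```python
import numpy as np

def fk(params, q):
    """Forward kinematics of a planar 2-link arm (a stand-in for a real
    kinematic model). params = link lengths, q = joint angles."""
    l1, l2 = params
    q1, q2 = q
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return np.array([x, y])

def sensitivity(params, q, eps=1e-6):
    """Finite-difference Jacobian of the pose w.r.t. structural parameters:
    how manufacturing tolerances propagate to end-effector error."""
    base = fk(params, q)
    J = np.zeros((base.size, len(params)))
    for i in range(len(params)):
        p = np.array(params, dtype=float)
        p[i] += eps
        J[:, i] = (fk(p, q) - base) / eps
    return J

J = sensitivity([0.5, 0.3], [0.4, 0.9])
# First-order pose error caused by a 0.1 mm tolerance on each link length:
pose_err = J @ np.array([1e-4, 1e-4])
```

A full treatment would also include actuation and connection error terms, but the propagation principle (pose error ≈ Jacobian times parameter error) is the same.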
Resumo:
Software evolution, and particularly its growth, has mainly been studied at the file (sometimes also referred to as module) level. In this paper we propose to move from the physical level towards one that includes semantic information, by using functions or methods to measure the evolution of a software system. We point out that the use of function-based metrics has many advantages over the use of files or lines of code. We demonstrate our approach with an empirical study of two Free/Open Source projects: a community-driven project, Apache, and a company-led project, Novell Evolution. We discovered that most functions never change; that when they do, their number of modifications is correlated with their size; and that very few authors modify each function. Finally, we show that the departure of a developer from a software project slows the evolution of the functions that she authored.
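The kind of function-level metric the study relies on can be sketched as follows; the per-function records here are invented for illustration, not data from Apache or Evolution:

```python
# Hypothetical per-function records: (size in lines, number of modifications).
records = [(12, 0), (40, 3), (8, 0), (150, 9), (60, 2), (25, 1), (90, 5)]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

sizes, mods = zip(*records)
# Fraction of functions that never change, and the size/modifications
# correlation -- the two function-level findings the abstract reports.
never_changed = sum(1 for m in mods if m == 0) / len(mods)
r = pearson(sizes, mods)
```

On this toy data the correlation is strongly positive, mirroring the paper's finding that larger functions accumulate more modifications.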
Resumo:
The first objective of this study was to find reliable laboratory methods to predict the effect of enzymes on the specific energy consumption and fiber properties of TMP pulp. The second was to use interactive "knowledge discovery in databases" software to find enzymes or other additives that could help reduce the energy consumption of the TMP process. The chemical composition of wood, and the enzymes that are active on the main wood components, are presented in the literature part of the work. The results of previous research on energy reduction in the TMP process with enzymes are also highlighted, and the main principles of knowledge discovery are included in the literature part as well. The experimental part of the work describes the methods, in which standard-size chips, crushed chips, and fiberized spruce chips (fiberized pulp) were used. Different types of enzymatic treatment, with different dosages and treatment times, were tested during the experiments. Pectinase, endoglucanase, and a mixture of enzymes were used to evaluate the reliability of the methods. The fines content and fiber length of the pulp were measured and used as evidence of the enzymes' effect. The refining method with the "Bauer" laboratory disc refiner was evaluated as not highly reliable: it could not provide high repeatability of results because of uncontrolled feeding capacity and refining consistency. The refining method with the Valley refiner did not have many variables and showed stable, repeatable results in energy saving. The experiments showed that efficient enzyme impregnation is probably the main requirement when applying enzymes for energy saving. During the work, the fiberized pulp showed high accessibility to enzymatic treatment and liquid penetration without special impregnating equipment, because fiberized pulp has a larger wood surface area and thereby a larger contact area between the enzymatic solution and the wood.
Treatment of standard-size chips and crushed chips without special impregnation of the enzymatic solution was evaluated as not efficient and did not show visible, repeatable decreases in energy consumption. It was therefore concluded that using fiberized pulp and the Valley refiner to measure the enzymes' effectiveness in decreasing specific energy consumption (SEC) is more suitable than using normal-size chips and crushed chips with the "Bauer" refiner. Endoglucanase at a 5 kg/t dosage showed about a 20% decrease in energy consumption. The mixture of enzymes at a 1.5 kg/t dosage showed about a 15% decrease in energy consumption during refining. Pectinase at different dosages and treatment times did not show a significant effect on energy consumption. The results of knowledge discovery in databases indicated that a blend of xylanase, cellulase, and pectinase is the most promising for energy reduction in the TMP process. Surfactants were determined to be effective additives for energy saving with enzymes.
Resumo:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Resumo:
This article aims to reflect on the qualitative analysis of the modification and acquisition of new knowledge, and to identify the contribution of computer use in an intervention activity. Content analysis was used: categories and units of analysis were created, and from the frames containing the analysis of the categories, four groups were randomly chosen for evaluation. From these frames of analysis and the evaluation of the frequency of the records, a metatext was structured. The result of the analysis showed the students' difficulty in making their records and in expressing their interpretations and understandings of their mathematical activities using the English language. It was noticed that the activity performed with the software GeoGebra, based on the constructionist view, made it possible for the students to identify the modification and acquisition of new concepts.
Resumo:
The algorithms and the graphical user interface software package "OPT-PROx" were developed to meet food engineering needs related to the simulation and optimization of canned food thermal processing. The adaptive random search algorithm and its modification coupled with a penalty function approach, together with finite difference methods with cubic spline approximation, are utilized by the "OPT-PROx" package (http://tomakechoice.com/optprox/index.html). A diversity of thermal food processing optimization problems with different objectives and required constraints is solvable by the developed software. The geometries supported by "OPT-PROx" are the following: (1) cylinder, (2) rectangle, (3) sphere. The mean square error minimization principle is utilized to estimate the heat transfer coefficient of the food to be heated under optimal conditions. The user-friendly dialogue and the numerical procedures employed make the "OPT-PROx" software useful to food scientists in research and education, as well as to engineers involved in the optimization of thermal food processing.
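The general idea of an adaptive random search coupled with a penalty function can be sketched in a few lines; this is a minimal illustration of the technique, not the OPT-PROx implementation, and the toy objective and constraint are invented:

```python
import random

def penalized(f, constraints, x, mu=1e3):
    """Objective plus a quadratic penalty for violated constraints g(x) <= 0."""
    pen = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return f(x) + mu * pen

def adaptive_random_search(f, constraints, x0, step=1.0, iters=2000, seed=0):
    """Minimal adaptive random search: expand the step on success,
    contract it on failure."""
    rng = random.Random(seed)
    x, best = list(x0), penalized(f, constraints, x0)
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        val = penalized(f, constraints, cand)
        if val < best:
            x, best = cand, val
            step *= 1.2                      # expand on success
        else:
            step = max(step * 0.95, 1e-6)    # contract on failure
    return x, best

# Toy problem: minimize (x - 3)^2 subject to x <= 2, so the constrained
# optimum sits at the boundary x = 2.
sol, val = adaptive_random_search(lambda x: (x[0] - 3) ** 2,
                                  [lambda x: x[0] - 2], [0.0])
```

The penalty weight `mu` trades off constraint satisfaction against objective value; a production solver would typically increase it over successive outer iterations.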
Resumo:
This thesis comprises two main axes. The first concerns wood treatments whose principal aim is to reduce dimensional variations and improve resistance to attack by wood-decaying fungi. The second concerns the environmental aspect of the citric acid-glycerol treatment; its main goal is to demonstrate that extending the service life of the treated paneling product compensates for the environmental impacts caused by the treatment. In the first axis, two treatments were applied to two pine species (Pinus strobus L. and Pinus contorta D.): a maleic anhydride treatment and a treatment with a crude citric acid-glycerol (AC-G) solution. In the first case, the effects of two parameters (drying time and esterification temperature) on the results of dimensional stability, fungal decay resistance, and accelerated-aging tests were evaluated. Three drying times after impregnation (12 h, 18 h and 24 h) and three esterification temperatures (140 °C, 160 °C and 180 °C) were considered. In the second case, after the best catalyst (HCl) and the best citric acid-glycerol ratio (3/1) were identified in preliminary tests, the performance of this treatment with respect to dimensional stability, resistance to fungal decay, surface hardness, and adhesion of paint coating layers to the wood substrate surface was analyzed. The results obtained were supported by a series of qualitative and quantitative analyses for better understanding and explanation. The qualitative analyses were (i) Fourier-transform infrared spectroscopy (FTIR) and (ii) scanning electron microscopy (SEM), while the quantitative analysis, mass loss, was performed by weighing.
In the second axis, an analysis of the environmental impacts of the AC-G treatment was carried out with the SimaPro v8 software. The Ecoinvent v3 database and the Impact 2002+ impact assessment method were used in this part of the thesis. Based on the results of the second treatment (AC-G) and on work available in the literature, we estimated a service life for the treated paneling. The various service-life scenarios established for the treated paneling, compared with the service life offered by industry today, allow us to model the environmental impacts of the treatment. To this end, life cycle assessment (LCA) was used as a design tool. In conclusion, the drying-time and esterification-temperature parameters influence the results obtained for the maleic anhydride wood treatment. The combination of 24 h of drying and a 180 °C esterification temperature gave the best results for dimensional stability, resistance to fungal decay, and accelerated aging. The AC-G treatment improves the dimensional stability, fungal decay resistance, and surface hardness of the samples; however, it reduces the adhesion of paint layers. The environmental impacts produced by the AC-G treatment are mainly linked to the consumption of energy (electricity). The treatment extends the service life of the treated paneling, and it was shown that the service-life scenario under which the treated paneling can present itself as a product with a lower environmental impact than untreated paneling is a service life of 55 years.
Resumo:
Bulk electronic waste plastics were recycled and reduced in size into plastic chips before pulverization or cryogenic grinding into powders. Two major types of electronic waste plastics were used in this investigation: acrylonitrile butadiene styrene (ABS) and high impact polystyrene (HIPS). This research investigation utilized two approaches for incorporating electronic waste plastics into asphalt pavement materials. The first approach was blending and integrating recycled and processed electronic waste powders directly into asphalt mixtures and binders; the second approach was to chemically treat recycled and processed electronic waste powders with hydro-peroxide before blending them into asphalt mixtures and binders. The chemical treatment of electronic waste (e-waste) powders was intended to strengthen molecular bonding between e-waste plastics and asphalt binders for improved low- and high-temperature performance. Superpave asphalt binder and mixture testing techniques were conducted to determine the rheological and mechanical performance of the e-waste modified asphalt binders and mixtures. This investigation included a limited emissions-performance assessment that compared the emissions of electronic waste modified asphalt pavement mixtures using SimaPro and their performance using MEPDG software. Carbon dioxide emissions for e-waste modified pavement mixtures were compared with conventional asphalt pavement mixtures using SimaPro. MEPDG analysis was used to determine rutting potential for the various e-waste modified pavement mixtures and the control asphalt mixture. The results from this investigation showed the following: treating the electronic waste plastics delayed the onset of tertiary flow for electronic waste mixtures; electronic waste mixtures showed some improvement in dynamic modulus results at low temperatures versus the control mixture; and tensile strength ratio values for treated e-waste asphalt mixtures were improved versus the control mixture.
Resumo:
This article aimed to compare the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for this study. Incisor, canine, premolar, first molar and second molar regions were selected. Cone beam computed tomography (CBCT) images were obtained with i-CAT Next Generation. Linear bone measurements were performed by one observer on the cross-sectional images using three different software packages: XoranCat®, OnDemand3D® and KDIS3D®, all able to assess DICOM images. In addition, 25% of the sample was reevaluated for the purpose of reproducibility. The mandibles were sectioned to obtain the gold standard for each region. Intraclass coefficients (ICC) were calculated to examine the agreement between the two periods of evaluation; one-way analysis of variance with the post-hoc Dunnett test was used to compare each of the software-derived measurements with the gold standard. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the greatest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
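The bias statistic the abstract reports (mean signed difference from the gold standard) is simple to compute; the measurements below are invented for illustration and are not the study's data:

```python
# Hypothetical linear measurements (mm) at the same anatomical sites:
# a gold-standard physical measurement and two software-derived series.
gold      = [10.2, 11.5,  9.8, 12.0, 10.9]
package_a = [10.1, 11.4,  9.7, 11.8, 10.8]   # slight underestimation
package_b = [10.5, 11.8, 10.0, 12.3, 11.1]   # slight overestimation

def mean_bias(measured, reference):
    """Mean signed difference from the gold standard; negative values
    indicate systematic underestimation."""
    diffs = [m - r for m, r in zip(measured, reference)]
    return sum(diffs) / len(diffs)

bias_a = mean_bias(package_a, gold)
bias_b = mean_bias(package_b, gold)
```

A bias near zero, as reported for all three packages, is necessary but not sufficient for agreement; that is why the study also checks reproducibility (ICC) and tests the differences statistically.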
Resumo:
Purpose: Use of lipid nanoemulsions as carriers of drugs for therapeutic or diagnostic purposes has been increasingly studied. Here, it was tested whether modifications of core particle constitution could affect the characteristics and biologic properties of lipid nanoemulsions. Methods: Three nanoemulsions were prepared using cholesteryl oleate, cholesteryl stearate, or cholesteryl linoleate as main core constituents. Particle size, stability, pH, peroxidation of the nanoemulsions, and cell survival and uptake by different cell lines were evaluated. Results: Cholesteryl stearate nanoemulsions had the greatest particle size, and all three nanoemulsions were stable during the 237-day observation period. The pH of the three nanoemulsion preparations tended to decrease over time, but the decrease in pH of cholesteryl stearate was smaller than that of cholesteryl oleate and cholesteryl linoleate. Lipoperoxidation was greater in cholesteryl linoleate than in cholesteryl oleate and cholesteryl stearate. After four hours' incubation of human umbilical vein endothelial cells (HUVEC) with the nanoemulsions, peroxidation was minimal in the presence of cholesteryl oleate and more pronounced with cholesteryl linoleate and cholesteryl stearate. In contrast, macrophage incubates showed the highest peroxidation rates with cholesteryl oleate. Cholesteryl linoleate induced the highest cell peroxidation rates, except in macrophages. Uptake of the cholesteryl oleate nanoemulsion by HUVEC and fibroblasts was greater than that of cholesteryl linoleate and cholesteryl stearate. Uptake of the three nanoemulsions by monocytes was equal. Uptake of cholesteryl oleate and cholesteryl linoleate by macrophages was negligible, but macrophage uptake of cholesteryl stearate was higher. In H292 tumor cells, cholesteryl oleate showed the highest uptake. HUVEC showed higher survival rates when incubated with cholesteryl stearate and lower survival with cholesteryl linoleate.
H292 survival was greater with cholesteryl stearate. Conclusion: Although all three nanoemulsion types were stable for a long period, considerable differences were observed in size, oxidation status, and cell survival and nanoemulsion uptake in all tested cell lines. Those differences may be helpful in protocol planning and interpretation of data from experiments with lipid nanoemulsions.
Resumo:
Transanal endorectal pull-through (TAEPT) surgery is primarily performed for rectosigmoid aganglionosis, generally with excellent results. There is evidence that overstretching the anus and tension traction on the sigmoid during the procedure could impair the final continence of the patient. Many researchers suggest the use of small umbilical or laparoscopic access to aid in colon mobilization, thus preventing excessive handling within the anal canal. We assumed that transabdominal mobilization of the sigmoid could be avoided by utilizing the NOTES (natural orifice transluminal endoscopic surgery) technique. We performed a TAEPT with NOTES access to the sigmoid vascular pedicle, keeping the surgery exclusively transanal, which prevented scars in the abdomen and minimized the stretching of perineal structures.
Resumo:
This paper presents SMarty, a variability management approach for UML-based software product lines (PL). SMarty is supported by a UML profile, the SMartyProfile, and a process for managing variabilities, the SMartyProcess. SMartyProfile aims at representing variabilities, variation points, and variants in UML models by applying a set of stereotypes. SMartyProcess consists of a set of activities that are systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using SEI's Arcade Game Maker PL. An evaluation of SMarty and a discussion of related work are also presented.
Resumo:
The PHENIX experiment at the Relativistic Heavy Ion Collider has performed systematic measurements of phi meson production in the K(+)K(-) decay channel at midrapidity in p + p, d + Au, Cu + Cu, and Au + Au collisions at root s(NN) = 200 GeV. Results are presented on the phi invariant yield and the nuclear modification factor R(AA) for Au + Au and Cu + Cu, and R(dA) for d + Au collisions, studied as a function of transverse momentum (1 < p(T) < 7 GeV/c) and centrality. In central and midcentral Au + Au collisions, the R(AA) of the phi exhibits a suppression relative to expectations from binary-scaled p + p results. The amount of suppression is smaller than that of the pi(0) and the eta in the intermediate p(T) range (2-5 GeV/c), whereas at higher p(T) the phi, pi(0), and eta show similar suppression. The baryon (proton and antiproton) excess observed in central Au + Au collisions at intermediate p(T) is not observed for the phi meson despite the similar masses of the proton and the phi. This suggests that the excess is linked to the number of valence quarks in the hadron rather than its mass. The difference gradually disappears with decreasing centrality, and, for peripheral collisions, the R(AA) values for both particle species are consistent with binary scaling. Cu + Cu collisions show the same yield and suppression as Au + Au collisions at the same N(part). The R(dA) of the phi shows no evidence for cold nuclear effects within uncertainties.
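The nuclear modification factor used throughout this abstract is the measured A+A yield divided by the binary-collision-scaled p + p yield; a minimal sketch follows, with purely illustrative numbers (not PHENIX data):

```python
def r_aa(yield_aa, n_coll, yield_pp):
    """Nuclear modification factor R_AA = yield_AA / (N_coll * yield_pp).
    R_AA = 1 means the A+A yield follows binary scaling of p+p;
    R_AA < 1 means suppression relative to that expectation."""
    return yield_aa / (n_coll * yield_pp)

# Illustrative inputs: a per-event phi yield in central Au+Au at some p_T,
# a Glauber-model number of binary collisions, and the p+p yield.
print(r_aa(yield_aa=2.0e-4, n_coll=1000.0, yield_pp=5.0e-7))  # -> 0.4
```

A value of 0.4 in this toy example would indicate a factor-of-2.5 suppression relative to binary scaling, the kind of effect the abstract describes for the phi at intermediate p(T).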