Abstract:
Oxidized starch is a key component in the paper industry, where it is used both as a surface sizing agent and as a filler. Large quantities are used annually for this purpose; however, the common oxidation methods are not environmentally friendly. In our research, we have studied the possibility of replacing the harmful oxidation agents, such as hypochlorite or iodates and transition metal catalysts, with a more environmentally friendly oxidant, hydrogen peroxide (H2O2), and a special metal complex catalyst (FePcS), of which only a small amount is needed. The work comprised batch and semi-batch oxidation studies with H2O2, ultrasound treatment of starch particles, determination of low-molecular-weight by-products, and determination of the decomposition kinetics of H2O2 in the presence of starch and the catalyst. This resulted in a waste-free oxidation method, which produces only water and oxygen as side products. Starch oxidation was studied in both semi-batch and batch modes with respect to the addition of the oxidant (H2O2). The semi-batch mode proved to yield a degree of substitution (COOH groups) sufficient for industrial purposes. Treatment of starch granules with ultrasound was found to improve the reactivity of starch. The kinetic results followed a rather complex pattern: several oxidation phases were observed, apparently because the oxidation reaction initially took place only on the granule surface, whereas after a prolonged reaction time, partial degradation of the solid starch granules allowed further reaction in the interior. Batch-mode experiments enabled a more detailed study of the oxidation mechanisms of starch in the presence of H2O2 and the catalyst, but yielded less oxidized starch owing to the rapid decomposition of H2O2 at the high concentrations used. The effect of the solid-liquid (S/L) ratio in the reaction system was studied in batch experiments. These studies revealed that the presence of the catalyst and the starch enhances the decomposition of H2O2.
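The abstract reports multi-phase decomposition kinetics without giving a rate law. As a minimal illustration of how such kinetics are typically quantified, the Python sketch below fits a pseudo-first-order rate constant to H2O2 concentration data; the values and the first-order assumption are hypothetical, not taken from the study, and a single fit of this kind would describe only one of the observed oxidation phases.

```python
import numpy as np

# Hypothetical H2O2 concentrations (mol/L) measured over time (min);
# illustrative values only, not data from the study.
t = np.array([0.0, 10.0, 20.0, 40.0, 60.0, 90.0])
c = np.array([0.50, 0.41, 0.34, 0.23, 0.16, 0.09])

# Assuming pseudo-first-order decomposition (2 H2O2 -> 2 H2O + O2):
#   ln c(t) = ln c0 - k*t,
# so k is the negative of the slope of ln(c) versus t.
slope, intercept = np.polyfit(t, np.log(c), 1)
print(f"k = {-slope:.4f} 1/min, c0 = {np.exp(intercept):.3f} mol/L")
```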
Abstract:
Short delivery times give companies a competitive advantage in a rapidly changing industrial environment. The primary objective of this Master's thesis is to identify, from the literature, a method suitable for the company's use that supports systematic reduction of lead times. It is also important to verify the suitability of the chosen method for the target company's environment. The second objective of the work is to understand what kind of effort is required to achieve a one-day lead time. A suitable operating model was selected on the basis of a literature review. The method was tested on one production line, and the results and feedback indicate that it appears suitable for use in the target company. An action plan has been drawn up for the production line to achieve a one-day delivery time during 2013. The scope of the challenge across the entire target company was examined in a separate ideation session. Based on the results of that session, a priority list was compiled that gives an idea of what a significant reduction in delivery time would require. Overall, managing demand variability is the greatest challenge, but several alternative solutions for managing it have been identified.
Abstract:
The purpose of this study is to develop a crowdsourced videographic research method for consumer culture research. Videography provides opportunities for expressing contextual and culturally embedded relations, so developing new ways to conduct videographic research is meaningful. This study develops the crowdsourced videographic method based on a literature review and the evaluation of a focal study. The literature review follows a qualitative systematic review process. Through the literature review, which draws on methodological, crowdsourcing and consumer research literature, this study defines the method, its application process and its evaluation criteria. Furthermore, the evaluation of the focal study, in which the method was applied, completes the study. This study applies professional review with self-evaluation as the form of evaluation, drawing on secondary data including the research task description, screenshots of the mobile application used in the focal study, videos collected from the participants, and a self-evaluation by the author. The focal study is analyzed according to its suitability for consumer culture research, its research process and its quality. Definitions and descriptions of the research method, its process and its quality criteria form the theoretical contribution of this study. Evaluating the focal study against these definitions highlights some best practices of this type of research, forming the practical contribution of this study. Finally, this study provides ideas for future research. First, defining the boundaries of the use of crowdsourcing in various parts of conducting research. Second, improving the method by applying it to new research contexts. Third, testing how changes in one dimension of the crowdsourcing models interact with the other dimensions. Fourth, comparing the quality criteria applied in this study to various other quality criteria to improve the method's usefulness. Overall, this study represents a starting point for further development of the crowdsourced videographic research method.
Abstract:
When modeling machines in their natural working environment, collisions become a very important feature in terms of simulation accuracy. By expanding the simulation to include the operating environment, the need for a general collision model able to handle a wide variety of cases has become central to the development of simulation environments. With the addition of the operating environment, the challenges for the collision modeling method also change: more simultaneous contacts with more objects occur in more complicated situations, which makes the real-time requirement more difficult to meet. Common problems in current collision modeling methods include, for example, dependency on geometry shape or mesh density, computational cost that increases exponentially with the number of contacts, the lack of a proper friction model, and failures in certain configurations such as closed kinematic loops. All these problems mean that current modeling methods will fail in certain situations. A method that never fails in any situation is not very realistic, but improvements can be made over the current methods.
Abstract:
The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t test. The average protein concentration and standard deviation of tears collected with microcapillary tubes were 4.45mg/mL ±0.35 and 4.52mg/mL ±0.29 for right and left eyes, respectively. The average protein concentration and standard deviation of tears collected with STT strips were 54.5mg/mL ±0.63 and 54.15mg/mL ±0.65 for right and left eyes, respectively. Statistically significant differences (p<0.001) were found between the methods. Under the conditions of this study, the average protein concentration obtained with the Bradford test from tear samples collected by STT strip was higher than that obtained with microcapillary tubes. Reference values for tear protein concentration should therefore be interpreted according to the method used to collect the tear samples.
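Since both sampling methods were applied to the same eyes, the comparison lends itself to a paired Student's t test; the abstract states only that Student's t test was used. Below is a minimal sketch with made-up stand-in values (the study's 58 raw measurements are not reproduced here).

```python
import numpy as np
from scipy import stats

# Hypothetical protein concentrations (mg/mL); stand-in values,
# not the study's raw measurements.
microcap = np.array([4.1, 4.5, 4.8, 4.3, 4.6])   # microcapillary tube
stt = np.array([53.9, 54.7, 55.1, 54.0, 54.6])   # STT strip, same eyes

# Paired test, assuming both methods sampled the same eyes.
t_stat, p_value = stats.ttest_rel(microcap, stt)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```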
Abstract:
The electrocardiographic (ECG) QT interval is influenced by fluctuations in heart rate (HR), which may lead to misinterpretation of its length. Considering that alterations in QT interval length reflect abnormalities of ventricular repolarisation which predispose to the occurrence of arrhythmias, this variable must be properly evaluated. The aim of this work is to determine which method of correcting the QT interval is the most appropriate for dogs across different ranges of normal HR (different breeds). Healthy adult dogs (n=130; German Shepherd, Boxer, Pit Bull Terrier, and Poodle) were submitted to ECG examination, and QT intervals were determined in triplicate from the bipolar limb lead II and corrected for the effects of HR through the application of three published formulae involving quadratic, cubic or linear regression. The mean corrected QT values (QTc) obtained using the different formulae were significantly different (p<0.05), while those derived from the equation QTcV = QT + 0.087(1 - RR) were the most consistent (linear regression). QTcV values were strongly correlated (r=0.83) with the QT interval and showed a coefficient of variation of 8.37% and a 95% confidence interval of 0.22-0.23 s. Owing to its simplicity and reliability, QTcV was considered the most appropriate correction of the QT interval in dogs.
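Only the linear formula, QTcV = QT + 0.087(1 - RR), is quoted in the abstract. The sketch below applies it in Python, alongside the Bazett (square-root) and Fridericia (cube-root) corrections as plausible stand-ins for the unnamed quadratic and cubic formulae; all intervals are in seconds.

```python
def qtc_linear(qt, rr):
    """Linear correction quoted in the abstract: QTcV = QT + 0.087(1 - RR)."""
    return qt + 0.087 * (1.0 - rr)

def qtc_bazett(qt, rr):
    """Bazett correction; assumed stand-in for the quadratic formula."""
    return qt / rr ** 0.5

def qtc_fridericia(qt, rr):
    """Fridericia correction; assumed stand-in for the cubic formula."""
    return qt / rr ** (1.0 / 3.0)

# Example: QT = 0.21 s at HR = 120 bpm, so RR = 60/120 = 0.5 s.
qt, rr = 0.21, 0.5
print(qtc_linear(qt, rr), qtc_bazett(qt, rr), qtc_fridericia(qt, rr))
```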
Abstract:
This study aims at standardizing the pre-incubation and incubation pH and temperature used in the metachromatic staining method for myofibrillar ATPase activity of myosin (mATPase) in asses and mules. Twenty-four donkeys and 10 mules (seven females and three males) were used in the study. From each animal, fragments of the Gluteus medius muscle were collected by percutaneous muscle biopsy using a 6.0-mm Bergström-type needle. In addition to the metachromatic staining method for mATPase, the nicotinamide adenine dinucleotide tetrazolium reductase (NADH-TR) technique was also performed to confirm the histochemical data. The histochemical result of mATPase with acidic pre-incubation (pH=4.50) and alkaline incubation (pH=10.50), at a temperature of 37ºC, yielded the best differentiation of fibers stained with toluidine blue. Muscle fibers were identified according to the following colors: type I (oxidative, light blue), type IIA (oxidative-glycolytic, intermediate blue) and type IIX (glycolytic, dark blue). There are no reports in the literature regarding the characterization and distribution of the different types of muscle fibers used by donkeys and mules when performing traction work, cargo transportation, endurance sports (horseback riding) and marching competitions. Therefore, this study is the first report on the standardization of the mATPase technique for donkeys and mules.
Abstract:
Fifty-five bursae of Fabricius (BF) were evaluated by optical microscopy by three different avian histopathologists (H1, H3 and H4) to determine the degree of lymphoid depletion. One histologist evaluated the same slides at two different times (H1 and H2), with a four-month interval between the observations. The same BFs were evaluated using the Digital Lymphocyte Depletion Evaluation (ADDL) system, performed by three different operators of the system who were not histopathologists. The results showed a significant difference between the histopathologists, as well as between the scores established by the same expert (H1 and H2). However, there were no significant differences between the scores obtained using the ADDL system. The results make clear the fragility of the subjective lymphocyte depletion scoring of the traditional histologic method, while the ADDL system proves to be more appropriate for the assessment of lymphoid loss in the BF.
Abstract:
Platelet-rich plasma (PRP) is easy and inexpensive to obtain, and stands out for its growth factors in tissue repair. To obtain PRP, whole blood is centrifuged for specific times and at specific gravitational forces. The present work therefore studied a double centrifugation method for obtaining PRP, in order to evaluate the effective increase of platelet concentration in the final product, the preparation of PRP gel, and the optimization of the preparation time of the final sample. Fifteen female White New Zealand rabbits underwent blood sampling for the preparation of PRP. Samples were separated into two sterile tubes containing sodium citrate. The tubes were submitted to the first centrifugation, with the lid closed, at 1600 revolutions per minute (rpm) for 10 minutes, resulting in the separation of red blood cells from plasma containing platelets and leukocytes. The tubes were then opened and the plasma was pipetted and transferred into another sterile tube. The plasma was centrifuged again at 2000rpm for 10 minutes; as a result it was split into two parts: platelet-poor plasma (PPP) on top and the platelet button at the bottom. Part of the PPP was discarded so that only 1mL remained in the tube along with the platelet button. This material was gently agitated to resuspend the platelets and activated by the addition of 0.3mL of calcium gluconate, resulting in PRP gel. The double centrifugation protocol raised the platelet concentration to 3 times that of the initial blood sample. The 0.3mL of calcium gluconate used for platelet activation was sufficient to coagulate the sample. Coagulation time ranged from 8 to 20 minutes, with an average of 17.6 minutes. Altogether, the procedure from blood centrifugation to the final PRP gel took only 40 minutes. It was concluded that PRP was successfully obtained by the double centrifugation protocol, which increases the platelet concentration of the sample compared with whole blood, allowing its use in surgical procedures. Furthermore, the preparation time of only 40 minutes is appropriate, and calcium gluconate is able to promote platelet activation.
Abstract:
By coupling the Boundary Element Method (BEM) and the Finite Element Method (FEM), an algorithm is developed that combines the advantages of both numerical processes. The main aim of the work is the time-domain analysis of general three-dimensional wave propagation problems in elastic media. In addition, mathematical and numerical aspects of the related BE, FE and BE/FE formulations are discussed. The coupling algorithm allows investigation of elastodynamic problems with a BE subdomain and an FE subdomain. In order to assess the performance of the coupling algorithm, two problems are solved and their results compared to other numerical solutions.
Abstract:
The demand for more efficient manufacturing processes has been increasing in the last few years. The cold forging process is presented as a possible solution, because it allows the production of parts with a good surface finish and good mechanical properties. Nevertheless, the design of cold forming sequences is very empirical and is based on the designer's experience. Computational modeling of each forming stage by the finite element method can make sequence design faster and more efficient, decreasing the use of conventional "trial and error" methods. In this study, a commercial general-purpose finite element package - ANSYS - has been applied to model a forming operation. Models have been developed to simulate the ring compression test and a basic forming operation (upsetting) that appears in most cold forging sequences. The simulated upsetting operation is one stage of the manufacturing process of automotive starter parts. Experiments were carried out to obtain the stress-strain curve of the material, the material flow during the simulated stage, and the required forming force; their results were used as numerical model input data and for validation of the model results. The comparison between experimental and numerical results confirms the potential of the developed methodology for die-filling prediction.
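As an aside on how a stress-strain curve is commonly extracted from such compression experiments, the sketch below converts upsetting force-height data to true stress and true strain under the usual volume-constancy assumption, neglecting die friction; the geometry and loads are illustrative, not the study's measurements.

```python
import numpy as np

# Hypothetical upsetting data; illustrative values, not from the study.
h0, d0 = 15.0, 10.0                 # initial height and diameter, mm
A0 = np.pi * d0**2 / 4.0            # initial cross-section, mm^2
h = np.array([15.0, 14.0, 13.0, 12.0, 11.0])        # current heights, mm
F = np.array([0.0, 28.0, 41.0, 55.0, 72.0]) * 1e3   # forces, N

# Volume constancy (A*h = A0*h0) gives the current cross-section;
# friction at the dies is neglected in this simple sketch.
A = A0 * h0 / h
true_strain = np.log(h0 / h)        # compressive true strain
true_stress = F / A                 # N/mm^2 = MPa

for e, s in zip(true_strain, true_stress):
    print(f"strain {e:.3f} -> stress {s:6.1f} MPa")
```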
Abstract:
The determination of the intersection curve between Bézier surfaces may be seen as the composition of two separate problems: determining initial points and tracing the intersection curve from these points. A Bézier surface is represented by a parametric function (a polynomial in two variables) that maps a point of the two-dimensional parametric space into three-dimensional space. In this article, an algorithm is proposed to determine the initial points of the intersection curve of Bézier surfaces, based on the solution of polynomial systems with the Projected Polyhedral Method, followed by a method for tracing the intersection curves (a marching method with differential equations). In order to allow the use of the Projected Polyhedral Method, the equations of the system must be represented in the Bernstein basis; toward this goal, a robust and reliable algorithm is proposed to exactly transform a multivariate polynomial from the power basis to the Bernstein basis.
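The article's transformation handles multivariate polynomials; the underlying univariate identity is t^k = sum_{j>=k} [C(j,k)/C(n,k)] B_{j,n}(t). Below is a minimal one-variable sketch of that identity, using exact rational arithmetic to mirror the "exact" requirement; it illustrates the standard identity, not necessarily the article's algorithm.

```python
from fractions import Fraction
from math import comb

def power_to_bernstein(a):
    """Convert power-basis coefficients a[0..n] of p(t) = sum a[k]*t^k
    into Bernstein coefficients b[0..n] such that
    p(t) = sum b[j] * C(n,j) * t^j * (1-t)^(n-j),
    via b[j] = sum_{k<=j} C(j,k)/C(n,k) * a[k]."""
    n = len(a) - 1
    return [sum(Fraction(comb(j, k), comb(n, k)) * a[k] for k in range(j + 1))
            for j in range(n + 1)]

# Example: p(t) = 1 - 2t + 3t^2  ->  Bernstein coefficients [1, 0, 2].
print(power_to_bernstein([1, -2, 3]))
```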