952 results for Simplified procedure
Abstract:
The search for new renewable materials has intensified in recent years. Pulp and paper mill process streams contain a number of potential compounds that could be used in biofuel production and as raw materials in the chemical, food, and pharmaceutical industries. Prior to utilization, these compounds require separation from the other compounds present in the process stream. One feasible separation technique is membrane filtration, but fouling still limits, to some extent, its implementation in pulp and paper mill applications. To mitigate fouling and its effects, foulants and their fouling mechanisms need to be well understood. This thesis evaluates fouling in the filtration of pulp and paper mill process streams by means of polysaccharide model-substance filtrations and by developing a procedure to analyze and identify potential foulants, i.e. wood extractives and carbohydrates, from fouled membranes. The model-solution filtration results demonstrate that each polysaccharide has its own fouling mechanism, which also depends on the membrane characteristics. Polysaccharides may foul the membranes by adsorption and/or by gel/cake layer formation on the membrane surface. Moreover, the polysaccharides interact, which makes fouling evaluation of certain compound groups very challenging. Novel methods to identify wood extractive and polysaccharide foulants are developed in this thesis. The results show that it is possible to extract and identify wood extractives from membranes fouled in the filtration of pulp and paper mill streams. The most effective solvent was found to be acetone:water (9:1 v/v) because it extracted both lipophilic extractives and lignans in high amounts from the fouled membranes and was also non-destructive to the membrane materials. One hour of extraction was enough to recover wood extractives in high amounts from membrane samples with an area of 0.008 m².
If only qualitative knowledge of the wood extractives is needed, a simplified extraction procedure can be used. Adsorption was the main fouling mechanism in extractives-induced fouling, and dissolved fatty and resin acids were the main cause of the fouling; colloidal fouling was negligible. Both process water and membrane characteristics affected extractives-induced fouling. In general, the more hydrophilic regenerated cellulose (RC) membrane fouled less than the more hydrophobic polyethersulfone (PES) and polyamide (PA) membranes, independent of the process water used. Monosaccharide and uronic acid units could also be identified from the fouled synthetic polymeric membranes. It was impossible to analyze all monosaccharide units from the RC membrane because the analysis result obtained contained degraded membrane material. One of the fouling mechanisms of carbohydrates was adsorption. Carbohydrates were not potential adsorptive foulants to the same extent as wood extractives, because their amount in the fouled membranes was found to be significantly lower than the amount of wood extractives.
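Fouling mechanisms such as adsorption-driven pore blocking versus cake-layer formation are commonly diagnosed by fitting classical blocking laws (e.g. Hermia's models) to constant-pressure flux-decline data. The sketch below illustrates the idea with invented flux values; it is a generic diagnostic, not the specific analysis procedure developed in the thesis.

```python
import numpy as np

# Hypothetical flux-decline data (J in L m-2 h-1) versus time t (h);
# in practice these would come from a constant-pressure filtration run.
t = np.array([0.25, 0.5, 1.0, 2.0, 3.0, 4.0])
J = np.array([95.0, 88.0, 76.0, 60.0, 50.0, 43.0])

def r_squared(y, y_fit):
    ss_res = np.sum((y - y_fit) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Complete blocking (adsorption-like): ln J = ln J0 - k*t, linear in t
slope_b, icept_b = np.polyfit(t, np.log(J), 1)
r2_blocking = r_squared(np.log(J), slope_b * t + icept_b)

# Cake filtration: 1/J^2 = 1/J0^2 + k*t, linear in t
slope_c, icept_c = np.polyfit(t, 1.0 / J**2, 1)
r2_cake = r_squared(1.0 / J**2, slope_c * t + icept_c)

best = "complete blocking" if r2_blocking > r2_cake else "cake filtration"
print(f"R2 blocking={r2_blocking:.3f}, cake={r2_cake:.3f} -> {best}")
```

The model whose linearized form fits the data best suggests the dominant mechanism; real studies usually compare all four blocking laws rather than only these two.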
Abstract:
The numerous methods for calculating potential or reference evapotranspiration (ETP or ETo) almost always do so over a 24-hour period, including values of climatic parameters from the nocturnal period (daily averages). These nocturnal values have practically no effect on transpiration, which constitutes the main evaporative-demand process in localized irrigation. The aim of the present manuscript was to develop a rather simplified model for calculating diurnal daily ETo: an alternative approach based on the theoretical background of the Penman method that does not require values of aerodynamic conductance for the latent and sensible heat fluxes, nor data on wind speed or relative humidity of the air. The comparison between diurnal ETo values measured in high-precision weighing lysimeters and those estimated by either the Penman-Monteith method or the Simplified-Penman approach under study points to a fairly consistent agreement among the potential-demand calculation criteria. The Simplified-Penman approach was a feasible alternative for estimating ETo under the local meteorological conditions of the two field trials. Given the availability of the required input data, the method could also be employed in other climatic regions for irrigation scheduling.
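For context, one standard daily form of the Penman-Monteith method used as the comparison benchmark is the FAO-56 formulation, sketched below. The input values in the usage line are illustrative only, not data from the trials described above.

```python
import math

def fao56_penman_monteith(Rn, G, T, u2, es, ea, gamma=0.0665):
    """Daily reference evapotranspiration ETo (mm day-1), FAO-56 form.

    Rn: net radiation, G: soil heat flux (both MJ m-2 day-1);
    T: mean air temperature (deg C); u2: wind speed at 2 m (m s-1);
    es, ea: saturation and actual vapour pressure (kPa);
    gamma: psychrometric constant (kPa degC-1).
    """
    # Slope of the saturation vapour pressure curve (kPa per deg C)
    delta = 4098.0 * (0.6108 * math.exp(17.27 * T / (T + 237.3))) / (T + 237.3) ** 2
    num = 0.408 * delta * (Rn - G) + gamma * (900.0 / (T + 273.0)) * u2 * (es - ea)
    den = delta + gamma * (1.0 + 0.34 * u2)
    return num / den

# Illustrative mid-latitude summer day (assumed values)
eto = fao56_penman_monteith(Rn=15.0, G=0.0, T=25.0, u2=2.0, es=3.168, ea=1.5)
print(f"ETo = {eto:.2f} mm/day")
```

The simplified approach in the abstract avoids exactly the wind-speed and humidity terms (u2, es, ea) that appear in this full formulation.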
Abstract:
The aim of this study is to develop a suitable project control procedure for a target company that can be used in engineering, procurement, and construction or construction management contracts. The procedure comprises suitable project control software and a model for using that software in practice. The study is divided into two main sections. The theoretical part deals with project management, focusing on the cost and time dimensions of projects. The empirical part deals with the development of the project control procedure for the target company, which took place in two stages. In the first stage, semi-structured interviews were used to elicit the employees' demands and desires for the project control software to be used in the developed procedure. These demands and desires were compared against the software available on the market, and the most suitable option was chosen. The interview results show that the important factors are cost tracking, integration with other software, availability in English, references, a helpdesk, and no need for regular updates. The most suitable option was the CMPro5 cost control software. The chosen software was used in a pilot project, where its functions and use were analyzed, and the project control procedure to be used in the future was developed on this basis. The developed procedure consists of five steps and includes the employment of a cost engineer, whose task is to maintain the procedure in the target company.
Abstract:
Objective: To evaluate the effectiveness and safety of correction of pectus excavatum by the Nuss technique based on the available scientific evidence. Methods: We conducted an evidence synthesis following systematic processes of search, selection, extraction, and critical appraisal. Outcomes were classified by importance and their quality was assessed with the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach. Results: The selection process led to the inclusion of only one systematic review, which synthesized the results of nine observational studies comparing the Nuss and Ravitch procedures. The evidence found was rated as of poor and very poor quality. The Nuss procedure increased the incidence of hemothorax (RR = 5.15; 95% CI: 1.07-24.89), pneumothorax (RR = 5.26; 95% CI: 1.55-17.92), and the need for reintervention (RR = 4.88; 95% CI: 2.41-9.88) compared with the Ravitch procedure. There was no statistical difference between the two procedures for the outcomes of general complications, blood transfusion, hospital stay, and time to ambulation. The Nuss operation was faster than the Ravitch (mean difference [MD] = -69.94 minutes; 95% CI: -139.04 to -0.83). Conclusion: In the absence of well-designed prospective studies to clarify the evidence, especially in terms of aesthetics and quality of life, the surgical indication should be individualized and the choice of technique based on patient preference and the experience of the team.
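The relative risks quoted above follow the usual 2×2-table calculation with a log-normal confidence interval. A minimal sketch, using made-up counts rather than the review's data, is:

```python
import math

def relative_risk(a, b, c, d):
    """RR and 95% CI for a 2x2 table.

    a = events in exposed, b = non-events in exposed,
    c = events in control, d = non-events in control.
    """
    rr = (a / (a + b)) / (c / (c + d))
    # Standard error of ln(RR)
    se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi

# Hypothetical counts: 10/100 events vs 2/100 events
rr, lo, hi = relative_risk(10, 90, 2, 98)
print(f"RR = {rr:.2f}; 95% CI: {lo:.2f}-{hi:.2f}")
```

When the interval excludes 1, as in the hemothorax and pneumothorax results above, the difference is conventionally read as statistically significant.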
Abstract:
Non-linear functional representation of the aerodynamic response provides a convenient mathematical model for motion-induced unsteady transonic aerodynamic loads response, that accounts for both complex non-linearities and time-history effects. A recent development, based on functional approximation theory, has established a novel functional form; namely, the multi-layer functional. For a large class of non-linear dynamic systems, such multi-layer functional representations can be realised via finite impulse response (FIR) neural networks. Identification of an appropriate FIR neural network model is facilitated by means of a supervised training process in which a limited sample of system input-output data sets is presented to the temporal neural network. The present work describes a procedure for the systematic identification of parameterised neural network models of motion-induced unsteady transonic aerodynamic loads response. The training process is based on a conventional genetic algorithm to optimise the network architecture, combined with a simplified random search algorithm to update weight and bias values. Application of the scheme to representative transonic aerodynamic loads response data for a bidimensional airfoil executing finite-amplitude motion in transonic flow is used to demonstrate the feasibility of the approach. The approach is shown to furnish a satisfactory generalisation property to different motion histories over a range of Mach numbers in the transonic regime.
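The tapped-delay-line idea behind FIR neural networks, combined with weight updating by a simplified random search, can be sketched as below. A single FIR "neuron" (an FIR filter followed by a nonlinearity) is fitted to a synthetic response; the data, network size, and search parameters are all assumptions for demonstration, not the models or aerodynamic data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input motion history and a synthetic nonlinear response:
# the target is a fixed FIR filter of the input passed through tanh.
u = rng.standard_normal(400)
true_taps = np.array([0.5, 0.3, -0.2])
y = np.tanh(np.convolve(u, true_taps, mode="full")[: len(u)])

TAPS = 3

def predict(w, u):
    """Single FIR neuron: tapped-delay-line filter followed by tanh."""
    lin = np.convolve(u, w, mode="full")[: len(u)]
    return np.tanh(lin)

def mse(w):
    return np.mean((predict(w, u) - y) ** 2)

# Simplified random search: perturb weights, keep only improvements.
w = np.zeros(TAPS)
best = mse(w)
for step in range(2000):
    cand = w + 0.1 * rng.standard_normal(TAPS)
    err = mse(cand)
    if err < best:
        w, best = cand, err

print(f"final MSE: {best:.5f}")
```

In the study this weight search is paired with a genetic algorithm that optimizes the network architecture itself (number of layers, neurons, and taps), which the sketch leaves out.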
Abstract:
The progressive behavior of the blood pressure of term newborns during the first week of life was assessed by the simultaneous use of oscillometric and Doppler methods. A total of 174 term neonates born at the Municipal Hospital Odilon Behrens in Belo Horizonte, from March 1996 to February 1997, were prospectively assessed. The oscillometric and Doppler ultrasonic methods were simultaneously used for four consecutive recordings obtained at 12 ± 6, 24 ± 6 and 72 ± 24 h and on the 7th ± 1 day of life. The combined use of the two methods simplified the procedure, with automatic cuff inflation and deflation, and speed was properly controlled with an automatic pressure monitor. The procedure was performed using a Y-connection to the mercury sphygmomanometer, with blood pressure being recorded with an automatic device and systolic blood pressure being measured simultaneously by Doppler ultrasound. The newborns were awake, not crying and in the supine position. A statistically significant increase in systolic and diastolic blood pressure was observed between the first and second, and the third and fourth measurements by Doppler and oscillometric methods. No significant correlation between birth weight, length, ponderal index and blood pressure was observed. The technique used represents a simpler and more accurate procedure for blood pressure measurement.
Abstract:
The objective of the present study was to develop a simplified low cost method for the collection and fixation of pediatric autopsy cells and to determine the quantitative and qualitative adequacy of extracted DNA. Touch and scrape preparations of pediatric liver cells were obtained from 15 cadavers at autopsy and fixed in 95% ethanol or 3:1 methanol:acetic acid. Material prepared by each fixation procedure was submitted to DNA extraction with the Wizard® genomic DNA purification kit for DNA quantification and five of the preparations were amplified by multiplex PCR (azoospermia factor genes). The amount of DNA extracted varied from 20 to 8,640 µg, with significant differences between fixation methods. Scrape preparation fixed in 95% ethanol provided larger amount of extracted DNA. However, the mean for all groups was higher than the quantity needed for PCR (50 ng) or Southern blot (500 ng). There were no qualitative differences among the different material and fixatives. The same results were also obtained for glass slides stored at room temperature for 6, 12, 18 and 24 months. We conclude that touch and scrape preparations fixed in 95% ethanol are a good source of DNA and present fewer limitations than cell culture, tissue paraffin embedding or freezing that require sterile material, culture medium, laboratory equipment and trained technicians. In addition, they are more practical and less labor intensive and can be obtained and stored for a long time at low cost.
Abstract:
In order to detect several new HLA-A class I alleles that have been described since 1998, the original PCR-RFLP method developed to identify the 78 alleles recognized at that time at high resolution level was adapted by us for low and medium resolution levels using a nested PCR-RFLP approach. The results obtained from blood samples of 23 subjects using both the PCR-RFLP method and a commercial kit (MicroSSP1A®, One Lambda Inc.) showed an agreement higher than 95%. The PCR-RFLP adapted method was effective in low and medium resolution histocompatibility evaluations.
Abstract:
This Master's thesis examines the connection between machining vibrations (chatter) in turning and the structure of the lathe centre. The work is part of the VMAX project at Lappeenranta University of Technology, and its background is the effort to develop a new chatter-avoidance method based on adjusting the tailstock clamping force during machining. Verifying the operation of this method was the first objective of the work. Implementing the method, however, places certain requirements on the structure of the centre used; the second objective was to develop a prototype centre that fulfils these requirements. The research proceeded as follows. First, the problem was defined by reviewing the theoretical background and related research from Lappeenranta University of Technology and elsewhere; product catalogues of centre manufacturers were also examined. Next, a preliminary design phase verified the operation of the method and generated concepts for developing the centre structure. After this preliminary phase, a design process was carried out for the centre prototype. Finally, the behaviour of the designed prototype structure was evaluated by means of computer modelling. As an additional result, a simplified calculation model based on the finite element method was derived for determining the natural frequencies of the system, and its accuracy was assessed. The design process yielded a prototype centre structure that fulfils all the requirements set by the operation of the method as well as by normal use, and the results of the derived calculation model are quite close to those obtained with 3D finite element modelling. The objectives of the research can therefore be said to have been achieved, although this cannot be stated with full certainty because neither the prototype nor the calculation model has yet been experimentally verified.
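The abstract mentions a simplified finite-element-based model for determining the system's natural frequencies. In a lumped-parameter setting this reduces to the generalized eigenvalue problem K v = ω² M v; the sketch below uses an illustrative two-degree-of-freedom system with invented masses and stiffnesses, not the thesis's tailstock/centre model.

```python
import numpy as np

# Illustrative 2-DOF chain: two masses, two springs (assumed values)
m1, m2 = 2.0, 1.0          # masses (kg)
k1, k2 = 8.0e3, 4.0e3      # stiffnesses (N/m)

M = np.diag([m1, m2])
K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])

# Solve K v = w^2 M v via the equivalent standard problem M^-1 K v = w^2 v
eigvals = np.linalg.eigvals(np.linalg.inv(M) @ K)
omega = np.sqrt(np.sort(eigvals.real))   # angular frequencies (rad/s)
freqs_hz = omega / (2.0 * np.pi)
print("natural frequencies (Hz):", np.round(freqs_hz, 2))
```

A full 3D finite-element model follows the same eigenvalue structure, only with much larger M and K matrices assembled from the mesh, which is why a reduced model like this can approximate the lowest modes.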
Abstract:
Transtracheal puncture has long been known as a safe, low-cost procedure. However, with the advent of bronchoscopy, it has largely been forgotten. Two researchers have suggested the use of α-amylase activity to diagnose salivary aspiration, but the normal values of this enzyme in tracheobronchial secretions are unknown. We aimed to define the normal values of α-amylase activity in tracheobronchial secretions and verify the rate of major complications of transtracheal puncture. From October 2009 to June 2011, we prospectively evaluated 118 patients without clinical or radiological signs of salivary aspiration who underwent transtracheal puncture before bronchoscopy. The patients were sedated with a solution of lidocaine and diazepam until they reached a Ramsay sedation score of 2 or 3. We then cleaned the cervical region and anesthetized the superficial planes with lidocaine. Next, we injected 10 mL of 2% lidocaine into the tracheobronchial tree. Finally, we injected 10 mL of normal saline into the tracheobronchial tree and immediately aspirated the saline with maximum vacuum pressure to collect samples for measurement of the α-amylase level. The α-amylase level mean ± SE, median, and range were 1914 ± 240, 1056, and 24-10,000 IU/L, respectively. No major complications (peripheral desaturation, subcutaneous emphysema, cardiac arrhythmia, or hemoptysis) occurred among 118 patients who underwent this procedure. Transtracheal aspiration is a safe, low-cost procedure. We herein define for the first time the normal α-amylase levels in the tracheobronchial secretions of humans.
Abstract:
Yerba maté extracts have in vitro antioxidant capacity attributed to the presence of polyphenolic compounds, mainly chlorogenic acids and dicaffeoylquinic acid derivatives. DPPH is one of the most widely used assays for measuring the antioxidant capacity of pure compounds and plant extracts, but it is difficult to compare results between studies because the assay is applied under many different conditions by different research groups. Thus, in order to assess the antioxidant capacity of yerba maté extracts, the following procedure is proposed: 100 µL of an aqueous dilution of the extract is mixed, in duplicate, with 3.0 mL of a DPPH work solution in absolute methanol (100 µmol L-1), incubated for 120 minutes in darkness at 37 ± 1 °C, and the absorbance is then read at 517 nm against absolute methanol. The results should be expressed as ascorbic acid equivalents or Trolox equivalents in mass percentage (g % dm, dry matter) in order to facilitate comparisons. The antioxidant capacity (AOC) of the ethanolic extracts ranged from 12.8 to 23.1 g TE % dm and from 9.1 to 16.4 g AAE % dm. The AOC determined by the proposed DPPH assay can be related to the total polyphenolic content determined by the Folin-Ciocalteu assay.
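The DPPH read-out is usually reduced to a percentage inhibition of the 517 nm absorbance and then converted to Trolox (or ascorbic acid) equivalents through a standard curve. The sketch below shows that arithmetic with hypothetical absorbances and standard-curve points, not the paper's measured values.

```python
import numpy as np

def inhibition_percent(a_control, a_sample):
    """DPPH radical scavenging: relative drop in absorbance at 517 nm."""
    return (a_control - a_sample) / a_control * 100.0

# Hypothetical Trolox standard curve: inhibition (%) vs concentration (uM)
trolox_conc = np.array([50.0, 100.0, 200.0, 400.0])   # assumed values
trolox_inhib = np.array([12.0, 24.5, 48.0, 90.0])     # assumed values
slope, intercept = np.polyfit(trolox_conc, trolox_inhib, 1)

def trolox_equivalent(a_control, a_sample):
    """Convert a sample's inhibition to a Trolox-equivalent concentration."""
    return (inhibition_percent(a_control, a_sample) - intercept) / slope

# Hypothetical control and sample absorbances (45% inhibition)
te = trolox_equivalent(0.80, 0.44)
print(f"~{te:.0f} uM Trolox equivalents")
```

Converting the concentration-based result to the mass-percentage units proposed in the abstract (g TE % dm) additionally requires the dilution factor and the dry-matter content of the extract.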
Abstract:
This thesis studies an integrated approach to scheduling and service network design for rail freight transportation. Rail transportation is organized around a two-level consolidation structure in which the assignment of cars to blocks, and of blocks to services, are decisions that greatly complicate operations management. In this thesis, the two consolidation processes and the operating schedule are studied simultaneously. Solving this problem yields a profitable operating plan comprising blocking policies, train routing and scheduling, train make-up, and traffic assignment. To describe the various rail activities at the tactical level, we extend the physical network and construct a three-layer space-time network structure in which the time dimension captures the temporal impacts on operations, while the operations on trains, blocks, and cars are described by the different layers. Based on this network structure, we model the rail planning problem as a service network design problem. The proposed model is formulated as a mixed-integer mathematical program. It proves very difficult to solve because of the large size of the instances treated and its intrinsic complexity. Three versions are studied: the simplified model (direct services only), the complete model (direct and multi-stop services), and a very large-scale complete model. Several heuristics are developed to obtain good solutions in reasonable computing times. First, a special case with direct services is analyzed.
Exploiting a specific characteristic of the direct-service network design problem, we develop a new tabu search algorithm. A cycle-based neighbourhood is used for this purpose: it redistributes the flow circulating on the blocks along cycles drawn from the residual network. A slope-scaling algorithm is developed for the complete model, and we propose a new method, called ellipsoidal search, to further improve solution quality. Ellipsoidal search combines the good feasible solutions generated by the slope-scaling algorithm and gathers the features of good solutions to create an elite problem that is solved exactly with commercial software. The heuristic thus takes advantage of the convergence speed of the slope-scaling algorithm and of the solution quality of ellipsoidal search. Numerical tests illustrate the effectiveness of the proposed heuristic, and the algorithm also represents an attractive alternative for solving the simplified problem. Finally, we study the very large-scale complete model. A hybrid heuristic is developed by integrating the ideas of the previous algorithm with column generation. We propose a new slope-scaling procedure in which, compared with the previous one, only the approximation of the service-related costs is considered. The new slope-scaling approach thus separates the decisions associated with blocks and with services, providing a natural decomposition of the problem. The numerical results show that the algorithm can identify good-quality solutions in a context aimed at solving real-life instances.
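The tabu search ingredients mentioned above (a move neighbourhood, a tabu tenure, and an aspiration criterion) can be sketched on a toy fixed-charge service-selection problem. All costs and the coverage penalty below are invented for illustration; the thesis's actual algorithm uses a much richer cycle-based neighbourhood on a space-time network, which this sketch does not attempt to reproduce.

```python
import random

random.seed(1)
N = 8  # candidate services (toy size)
fixed_cost = [random.randint(5, 20) for _ in range(N)]
benefit = [random.randint(1, 30) for _ in range(N)]  # saving if service open

def cost(x):
    """Fixed cost minus benefit of open services, plus a penalty
    if fewer than 3 services are open (demand left unrouted)."""
    penalty = 50 * max(0, 3 - sum(x))
    return (sum(f for f, xi in zip(fixed_cost, x) if xi)
            - sum(b for b, xi in zip(benefit, x) if xi) + penalty)

def tabu_search(iters=200, tenure=5):
    x = [0] * N
    best_x, best_c = x[:], cost(x)
    tabu = {}  # variable index -> iteration until which it is tabu
    for it in range(iters):
        # Best single-flip neighbour; tabu moves allowed only if they
        # improve on the incumbent (aspiration criterion).
        candidates = []
        for j in range(N):
            y = x[:]
            y[j] ^= 1
            c = cost(y)
            if tabu.get(j, -1) < it or c < best_c:
                candidates.append((c, j, y))
        c, j, x = min(candidates)
        tabu[j] = it + tenure
        if c < best_c:
            best_x, best_c = x[:], c
    return best_x, best_c

best_x, best_c = tabu_search()
print("best cost:", best_c, "services open:", sum(best_x))
```

Accepting the best neighbour even when it worsens the current solution, while the tabu list blocks immediate reversals, is what lets the search escape local optima; on this tiny instance the result can be checked against full enumeration.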