946 results for solution-based DNA extraction
Abstract:
A new wavelet-based adaptive framework for solving population balance equations (PBEs) is proposed in this work. The technique is general, powerful and efficient, and requires no prior assumptions about the characteristics of the processes. To cope with the steeply varying number densities across the size range, a new strategy is developed to select the optimal order of resolution and the collocation points based on an interpolating wavelet transform (IWT). The proposed technique has been tested for size-independent agglomeration, agglomeration with a linear summation kernel and agglomeration with a nonlinear kernel. In all cases, the predicted and analytical particle size distributions (PSDs) are in excellent agreement. Further work on the solution of the general population balance equations with nucleation, growth and agglomeration and the solution of steady-state population balance equations will be presented in this framework. (C) 2002 Elsevier Science B.V. All rights reserved.
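To make the adaptive idea concrete, the sketch below (not the authors' implementation) keeps a collocation point only where the interpolating-wavelet detail coefficient, i.e. the difference between the sampled value and its prediction from coarser neighbours, exceeds a tolerance; the density function, grid and tolerance are illustrative assumptions.

```python
import numpy as np

def adaptive_points(f, x, tol=1e-2):
    """Keep collocation points whose interpolating-wavelet detail coefficient
    (sample value minus prediction from the two coarser neighbours) exceeds
    `tol`. Illustrative sketch only; the paper also adapts the resolution level."""
    vals = f(x)
    keep = np.zeros(x.size, dtype=bool)
    keep[0] = keep[-1] = True                        # always keep the boundaries
    for j in range(1, x.size - 1, 2):                # odd-indexed (detail) points
        prediction = 0.5 * (vals[j - 1] + vals[j + 1])
        if abs(vals[j] - prediction) > tol:          # large detail -> refine here
            keep[j - 1:j + 2] = True
    return x[keep]

# A steeply varying number density keeps points only near the sharp peak.
x = np.linspace(0.0, 1.0, 129)
density = lambda v: np.exp(-200.0 * (v - 0.3) ** 2)
print(adaptive_points(density, x))
```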
Abstract:
Battery separators based on electrospun membranes of poly(vinylidene fluoride) (PVDF) have been prepared in order to study the effect of fiber alignment on the performance and characteristics of the membrane. The prepared membranes show an average fiber diameter of 272 nm and a degree of porosity of 87%. The gel polymer electrolytes are prepared by soaking the membranes in the electrolyte solution. The alignment of the fibers improves the mechanical properties of the electrospun membranes. Further, the microstructure of the membrane also plays an important role in the ionic conductivity, which is higher for the random electrospun membrane due to its lower tortuosity. Independently of the microstructure, both membranes show good electrochemical stability up to 5.0 V versus Li/Li+. These results show that electrospun PVDF membranes are appropriate as separators for lithium-ion battery applications, with the random membranes showing a better overall performance.
Abstract:
The relation between patient and physician in most modern Health Care Systems is sparse, limited in time and very inflexible. On the other hand, and in contradiction with several recent studies, most physicians do not base their patients' diagnostic evaluations on intertwined psychological and social factors. Facing these problems, and trying to improve the patient/physician relation, we present a mobile health care solution to improve the interaction between the physician and his patients. The solution serves not only as a privileged means of communication between physicians and patients but also as an evolutionary intelligent platform delivering a mobile rule-based system.
Abstract:
In the last years, it has become increasingly clear that neurodegenerative diseases involve protein aggregation, a process often used as a disease progression readout and to develop therapeutic strategies. This work presents an image processing tool to automatically segment, classify and quantify these aggregates and the whole 3D body of the nematode Caenorhabditis elegans. A total of 150 data set images, containing different slices, were captured with a confocal microscope from animals of distinct genetic conditions. Because of the animals' transparency, most of the slice pixels appeared dark, hampering direct reconstruction of the body volume. Therefore, for each data set, all slices were stacked into one single 2D image in order to determine a volume approximation. The gradient of this image was input to an anisotropic diffusion algorithm that uses Tukey's biweight as the edge-stopping function. The histogram median of the resulting image was used to dynamically determine a threshold level, which allows the determination of a smoothed exterior contour of the worm and, by thinning its skeleton, the medial axis of the worm body. Based on this exterior contour diameter and the medial animal axis, random 3D points were then calculated to produce a volume mesh approximation. The protein aggregations were subsequently segmented based on an iso-value and blended with the resulting volume mesh. The results obtained were consistent with qualitative observations in the literature, allowing non-biased, reliable and high-throughput protein aggregate quantification. This may lead to a significant improvement in treatment planning and preventive interventions for neurodegenerative diseases.
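As an illustration of the smoothing step described above, the sketch below applies a Perona-Malik style diffusion that uses Tukey's biweight as the edge-stopping function; the iteration count, scale parameter and step size are assumed values, not those used in the paper.

```python
import numpy as np

def tukey_biweight(grad, sigma):
    """Tukey's biweight edge-stopping function: diffusion is completely
    switched off where |grad| > sigma, so strong edges are preserved."""
    g = np.zeros_like(grad)
    inside = np.abs(grad) <= sigma
    g[inside] = (1.0 - (grad[inside] / sigma) ** 2) ** 2
    return g

def anisotropic_diffusion(image, n_iter=20, sigma=0.1, step=0.2):
    """Perona-Malik style smoothing with a robust (Tukey) stopping function."""
    u = image.astype(float).copy()
    for _ in range(n_iter):
        # finite differences towards the four neighbours
        north = np.roll(u, -1, axis=0) - u
        south = np.roll(u, 1, axis=0) - u
        east = np.roll(u, -1, axis=1) - u
        west = np.roll(u, 1, axis=1) - u
        u += step * (tukey_biweight(north, sigma) * north +
                     tukey_biweight(south, sigma) * south +
                     tukey_biweight(east, sigma) * east +
                     tukey_biweight(west, sigma) * west)
    return u

# Toy usage: smooth a noisy step image while keeping its edge.
img = np.hstack([np.zeros((32, 16)), np.ones((32, 16))])
img += np.random.default_rng(0).normal(0, 0.05, img.shape)
print(anisotropic_diffusion(img).round(2)[16, 14:18])
```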
Abstract:
Nitrate losses from soil profiles by leaching should preferentially be monitored during high rainfall events and during irrigation, when fertilizer nitrogen applications are elevated. Using a climatologic water balance based on the Thornthwaite and Penman-Monteith models for potential evapotranspiration, drainage soil water fluxes below the root zone were estimated in a fertigated coffee crop. Soil solution extraction at the depth of 1 m allowed the calculation of nitrate leaching. The average nitrate concentration in soil solution for plots that received nitrogen by fertigation at a rate of 400 kg ha-1 was 5.42 mg L-1, surpassing the Brazilian legislation limit of 10.0 mg L-1 during only one month. For plots receiving 800 kg ha-1 of nitrogen, the average was 25.01 mg L-1, 2.5 times higher than the above-mentioned limit. This information indicates that nitrogen rates higher than 400 kg ha-1 are potentially polluting the groundwater. Yearly amounts of leached nitrate were 24.2 and 153.0 kg ha-1 for the nitrogen rates of 400 and 800 kg ha-1, respectively. The six times higher loss indicates a cost/benefit problem for coffee fertigations above 400 kg ha-1.
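The leaching arithmetic implied above can be reproduced with a simple unit conversion: a load in kg ha-1 is the solution concentration (mg L-1) times the drainage depth (mm) times 0.01, since 1 mm of drainage over 1 ha is 10,000 L. The monthly figures below are hypothetical, not the paper's measurements.

```python
# Hypothetical monthly pairs of (nitrate concentration in soil solution, mg/L;
# drainage below the root zone, mm); the paper's own data are not reproduced here.
monthly = [
    (5.4, 120.0),
    (4.8, 60.0),
    (6.1, 90.0),
]

# 1 mm of drainage over 1 ha = 10,000 L, and 1 mg = 1e-6 kg, hence the 0.01 factor.
annual_leaching = sum(conc * drain * 0.01 for conc, drain in monthly)
print(f"Annual nitrate leaching ~ {annual_leaching:.1f} kg/ha")
```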
Abstract:
The modelling of the experimental data of the extraction of the volatile oil from six aromatic plants (coriander, fennel, savoury, winter savoury, cotton lavender and thyme) was performed using five mathematical models based on differential mass balances. In all cases the extraction was internal diffusion controlled, and the internal mass transfer coefficient (k_s) was found to change with pressure, temperature and particle size. For fennel, savoury and cotton lavender, the external mass transfer and the equilibrium phase also influenced the second extraction period, since k_s changed with the tested flow rates. In general, the axial dispersion coefficient could be neglected for the conditions studied, since Peclet numbers were high. On the other hand, the solute-matrix interaction had to be considered in order to ensure a satisfactory description of the experimental data.
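As a minimal illustration of an internal-diffusion-controlled mass balance of the kind referred to above, the sketch below evaluates the simplest one-parameter model, dx/dt = -k_s x, for the oil remaining in the solid; the initial oil content and k_s are assumed values, not parameters fitted in the paper.

```python
import numpy as np

def extraction_yield(t, x0=0.03, ks=5e-4):
    """Cumulative extraction yield for the simplest internal-diffusion-
    controlled balance dx/dt = -ks * x (solute left in the solid), i.e.
    yield(t) = x0 * (1 - exp(-ks * t)). x0 (kg oil per kg plant) and
    ks (1/s) are illustrative values only."""
    return x0 * (1.0 - np.exp(-ks * t))

t = np.linspace(0.0, 4 * 3600.0, 5)   # up to 4 hours of extraction
print(np.round(extraction_yield(t), 4))
```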
Abstract:
Environmental tobacco smoke (ETS) is recognized as an occupational hazard in the hospitality industry. Although Portuguese legislation banned smoking in most indoor public spaces, it is still allowed in some restaurants/bars, representing a potential risk to the workers' health, particularly for chronic respiratory diseases. The aims of this work were to characterize biomarkers of early genetic effects and to disclose proteomic signatures associated with occupational exposure to ETS and with the potential to predict the development of respiratory diseases. A detailed lifestyle survey and clinical evaluation (including spirometry) were performed in 81 workers from Lisbon restaurants. ETS exposure was assessed through the level of PM2.5 in indoor air and the urinary level of cotinine. The plasma samples were immunodepleted and analysed by 2D-SDS-PAGE followed by in-gel digestion and LC-MS/MS. DNA lesions and chromosome damage were analysed in lymphocytes and in exfoliated buccal cells from 19 cigarette smokers, 29 involuntary smokers, and 33 non-smokers not exposed to tobacco smoke. Also, the DNA repair capacity was evaluated using an ex vivo challenge comet assay with an alkylating agent (EMS). All workers were considered healthy and showed normal lung function. Interestingly, following 2D-DIGE-MS (MALDI-TOF/TOF), 61 plasma proteins were found differentially expressed in ETS-exposed subjects, including 38 involved in metabolism, acute-phase respiratory inflammation, and immune or vascular functions. On the other hand, the involuntary smokers showed neither an increased level of DNA/chromosome damage in lymphocytes nor an increased number of micronuclei in buccal cells, when compared to non-exposed non-smokers. Noteworthy, the challenge of lymphocytes with EMS resulted in a significantly lower level of DNA breaks in ETS-exposed as compared to non-exposed workers (P<0.0001), suggestive of an adaptive response elicited by the previous exposure to low levels of ETS. Overall, changes in the proteome may be promising early biomarkers of exposure to ETS. Likewise, the alterations of DNA repair competence observed upon ETS exposure deserve to be further understood. Work supported by Fundação Calouste Gulbenkian, ACSS and FCT/Polyannual Funding Program.
Abstract:
Doctoral thesis in Biology presented to the Faculdade de Ciências da Universidade do Porto, 2015.
Abstract:
OBJECTIVE: A cohort study has been designed to identify predictors of adverse health events in the elderly. The methodology of the study and preliminary descriptive results are presented. METHODS: The study population comprises all residents of Bambuí (Minas Gerais, Brazil) aged 60 years or more (n=1,742). Of these, 92.2% were interviewed and 85.9% underwent clinical examination, consisting of haematological and biochemical tests, serology for Trypanosoma cruzi, anthropometric and blood pressure measurements and an electrocardiogram. Aliquots of serum, plasma and DNA were stored for future investigations. The baseline interview included sociodemographic characteristics, self-referred health condition and history of selected diseases, medication use, health service use, source of medical care, physical activities, smoking, drinking and eating habits, reproductive history, physical functioning, life events, social support and mental health. Individuals are being followed up annually. RESULTS: The following characteristics predominated among participants: women (60.0%), married (48.9%) or widowed (35.4%) individuals, people living in households with up to 2 residents (73.8%), heads of family (76.7%), people with monthly income between 1.00 and 2.99 Brazilian minimum wages (62.0%) and people with up to 4 years of schooling (89.1%). The median age was 68 years. Among the cohort members, only 1.7% were lost in the first follow-up. CONCLUSIONS: In general, the characteristics of the study population were very similar to those from other epidemiological studies of the elderly based in large Brazilian cities. The small number of losses to follow-up indicates that the choice of Bambuí was adequate, assuring the feasibility of a long-term cohort study.
Abstract:
Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for a global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole genome sequencing approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph overlaying the query results of any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
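A minimal sketch of the idea behind the goeBURST/Minimum Spanning Tree visualization mentioned above: treat each sequence type as its allelic profile, weight edges by the number of differing loci, and build a minimum spanning tree. The profiles are hypothetical and goeBURST's specific tie-break rules are omitted.

```python
# Hypothetical allelic profiles (one integer per locus) for four sequence types.
profiles = {
    "ST1": (1, 3, 1, 1, 4),
    "ST2": (1, 3, 1, 2, 4),
    "ST3": (2, 3, 1, 2, 4),
    "ST4": (2, 5, 1, 2, 7),
}

def hamming(a, b):
    """Number of loci at which two allelic profiles differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_spanning_tree(nodes):
    """Prim's algorithm over the complete graph of sequence types."""
    names = list(nodes)
    in_tree, edges = {names[0]}, []
    while len(in_tree) < len(names):
        u, v = min(((a, b) for a in in_tree for b in names if b not in in_tree),
                   key=lambda e: hamming(nodes[e[0]], nodes[e[1]]))
        edges.append((u, v, hamming(nodes[u], nodes[v])))
        in_tree.add(v)
    return edges

for u, v, d in minimum_spanning_tree(profiles):
    print(f"{u} -- {v} ({d} differing loci)")
```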
Abstract:
The performance of metaheuristics is highly dependent on their parameters, which need to be tuned. Parameter tuning may allow greater flexibility and robustness, but requires a careful initialization. The process of defining which parameter setting should be used is not obvious: the values for the parameters depend mainly on the problem, the instance to be solved, the search time available to spend in solving the problem, and the required solution quality. This paper presents a learning module proposal for the autonomous parameterization of metaheuristics, integrated in a Multi-Agent System for the resolution of dynamic scheduling problems. The proposed learning module is inspired by the Autonomic Computing Self-Optimization concept, which states that systems must continuously and proactively improve their performance. The learning is implemented with Case-based Reasoning, which uses similar previous cases to solve new ones, under the assumption that similar cases have similar solutions. After a literature review on the topics used, both the AutoDynAgents system and the Self-Optimization module are described. Finally, a computational study is presented in which the proposed module is evaluated; the results obtained are compared with previous ones, some conclusions are drawn, and future work is outlined. It is expected that this proposal can be a significant contribution to the self-parameterization of metaheuristics and to the resolution of scheduling problems in dynamic environments.
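As a minimal sketch of the Case-based Reasoning step described above (not the AutoDynAgents implementation), the snippet retrieves the stored case whose instance features are closest to the new scheduling instance and reuses its parameter setting; the features, parameters and distance metric are illustrative assumptions.

```python
import math

# Hypothetical case base: instance features -> metaheuristic parameters that
# worked well before. Feature names and values are illustrative only.
case_base = [
    ({"jobs": 20, "machines": 5, "tightness": 0.3},
     {"population": 40, "mutation": 0.10}),
    ({"jobs": 100, "machines": 10, "tightness": 0.7},
     {"population": 120, "mutation": 0.02}),
]

def distance(a, b):
    """Euclidean distance over the shared features (a real system would
    normalise each feature first)."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def retrieve_parameters(new_instance):
    """Reuse the parameters of the most similar past case."""
    best = min(case_base, key=lambda case: distance(case[0], new_instance))
    return best[1]

print(retrieve_parameters({"jobs": 90, "machines": 8, "tightness": 0.6}))
# -> {'population': 120, 'mutation': 0.02}
```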
Abstract:
This paper presents a Swarm-based Cooperation Mechanism for scheduling optimization. We intend to conceptualize real manufacturing systems as interacting autonomous entities in order to support decision making in agile manufacturing environments. Agents coordinate their actions automatically, without human supervision, considering a common objective – the global scheduling solution – taking advantage of the collective behavior of species through implicit and explicit cooperation. The performance of the cooperation mechanism will be evaluated considering, at a first stage, implicit cooperation through the ACS, PSO and ABC algorithms, and then explicit cooperation through the application of the cooperation mechanism.
Abstract:
With the electricity market liberalization, distribution and retail companies are looking for better market strategies based on adequate information about the consumption patterns of their electricity customers. In this environment all consumers are free to choose their electricity supplier. A fair insight into customers' behaviour will permit the definition of specific contract aspects based on the different consumption patterns. In this paper, Data Mining (DM) techniques are applied to electricity consumption data from a utility's client database. To form the different customer classes, and to find a set of representative consumption patterns, we have used the Two-Step algorithm, which is a hierarchical clustering algorithm. Each consumer class will be represented by its load profile resulting from the clustering operation. Next, to characterize each consumer class, a classification model will be constructed with the C5.0 classification algorithm.
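A small sketch of this two-stage approach using open-source stand-ins: the paper relies on the Two-Step clustering and C5.0 algorithms, which are not freely available, so scikit-learn's AgglomerativeClustering and a CART decision tree play their roles here, and the load curves are synthetic.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.tree import DecisionTreeClassifier, export_text

# Synthetic daily load curves (24 hourly readings per consumer); real data
# would come from the utility's client database.
rng = np.random.default_rng(0)
residential = rng.normal(1.0, 0.1, (50, 24)); residential[:, 18:22] += 2.0
industrial = rng.normal(3.0, 0.2, (50, 24)); industrial[:, 8:18] += 1.5
load_curves = np.vstack([residential, industrial])

# Stage 1: hierarchical clustering forms the consumer classes (stand-in for Two-Step).
classes = AgglomerativeClustering(n_clusters=2).fit_predict(load_curves)

# Stage 2: an interpretable tree characterises each class (stand-in for C5.0).
features = np.column_stack([load_curves.mean(axis=1),            # average load
                            load_curves[:, 8:18].mean(axis=1)])  # daytime load
tree = DecisionTreeClassifier(max_depth=2).fit(features, classes)
print(export_text(tree, feature_names=["mean_load", "daytime_load"]))
```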
Abstract:
This work describes a methodology to extract symbolic rules from trained neural networks. In our approach, patterns on the network are codified using formulas of a Lukasiewicz logic. For this we take advantage of the fact that every connective in this multi-valued logic can be evaluated by a neuron in an artificial network having, as activation function, the identity truncated to zero and one. This fact simplifies symbolic rule extraction and allows the easy injection of formulas into a network architecture. We trained this type of neural network using a back-propagation algorithm based on the Levenberg-Marquardt algorithm, where in each learning iteration we restricted the knowledge dissemination in the network structure. This makes the descriptive power of the produced neural networks similar to the descriptive power of the Lukasiewicz logic language, minimizing the information loss in the translation between connectionist and symbolic structures. To avoid redundancy in the generated networks, the method simplifies them in a pruning phase, using the "Optimal Brain Surgeon" algorithm. We tested this method on the task of finding the formula used in the generation of a given truth table. For tests on real data, we selected the Mushrooms data set, available in the UCI Machine Learning Repository.
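To make the key fact above concrete, the sketch below shows how each Lukasiewicz connective can be computed by a single neuron whose activation is the identity truncated to [0, 1]; the weight/bias choices follow directly from the connectives' definitions.

```python
def clamp01(z):
    """The activation function mentioned above: identity truncated to [0, 1]."""
    return max(0.0, min(1.0, z))

def neuron(weights, bias, inputs):
    """One neuron with the truncated-identity activation."""
    return clamp01(sum(w * x for w, x in zip(weights, inputs)) + bias)

# Each Lukasiewicz connective is exactly one such neuron:
def luka_and(x, y):       # strong conjunction  max(0, x + y - 1)
    return neuron((1, 1), -1, (x, y))

def luka_or(x, y):        # strong disjunction  min(1, x + y)
    return neuron((1, 1), 0, (x, y))

def luka_implies(x, y):   # implication         min(1, 1 - x + y)
    return neuron((-1, 1), 1, (x, y))

# Example on truth values 0.7 and 0.6 (up to floating-point rounding):
print(luka_and(0.7, 0.6), luka_or(0.7, 0.6), luka_implies(0.7, 0.6))
# approximately 0.3, 1.0, 0.9
```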
Abstract:
Congestion management of transmission power systems has achieved high relevance in competitive environments, which require an adequate approach both in technical and economic terms. This paper proposes a new methodology for congestion management and transmission tariff determination in deregulated electricity markets. The congestion management methodology is based on a reformulated optimal power flow, whose main goal is to obtain a feasible solution for the re-dispatch while minimizing the changes in the transactions resulting from market operation. The proposed transmission tariffs consider the physical impact caused by each market agent on the transmission network. The final tariff considers existing system costs and also costs due to the initial congestion situation and losses. This paper includes a case study based on the IEEE 118-bus test case.
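A toy sketch of the re-dispatch objective described above (not the paper's reformulated optimal power flow): keep two generator outputs as close as possible to their market-cleared values while relieving an overloaded line; the sensitivities, limits and dispatch values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

p_market = np.array([100.0, 50.0])   # MW cleared by the market (hypothetical)
ptdf = np.array([0.8, 0.2])          # sensitivities of the congested line
limit = 80.0                         # MW line limit (initial flow is 90 MW)
demand = p_market.sum()

result = minimize(
    lambda p: np.sum((p - p_market) ** 2),                   # minimise re-dispatch changes
    x0=p_market,
    bounds=[(0.0, 120.0), (0.0, 120.0)],
    constraints=[
        {"type": "eq", "fun": lambda p: p.sum() - demand},   # keep the load/generation balance
        {"type": "ineq", "fun": lambda p: limit - ptdf @ p}, # relieve the congested line
    ],
)
print(np.round(result.x, 1))   # roughly [83.3, 66.7] MW
```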