999 results for Enrichment methods
Abstract:
The Electrohysterogram (EHG) is a new instrument for pregnancy monitoring. It measures the uterine muscle electrical signal, which is closely related to uterine contractions. The EHG is described as a viable alternative to, and a more precise instrument than, the external tocogram, currently the most widely used method for describing uterine contractions. The EHG has also been indicated as a promising tool for assessing preterm delivery risk. This work intends to contribute to the characterization of the EHG through an inventory of its components:
• Contractions;
• Labor contractions;
• Alvarez waves;
• Fetal movements;
• Long Duration Low Frequency Waves.
The tools used for cataloging were parametric and non-parametric spectral analysis, energy estimators, time-frequency methods, and the tocogram annotated by expert physicians. The EHG signals and respective tocograms were obtained from the Icelandic 16-electrode Electrohysterogram Database. In total, 288 components were classified; no component database of this type was previously available for consultation. A spectral analysis and power estimation module was added to Uterine Explorer, an EHG analysis software package developed at FCT-UNL. The importance of this component database lies in the need to improve the understanding of the EHG, which is a relatively complex signal, and in its contribution toward the detection of preterm birth. Preterm birth accounts for 10% of all births and is one of the most relevant obstetric conditions. Despite technological and scientific advances in perinatal medicine, prematurity remains the major cause of neonatal death in developed countries. Although various risk factors, such as previous preterm births, infection, uterine malformations, multiple gestation and a short uterine cervix in the second trimester, have been associated with this condition, its etiology remains unknown [1][2][3].
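As a minimal illustration of the non-parametric spectral tools mentioned above, the sketch below computes a Welch power spectral density and a simple band-energy estimate for a synthetic single-channel EHG; the sampling rate, band limits and placeholder signal are assumptions for illustration, not the thesis's settings.

```python
# Minimal sketch: non-parametric spectral analysis of one EHG channel.
# The signal below is synthetic noise; band limits are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 20.0                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
ehg = rng.standard_normal(int(600 * fs))   # placeholder: 10 minutes of noise

# Band-pass to the range where contraction energy concentrates (assumed 0.1-4 Hz).
b, a = butter(4, [0.1, 4.0], btype="band", fs=fs)
filtered = filtfilt(b, a, ehg)

# Welch periodogram (non-parametric PSD) and a band-energy estimator.
freqs, psd = welch(filtered, fs=fs, nperseg=int(60 * fs))
band = (freqs >= 0.1) & (freqs <= 1.0)
band_energy = psd[band].sum() * (freqs[1] - freqs[0])
peak_freq = freqs[band][np.argmax(psd[band])]
print(f"band energy: {band_energy:.3g}, peak frequency: {peak_freq:.3f} Hz")
```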
Abstract:
With the growth of the Internet and the Semantic Web, together with improvements in communication speed and the rapid development of storage capacity, the volume of data and information rises considerably every day. Because of this, in the last few years there has been growing interest in formal representation structures with suitable characteristics, such as the ability to organize data and information and to reuse their contents for the generation of new knowledge. Controlled vocabularies, and specifically ontologies, stand out as representation structures with high potential. They not only allow data to be represented, but also enable the reuse of such data for knowledge extraction, coupled with its subsequent storage through relatively simple formalisms. However, to ensure that the knowledge in an ontology is always up to date, ontologies require maintenance. Ontology Learning is the area that studies the update and maintenance of ontologies. It is worth noting that the relevant literature already reports first results on automatic maintenance of ontologies, but still at a very early stage: human-driven processes remain the usual way to update and maintain an ontology, which makes this a cumbersome task. The generation of new knowledge for ontology growth can be based on Data Mining techniques, an area that studies techniques for data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources, using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and semantic relations present in an ontology. To verify the applicability of the proposed method, a proof of concept was developed and its results are presented, applied to the building and construction sector.
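The abstract does not detail the pattern-discovery step; as a hedged illustration of one classic technique used in ontology learning (not necessarily the method proposed in this work), the sketch below extracts candidate is-a relations from raw text with Hearst-style lexico-syntactic patterns. The patterns and sample sentence are assumptions.

```python
# Illustrative sketch: extracting candidate hyponym ("is-a") relations
# from free text with Hearst-style patterns, a classic pattern-discovery
# step in ontology learning. Not the thesis's actual method.
import re

SUCH_AS = re.compile(r"(\w[\w ]*?)\s+such as\s+([\w ]+(?:,\s*[\w ]+)*)", re.I)
AND_OTHER = re.compile(r"([\w ]+?)\s+and other\s+(\w[\w ]*)", re.I)

def extract_relations(text):
    """Return (hyponym, hypernym) candidate pairs found in `text`."""
    pairs = []
    for m in SUCH_AS.finditer(text):
        hypernym = m.group(1).strip()
        for hyponym in m.group(2).split(","):
            pairs.append((hyponym.strip(), hypernym))
    for m in AND_OTHER.finditer(text):
        pairs.append((m.group(1).strip(), m.group(2).strip()))
    return pairs

sample = "Insulating materials such as cork, mineral wool, polystyrene."
print(extract_relations(sample))
# [('cork', 'Insulating materials'), ('mineral wool', ...), ...]
```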
Abstract:
Short-bowel syndrome is responsible for significant metabolic alterations that compromise nutritional status. Glutamine is considered an essential nutrient for enterocytes, so beneficial effects from supplementing the diet with glutamine have been hypothesized. PURPOSE: In this study, the effect of a glutamine-enriched diet was evaluated in rats undergoing extensive small bowel resection, with analysis of postoperative weight loss and intestinal morphometrics of villus height, crypt depth, and mucosal thickness of the duodenum and remnant jejunum. METHODS: Three groups of male Wistar rats were established, receiving the following diets: with glutamine, without glutamine, and standard laboratory chow. All animals underwent an extensive small bowel resection, including the ileocecal valve, leaving a jejunal remnant of only 25 cm from the pylorus that was anastomosed latero-laterally to the ascending colon. The animals were weighed at the beginning and at the end of the experiment (20th postoperative day). They were then killed and the remnant intestine was removed. Fragments of duodenal and jejunal mucosa were collected from the remnant intestine and submitted to histopathologic examination. The morphometric study of the intestinal mucosa was performed with a digital system (KS 300) connected to an optical microscope. Morphometrics included villus height, crypt depth, and total thickness of the intestinal mucosa. RESULTS: Comparison of weight loss among the 3 groups showed no significant difference. The morphometric studies showed significantly taller duodenal villi in the glutamine group than in the glutamine-free group, but no difference from the standard-diet group. The measurements of villus height, crypt depth, and mucosal thickness of the remnant jejunum were greater in the glutamine-enriched diet group than in the glutamine-free diet group, though not significantly different from the standard-diet group. CONCLUSIONS: In rats with experimentally produced short-bowel syndrome, glutamine enrichment of an isonitrogenous test diet was associated with an improved adaptation response of the intestinal mucosa but not with reduced weight loss. However, the adaptation response in the group receiving the glutamine-enriched diet was not improved over that of the group fed regular chow.
Abstract:
Grasslands in semi-arid regions, like the Mongolian steppes, are facing desertification and degradation processes due to climate change. Mongolia's main economic activity is extensive livestock production, so this is a pressing matter for decision makers. Remote sensing and Geographic Information Systems provide the tools for advanced ecosystem management and have been widely used for monitoring and managing pasture resources. This study investigates the highest thematic detail that can be achieved through remote sensing to map the steppe vegetation, using medium-resolution earth observation imagery in three districts (soums) of Mongolia: Dzag, Buutsagaan and Khureemaral. After considering different thematic levels of detail for classifying the steppe vegetation, the existing pasture types within the steppe were chosen as the mapping target. To investigate which combination of data sets yields the best results and which classification algorithm is more suitable for incorporating these data sets, different classification methods were compared for the study area. Sixteen classifications were performed using different combinations of predictors, Landsat-8 data (spectral bands and Landsat-8-derived NDVI) and geophysical data (elevation, mean annual precipitation and mean annual temperature), with two classification algorithms: maximum likelihood and decision tree. Results showed that the best performing model was the one that incorporated the Landsat-8 bands with mean annual precipitation and mean annual temperature (Model 13), using the decision tree. For maximum likelihood, the model that incorporated the Landsat-8 bands with mean annual precipitation (Model 5) and the one that incorporated the Landsat-8 bands with mean annual precipitation and mean annual temperature (Model 13) achieved the highest accuracies for this algorithm. The decision tree models consistently outperformed the maximum likelihood ones.
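As an illustration of the two algorithms compared, the sketch below trains a Gaussian maximum-likelihood classifier (equivalent to quadratic discriminant analysis) and a decision tree on a synthetic stack of spectral and geophysical predictors; the data, features and parameters are placeholders, not the study's.

```python
# Illustrative sketch (synthetic data, not the study's imagery): comparing a
# Gaussian maximum-likelihood classifier (QDA) with a decision tree on a
# stacked feature set of spectral bands plus geophysical layers.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 1000
X = np.column_stack([
    rng.normal(0.2, 0.05, n),   # e.g. red reflectance
    rng.normal(0.3, 0.08, n),   # e.g. NIR reflectance
    rng.normal(1800, 200, n),   # elevation (m)
    rng.normal(120, 30, n),     # mean annual precipitation (mm)
])
y = (X[:, 1] - X[:, 0] + 0.001 * X[:, 3] > 0.25).astype(int)  # toy pasture classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("maximum likelihood", QuadraticDiscriminantAnalysis()),
                  ("decision tree", DecisionTreeClassifier(max_depth=6, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```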
Abstract:
Since the last decade of the twentieth century, the healthcare industry has been paying attention to the environmental impact of its buildings, and therefore new regulations, policy goals and Healthcare Building Sustainability Assessment (HBSA) methods are being developed and implemented. At present, healthcare is one of the most regulated industries and is also one of the largest consumers of energy per net floor area. To assess the sustainability of healthcare buildings it is necessary to establish a set of benchmarks related to their life-cycle performance. Benchmarks are essential both to rate the sustainability of a project and to support designers and other stakeholders in designing and operating a sustainable building, by allowing a project to be compared against conventional and best market practices. This research focuses on the methodology for setting benchmarks for resource consumption, waste production, operation costs and potential environmental impacts related to the operational phase of healthcare buildings. It aims to reduce the subjectivity found in the definition of the benchmarks used in Building Sustainability Assessment (BSA) methods, and it is applied in the Portuguese context. These benchmarks will be used in the development of a Portuguese HBSA method.
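The thesis's benchmark-setting procedure is not reproduced here; as a hypothetical illustration of one common approach, the sketch below derives "conventional practice" and "best practice" benchmarks as percentiles of a building-stock performance sample and scores a building against them. All figures are invented.

```python
# Hypothetical percentile-based benchmarking sketch (not the thesis's
# procedure): conventional practice taken as the stock median, best
# practice as the 10th percentile of consumption.
import numpy as np

# Assumed annual energy-use intensities for a sample of hospitals (kWh/m2).
eui = np.array([310, 280, 450, 390, 330, 500, 295, 360, 420, 305])

conventional = np.percentile(eui, 50)   # median = conventional practice
best_practice = np.percentile(eui, 10)  # low consumers = best practice

def normalize(value, best, conventional):
    """Map a building's performance to a 0-1 score (1 = best practice)."""
    score = (conventional - value) / (conventional - best)
    return float(np.clip(score, 0.0, 1.0))

print(f"conventional: {conventional} kWh/m2, best: {best_practice} kWh/m2")
print("score for 320 kWh/m2:", normalize(320, best_practice, conventional))
```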
Abstract:
As a renewable energy source, the use of forest biomass for electricity generation is advantageous in comparison with fossil fuels; however, the activity of forest biomass power plants causes adverse impacts, particularly on neighbouring communities. The main objective of this study is to estimate the effects of the activity of forest biomass power plants on the welfare of two groups of stakeholders, local residents and the general population, applying two stated preference methods: contingent valuation and discrete choice experiments, respectively. The former method was applied to estimate the minimum compensation that residents of communities neighbouring two forest biomass power plants in Portugal would be willing to accept. The latter method was applied to the general population to estimate their willingness to pay to avoid specific environmental impacts. The results show that the presence of the selected facilities affects individuals' well-being. In the discrete choice experiments conducted among the general population, all impacts considered were significant determinants of respondents' welfare levels. These results stress the importance of performing an equity analysis of the welfare effects of installing forest biomass power plants on different groups of stakeholders, as the effects are location and impact specific. Policy makers should take into account the views of all stakeholders directly or indirectly involved when deciding crucial issues regarding the siting of new forest biomass power plants, in order to achieve an efficient and equitable outcome.
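For reference, the welfare measures reported by discrete choice experiments are usually marginal willingness-to-pay values derived from a conditional logit model; the textbook formula below (not the paper's estimates) shows how they are obtained.

```latex
% Marginal willingness-to-pay in a linear-in-attributes conditional logit
% (standard result, not this paper's estimates). With indirect utility
%   U_{ij} = \beta' x_{ij} + \lambda c_{ij} + \varepsilon_{ij},
% where c_{ij} is the cost attribute (\lambda < 0), the WTP for attribute k is
\[
  \mathrm{WTP}_k = -\frac{\beta_k}{\lambda},
\]
% i.e. the marginal rate of substitution between attribute k and money.
```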
Abstract:
To solve a health and safety problem at a waste treatment facility, different multicriteria decision methods were used, including the PROV Exponential decision method. Four alternatives and ten attributes were considered. We found a congruent solution, validated by the different methods. The AHP and PROV Exponential decision methods led to the same ordering of options, but the latter reinforced one option as the best performing and set apart the least performing one. The ELECTRE I method also led to the same ordering, which allowed the best solution to be identified with reasonable confidence. This paper demonstrates the potential of using multicriteria decision methods to support decision making on complex problems such as risk control and accident prevention.
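As a minimal illustration of one of the methods used, the sketch below derives AHP criteria weights as the principal eigenvector of a pairwise comparison matrix and checks consistency; the judgments are invented, not the paper's.

```python
# Minimal AHP sketch (illustrative pairwise judgments, not the paper's data):
# weights from the principal eigenvector of a reciprocal comparison matrix,
# plus Saaty's consistency ratio.
import numpy as np

# Reciprocal pairwise comparison matrix for 3 hypothetical criteria.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty's random index
print("weights:", np.round(w, 3), "CR:", round(ci / ri, 3))
```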
Abstract:
Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control the matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposure can occur all along the life cycle of an NM and its "nanoproducts", from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory-scale production process was divided into 3 main phases (pre-production, production and post-production), which allowed the assessment methods to be tested in different situations. The risk assessment of the PM manufacturing process was made using qualitative analyses based on: the French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); the Ecole Polytechnique Fédérale de Lausanne method (EPFL); the Guidance for working safely with nanomaterials and nanoproducts (GWSNN); the Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro method, Italy (ISPESL); the Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. The different methods applied produced different final results: in phases 1 and 3 the risk tends to be classified as medium-high, while for phase 2 the most common result is a medium risk level. The use of qualitative methods needs to be improved by defining narrower criteria for selecting the method for each assessed situation, bearing in mind that uncertainty is also a relevant factor when dealing with risks in the nanotechnology field.
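As a generic illustration of how control-banding tools such as CB Nanotool combine scores into a risk level, the sketch below bands a severity and a probability score and looks the pair up in a risk matrix; the cut-offs and matrix values are illustrative assumptions, not any method's official ones.

```python
# Generic control-banding sketch inspired by CB Nanotool. The band cut-offs
# and risk-level matrix below are illustrative, not official values.
def band(score, cutoffs):
    """Map a 0-100 score to a band index using ascending cut-offs."""
    for i, c in enumerate(cutoffs):
        if score <= c:
            return i
    return len(cutoffs)

def risk_level(severity, probability):
    sev_band = band(severity, [25, 50, 75])      # 4 severity bands
    prob_band = band(probability, [25, 50, 75])  # 4 probability bands
    # Risk-level matrix: rows = severity band, columns = probability band.
    matrix = [[1, 1, 2, 3],
              [1, 2, 2, 3],
              [2, 2, 3, 4],
              [3, 3, 4, 4]]
    return matrix[sev_band][prob_band]

# Example: a high-severity, medium-probability task.
print("risk level:", risk_level(severity=70, probability=45))  # -> RL 2
```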
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing routing configurations to be attained that are robust to changes in the traffic demands and keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
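As a toy illustration of the underlying idea (not the framework's MOEAs or cost model), the sketch below runs a (1+1) evolutionary loop that sets OSPF link weights on a small topology so as to minimize the maximum link utilization under a fixed demand matrix.

```python
# Toy evolutionary OSPF weight-setting sketch. Topology, demands and
# capacities are invented for illustration.
import random
import networkx as nx

random.seed(1)
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
demands = {(0, 3): 10.0, (1, 2): 5.0, (0, 2): 4.0}  # (src, dst) -> traffic
capacity = 12.0

def max_utilization(weights):
    g = nx.Graph()
    for (u, v), w in zip(edges, weights):
        g.add_edge(u, v, weight=w)
    load = {tuple(sorted(e)): 0.0 for e in edges}
    for (s, t), vol in demands.items():
        path = nx.shortest_path(g, s, t, weight="weight")
        for u, v in zip(path, path[1:]):
            load[tuple(sorted((u, v)))] += vol
    return max(l / capacity for l in load.values())

# (1+1) evolutionary loop: mutate one weight, keep the better configuration.
best = [random.randint(1, 20) for _ in edges]
best_fit = max_utilization(best)
for _ in range(500):
    cand = best[:]
    cand[random.randrange(len(cand))] = random.randint(1, 20)
    fit = max_utilization(cand)
    if fit <= best_fit:
        best, best_fit = cand, fit
print("weights:", best, "max link utilization:", round(best_fit, 2))
```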
Abstract:
In this work we compare two different numerical schemes for the solution of the time-fractional diffusion equation with variable diffusion coefficient and a nonlinear source term. The two methods are the implicit numerical scheme presented in [M.L. Morgado, M. Rebelo, Numerical approximation of distributed order reaction-diffusion equations, Journal of Computational and Applied Mathematics 275 (2015) 216-227], adapted to our type of equation, and a collocation method where Chebyshev polynomials are used to reduce the fractional differential equation to a system of ordinary differential equations.
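For concreteness, a standard form consistent with the description above (the paper's exact equation may differ) is the Caputo-type problem:

```latex
% Time-fractional diffusion equation with variable diffusion coefficient
% d(x,t) and nonlinear source f (a standard form; the paper's exact
% equation is not reproduced in the abstract).
\[
{}^{C}D_t^{\alpha}\, u(x,t)
  = \frac{\partial}{\partial x}\!\left( d(x,t)\,\frac{\partial u}{\partial x} \right)
  + f\bigl(u(x,t), x, t\bigr),
\qquad 0 < \alpha < 1,
\]
% where ^C D_t^\alpha denotes the Caputo fractional derivative of order \alpha.
```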
Abstract:
PhD Thesis in Bioengineering
Abstract:
PhD Thesis in Bioengineering
Abstract:
Shifting from chemical to biotechnological processes is one of the cornerstones of 21st century industry. The production of a great range of chemicals via biotechnological means is a key challenge on the way toward a bio-based economy. However, this shift is occurring at a pace slower than initially expected. The development of efficient cell factories that allow for competitive production yields is of paramount importance for this leap to happen. Constraint-based models of metabolism, together with in silico strain design algorithms, promise to reveal insights into the best genetic design strategies, a step further toward achieving that goal. In this work, a thorough analysis of the main in silico constraint-based strain design strategies and algorithms is presented, their application in real-world case studies is analyzed, and a path for the future is discussed.
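As a minimal example of the constraint-based workflow the review covers, the sketch below runs flux balance analysis with COBRApy and evaluates a single gene knockout; the model file path and gene choice are assumptions for illustration, and strain-design algorithms such as OptKnock search over many such interventions against a production objective.

```python
# Minimal constraint-based analysis sketch with COBRApy. The SBML file path
# is an assumption; any genome-scale model will do.
import cobra

model = cobra.io.read_sbml_model("e_coli_core.xml")  # assumed local model file

wild_type = model.optimize().objective_value
print(f"wild-type growth rate: {wild_type:.3f} 1/h")

# Knock out one gene inside a context manager, so the change is reverted.
gene = model.genes[0]  # illustrative choice, not a designed strategy
with model:
    gene.knock_out()
    mutant = model.optimize().objective_value
print(f"growth after knocking out {gene.id}: {mutant:.3f} 1/h")
```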
Abstract:
Project management involves one-time endeavors that demand getting it right the first time. On the other hand, project scheduling, one of the most modeled project management process stages, still faces a wide gap between theory and practice. Computationally demanding models, and the consequent call for simplification, divert the implementation of such models in project management tools away from the actual day-to-day project management process. Special focus is given to the robustness of the generated project schedules in the face of the omnipresence of uncertainty. An "easy" way out is to add more or less cleverly calculated time buffers, which always increase project duration and, correspondingly, cost. A better approach to dealing with uncertainty seems to be to explore the slack that might be present in a given project schedule, a fortiori when a non-optimal schedule is used. Combining such an approach with recent advances in modeling resource allocation and scheduling techniques to cope with increasing resource flexibility, as expressed in "Flexible Resource Constrained Project Scheduling Problem" (FRCPSP) formulations, should be a promising line of research toward more adequate project management tools. In reality, this approach has frequently been used by project managers in an ad hoc way.
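As a toy illustration of the slack the text proposes to exploit, the sketch below runs the classical critical-path forward and backward passes on a four-activity network and reports each activity's total float; the network and durations are invented.

```python
# Illustrative critical-path sketch (toy activity network, not a thesis
# case): total float (slack) per activity, the quantity the text suggests
# exploiting to absorb uncertainty without adding buffers.
import networkx as nx

durations = {"A": 3, "B": 2, "C": 4, "D": 2}
g = nx.DiGraph([("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")])

# Forward pass: earliest start/finish.
es, ef = {}, {}
for node in nx.topological_sort(g):
    es[node] = max((ef[p] for p in g.predecessors(node)), default=0)
    ef[node] = es[node] + durations[node]

# Backward pass: latest finish/start.
horizon = max(ef.values())
lf, ls = {}, {}
for node in reversed(list(nx.topological_sort(g))):
    lf[node] = min((ls[s] for s in g.successors(node)), default=horizon)
    ls[node] = lf[node] - durations[node]

for node in durations:
    print(f"{node}: total float = {ls[node] - es[node]}")  # 0 on the critical path
```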