961 results for "scientific community"
Abstract:
So-called cascading events, which lead to high-impact low-frequency scenarios, are raising concern worldwide. A chain of events results in a major industrial accident with dreadful (and often unpredicted) consequences. Cascading events can be the result of the realization of an external threat, such as a terrorist attack or a natural disaster, or of a "domino effect". During domino events, the escalation of a primary accident is driven by the propagation of the primary event to nearby units, causing an overall increase in accident severity and in the risk associated with an industrial installation. Natural disasters such as intense flooding, hurricanes, earthquakes and lightning have also been found capable of increasing the risk of an industrial area, triggering losses of containment of hazardous materials and resulting in major accidents. The scientific community usually refers to these accidents as "NaTechs": natural events triggering industrial accidents. In this document, a state of the art of available approaches to the modelling, assessment, prevention and management of domino and NaTech events is described. However, the relevant work carried out in past studies still needs to be consolidated and completed in order to be applicable in a real industrial framework. New methodologies, developed during my research activity and aimed at the quantitative assessment of domino and NaTech accidents, are presented. The tools and methods provided in this study aim to support progress toward a consolidated and universal methodology for the assessment and prevention of cascading events, contributing to enhancing the safety and sustainability of the chemical and process industry.
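To give a concrete sense of what the quantitative assessment of domino scenarios typically involves, here is a minimal sketch of a standard formulation used in quantitative risk analysis (the symbols and the probit form are illustrative of common practice, not necessarily the exact models adopted in this work):

```latex
% f_p     : frequency of the primary scenario (1/year)
% P_{d,i} : escalation probability toward target unit i
% V_i     : escalation vector received by unit i (e.g. heat radiation)
% Y_i     : probit value; a, b are equipment-specific coefficients
\[
  f_{s,i} = f_p \, P_{d,i},
  \qquad
  P_{d,i} = \Phi\!\left(Y_i - 5\right),
  \qquad
  Y_i = a + b \ln V_i
\]
```

The overall risk increment then follows by summing the frequency-weighted consequences of the secondary scenarios over all target units.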
Abstract:
Nowadays, in developed countries, excessive food intake, in conjunction with decreased physical activity, has led to an increase in lifestyle-related diseases such as obesity, cardiovascular diseases, type-2 diabetes, a range of cancer types and arthritis. The socio-economic importance of such lifestyle-related diseases has encouraged countries to increase their research efforts, and many projects focusing on the relationship between food and health have been initiated recently. Thanks to these efforts and to the growing availability of technologies, food companies are beginning to develop healthier foods. The need for rapid and affordable methods to help the food industry in ingredient selection has stimulated the development of in vitro systems that simulate the physiological processes to which food components are subjected when administered in vivo. One of the most promising tools now available appears to be in vitro digestion, which aims at predicting, in a comparative way among analogous food products, the bioaccessibility of the nutrients of interest. A foodomics approach was chosen in this work to evaluate the modifications occurring during the in vitro digestion of selected protein-rich food products. Protein breakdown was measured via NMR spectroscopy, the only technique capable of observing, directly in the simulated gastric and duodenal fluids, the soluble oligo- and polypeptides released during the in vitro digestion process. The overall approach pioneered during this PhD work has been discussed and promoted in a large scientific community of specialists networked under the INFOGEST COST Action, which recently released a harmonized protocol for in vitro digestion. NMR spectroscopy, when used in tandem with in vitro digestion, generates a new concept that provides an additional attribute for describing food quality: comparative digestibility, which measures the improvement in nutrient bioaccessibility.
Abstract:
The interest of the scientific community in organic pollutants in freshwater streams is fairly recent. Over the past 50 years, thousands of chemicals have been synthesized and released into the environment. Nowadays their occurrence and their effects on several organisms (invertebrates, fish, birds, reptiles and also humans) are well documented. Because of their action, some of these chemicals have been defined as Endocrine Disrupting Compounds (EDCs), and the public health implications of these EDCs have been the subject of scientific debate. Most interestingly, among those noticed to influence the endocrine system were estrone, 17β-estradiol, 17α-estradiol, estriol, 17α-ethinylestradiol, testosterone and progesterone. This project focused on 17β-estradiol. Estradiol, or more precisely 17β-estradiol (also commonly referred to as E2), is a human sex hormone belonging to the class of steroid hormones. In spite of efforts to remove these substances from effluents, current wastewater treatment plants are not able to degrade or inactivate these organic compounds, which are continually released into the ecosystem. In this work a new wastewater treatment system was tested to assess the decrease of estradiol in water. It involved the action of Chlorella vulgaris, a freshwater green microalga belonging to the family Chlorellaceae. This microorganism was selected for its adaptability and its photosynthetic efficiency. A CALUX bioassay analysis was chosen to detect the decrease of the target compound in the water. Three different experiments were carried out to pursue the aim of the project. Several aspects emerged from their results. The presence of EDCs was detected in the water used to prepare the culture media. C. vulgaris, under controlled conditions, could be efficient for this purpose, although further research is essential to deepen our knowledge of this complex phenomenon. Finally, by assessing the toxicity of the effluent against C. vulgaris, it became clear that at certain concentrations it could affect the normal growth rate of this microorganism.
Abstract:
This thesis examines the relationships between different types of autocratic regimes and the relative likelihood of states being involved in armed conflicts. A broad concept of armed conflict is used, one that distinguishes between different types of armed conflict and takes into account the changes in the conditions of the international system in the post-Cold War era discussed in the research literature.
Abstract:
The search for exact solutions of mixed integer problems is an active topic in the scientific community. State-of-the-art MIP solvers exploit a floating-point numerical representation, thereby introducing small approximations. Although such MIP solvers yield reliable results for the majority of problems, there are cases in which higher accuracy is required. Indeed, it is known that for some applications floating-point solvers provide falsely feasible solutions, i.e. solutions marked as feasible because of approximations, that would not pass a check with exact arithmetic and cannot be practically implemented. The framework of the current dissertation is SCIP, a mixed integer programming solver mainly developed at the Zuse Institute Berlin, where we considered a new approach for solving MIPs exactly. Specifically, we developed a constraint handler to plug into SCIP, with the aim of analyzing the accuracy of the floating-point solutions provided and of computing exact primal solutions starting from floating-point ones. We conducted computational experiments to test the exact primal constraint handler under two main settings. Analysis mode allowed us to collect statistics about the reliability of current SCIP solutions. Our results confirm that floating-point solutions are accurate enough for many instances. However, our analysis highlighted the presence of numerical errors of varying magnitude. In enforce mode, our constraint handler is able to suggest exact solutions starting from the integer part of a floating-point solution. With the latter setting, results show a general improvement in the quality of the final solutions provided, without a significant loss of performance.
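As a minimal sketch of the underlying idea, rather than the actual SCIP constraint handler API (the constraint encoding and the example data below are illustrative assumptions), a floating-point solution can be re-checked in exact rational arithmetic: every double is converted to the exact rational it encodes, so constraint activities are compared against right-hand sides with no rounding error.

```python
from fractions import Fraction

def exact_feasibility_check(A, b, senses, x, integer_vars):
    """Check a floating-point solution of  A x (<=,>=,==) b  in exact arithmetic.

    Fraction(float) converts each double to the exact rational it encodes,
    so activities are computed with no rounding error at all.
    """
    xq = [Fraction(v) for v in x]

    # Exact integrality check; a repair step (as in enforce mode above) would
    # instead keep the rounded integer part and recompute the continuous part.
    for j in integer_vars:
        if xq[j].denominator != 1:
            return False, f"variable {j} not integral: {xq[j]}"

    for i, (row, rhs, sense) in enumerate(zip(A, b, senses)):
        activity = sum(Fraction(a) * v for a, v in zip(row, xq))
        rhsq = Fraction(rhs)
        ok = {"<=": activity <= rhsq,
              ">=": activity >= rhsq,
              "==": activity == rhsq}[sense]
        if not ok:
            return False, f"row {i} violated: activity {float(activity)} vs rhs {rhs}"
    return True, "exactly feasible"

# A solution accepted under a 1e-6 tolerance can still fail exactly:
# 0.1 + 0.2 exceeds 0.3 in binary floating point.
print(exact_feasibility_check(
    A=[[0.1, 0.2]], b=[0.3], senses=["<="],
    x=[1.0, 1.0], integer_vars=[0, 1]))
```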
Abstract:
In recent years, Deep Learning techniques have been shown to perform well on a large variety of problems in both Computer Vision and Natural Language Processing, reaching and often surpassing the state of the art on many tasks. The rise of deep learning is also revolutionizing the entire field of Machine Learning and Pattern Recognition, pushing forward the concepts of automatic feature extraction and unsupervised learning in general. However, despite its strong success in both science and business, deep learning has its own limitations. It is often questioned whether such techniques are merely a kind of brute-force statistical approach and whether they can only work in the context of High Performance Computing with vast amounts of data. Another important question is whether they are really biologically inspired, as claimed in certain cases, and whether they can scale well in terms of "intelligence". This dissertation focuses on trying to answer these key questions in the context of Computer Vision and, in particular, Object Recognition, a task that has been heavily revolutionized by recent advances in the field. Practically speaking, these answers are based on an exhaustive comparison of two very different deep learning techniques on the aforementioned task: the Convolutional Neural Network (CNN) and Hierarchical Temporal Memory (HTM). They represent two different approaches and points of view under the broad umbrella of deep learning and are the best choices for understanding and pointing out the strengths and weaknesses of each. The CNN is considered one of the most classic and powerful supervised methods used today in machine learning and pattern recognition, especially in object recognition. CNNs are well received and accepted by the scientific community and are already deployed in large corporations such as Google and Facebook to solve face recognition and image auto-tagging problems. HTM, on the other hand, is a new, emerging and mainly unsupervised paradigm that is more biologically inspired. It tries to gain insights from the computational neuroscience community in order to incorporate concepts such as time, context and attention, which are typical of the human brain, into the learning process. In the end, the thesis aims to show that in certain cases, with a smaller quantity of data, HTM can outperform CNN.
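To make the first of the two techniques concrete, here is a minimal, generic CNN for object recognition (a hedged PyTorch sketch; the architecture, input size and class count are illustrative assumptions, not the networks evaluated in the dissertation):

```python
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """Minimal convolutional network for 32x32 RGB object recognition."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # learn local filters
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = TinyCNN()
logits = model(torch.randn(1, 3, 32, 32))  # one fake image -> 10 class scores
print(logits.shape)                        # torch.Size([1, 10])
```

The stacked convolution/pooling layers implement the automatic feature extraction mentioned above; the final linear layer maps the learned features to class scores.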
Abstract:
Groundwater represents one of the most important resources of the world, and it is essential to prevent its pollution and to consider remediation interventions in case of contamination. According to the scientific community, the characterization and management of contaminated sites have to be performed in terms of contaminant fluxes, considering their spatial and temporal evolution. One of the most suitable approaches to determining the spatial distribution of pollutants and quantifying contaminant fluxes in groundwater is the use of control panels. The determination of the contaminant mass flux requires measurement of the contaminant concentration in the moving phase (water) and of the velocity/flux of the groundwater. In this Master Thesis a new solute mass flux measurement approach is proposed, based on an integrated control panel type methodology combined with the Finite Volume Point Dilution Method (FVPDM) for the monitoring of transient groundwater fluxes. Moreover, a new adsorption passive sampler, which allows the variation of solute concentration over time to be captured, is designed. The present work contributes to the development of this approach on three key points. First, the ability of the FVPDM to monitor transient groundwater fluxes was verified during a step drawdown test at the experimental site of Hermalle Sous Argentau (Belgium). The results showed that this method can be used, with optimal results, to follow transient groundwater fluxes. Moreover, performing the FVPDM in several piezometers during a pumping test allows the different flow rates and flow regimes that can occur in the various parts of an aquifer to be determined. The second field test, which aimed to determine the representativity of a control panel for measuring mass flux in groundwater, showed that incorrect evaluation of Darcy fluxes and discharge surfaces can lead to an incorrect estimation of mass fluxes, and that this technique has to be used with caution. Thus, a detailed geological and hydrogeological characterization must be conducted before applying this technique. Finally, the third outcome of this work concerned laboratory experiments. The tests conducted on several types of adsorption materials (Oasis HLB cartridge, TDS-ORGANOSORB 10 and TDS-ORGANOSORB 10-AA), in order to determine the optimum medium for dimensioning the passive sampler, highlighted the necessity of finding a material with a reversible adsorption tendency to completely satisfy the requirements of the new passive sampling technique.
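As a hedged illustration of the quantity being measured (a standard control-panel discretization; the symbols are assumptions, not those of the thesis), the total contaminant mass flux through a control panel is obtained by summing, over the panel cells, the product of the measured concentration, the Darcy flux and the discharge surface:

```latex
% J_i : mass flux through panel cell i             [M T^{-1}]
% C_i : contaminant concentration in cell i        [M L^{-3}]
% q_i : Darcy flux (specific discharge) in cell i  [L T^{-1}]
% A_i : discharge surface of cell i                [L^2]
\[
  M_{tot} = \sum_{i} J_i = \sum_{i} C_i \, q_i \, A_i
\]
```

This makes explicit why a wrong Darcy flux or discharge surface propagates directly into the mass flux estimate, as noted for the second field test.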
Abstract:
Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards on how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of "replication by other scientists" in reference to computations is more commonly known as "reproducible research". In this context, the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of enabling scientists to submit, together with the article, the computational materials (software, data, etc.) that were used to produce the article's contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform independently of the chosen OS, to confirm or invalidate it, and especially to allow its reuse to produce new results. This procedure is of little help, however, without minimal methodological support. In fact, raw data sets and software are difficult to exploit without the logic that guided their use or production. This led us to conclude that, in addition to the data sets and the software, an additional element must be provided: the workflow that links all of them.
Abstract:
Water vapour, despite being a minor constituent of the Martian atmosphere, with a precipitable amount of less than 70 pr. μm, attracts considerable attention in the scientific community because of its potential importance for past life on Mars. The partial pressure of water vapour is highly variable because of its seasonal condensation onto the polar caps and its exchange with a subsurface reservoir. It is also known to drive photochemical processes: photolysis of water produces H, OH, HO2 and other odd hydrogen compounds, which in turn destroy ozone. Consequently, the abundance of water vapour is anti-correlated with ozone abundance. The Herschel Space Observatory provides, for the first time, the possibility of retrieving vertical water profiles in the Martian atmosphere. Herschel will contribute to this topic with its guaranteed-time key project "Water and related chemistry in the solar system". Observations of Mars by the Heterodyne Instrument for the Far Infrared (HIFI) and the Photodetector Array Camera and Spectrometer (PACS) onboard Herschel are planned within this programme. HIFI, with its high spectral resolution, enables accurate observations of vertically resolved H2O and temperature profiles in the Martian atmosphere. Unlike HIFI, PACS is not capable of resolving the shapes of molecular lines. However, our present study of PACS observations of the Martian atmosphere shows that their vertical sensitivity can be improved by using multiple-line observations with different line opacities. We have investigated the possibility of retrieving vertical profiles of temperature and molecular abundances of minor species, including H2O, in the Martian atmosphere using PACS. In this paper, we report that PACS is able to provide water vapour vertical profiles for the Martian atmosphere and we present the expected spectra for future PACS observations. We also show that the spectral resolution does not allow the retrieval of several studied minor species, such as H2O2, HCl, NO and SO2.
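The multiple-line idea can be sketched with the standard formal solution of the radiative transfer equation for an emission spectrum (an illustrative textbook formulation, not the actual PACS retrieval scheme):

```latex
% I_nu : observed intensity at frequency \nu
% B_nu : Planck source function at local temperature T(z)
% \tau_\nu(z) = \int_z^\infty \kappa_\nu(z')\,n(z')\,dz'  (optical depth above z)
\[
  I_\nu = \int_0^\infty B_\nu\!\big(T(z)\big)\,
          \underbrace{\frac{\partial}{\partial z} e^{-\tau_\nu(z)}}_{W_\nu(z)}\,dz
\]
% An optically thick line has a weighting function W_\nu(z) peaking high in the
% atmosphere, an optically thin line peaks low; combining lines of different
% opacity therefore constrains the vertical profile even without resolving
% individual line shapes.
```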
Abstract:
Balancing the frequently conflicting priorities of conservation and economic development poses a challenge to management of the Swiss Alps Jungfrau-Aletsch World Heritage Site (WHS). This is a complex societal problem that calls for a knowledge-based solution. This in turn requires a transdisciplinary research framework in which problems are defined and solved cooperatively by actors from the scientific community and the life-world. In this article we re-examine studies carried out in the region of the Swiss Alps Jungfrau-Aletsch WHS, covering three key issues prevalent in transdisciplinary settings: integration of stakeholders into participatory processes; perceptions and positions; and negotiability and implementation. In the case of the Swiss Alps Jungfrau-Aletsch WHS the transdisciplinary setting created a situation of mutual learning among stakeholders from different levels and backgrounds. However, the studies showed that the benefits of such processes of mutual learning are continuously at risk of being diminished by the power play inherent in participatory approaches.
Abstract:
When it comes to helping to shape sustainable development, research is most useful when it bridges the science–implementation/management gap and when it brings development specialists and researchers into a dialogue (Hurni et al. 2004); can a peer-reviewed journal contribute to this aim? In the classical system for validation and dissemination of scientific knowledge, journals focus on knowledge exchange within the academic community and do not specifically address a ‘life-world audience’. Within a North-South context, another knowledge divide is added: the peer review process excludes a large proportion of scientists from the South from participating in the production of scientific knowledge (Karlsson et al. 2007). Mountain Research and Development (MRD) is a journal whose mission is based on an editorial strategy to build the bridge between research and development and ensure that authors from the global South have access to knowledge production, ultimately with a view to supporting sustainable development in mountains. In doing so, MRD faces a number of challenges that we would like to discuss with the td-net community, after having presented our experience and strategy as editors of this journal. MRD was launched in 1981 by mountain researchers who wanted mountains to be included in the 1992 Rio process. In the late 1990s, MRD realized that the journal needed to go beyond addressing only the scientific community. It therefore launched a new section addressing a broader audience in 2000, with the aim of disseminating insights into, and recommendations for, the implementation of sustainable development in mountains. In 2006, we conducted a survey among MRD’s authors, reviewers, and readers (Wymann et al. 2007): respondents confirmed that MRD had succeeded in bridging the gap between research and development. But we realized that MRD could become an even more efficient tool for sustainability if development knowledge were validated: in 2009, we began submitting ‘development’ papers (‘transformation knowledge’) to external peer review of a kind different from the scientific-only peer review (for ‘systems knowledge’). At the same time, the journal became open access in order to increase the permeability between science and society, and ensure greater access for readers and authors in the South. We are currently rethinking our review process for development papers, with a view to creating more space for communication between science and society, and enhancing the co-production of knowledge (Roux 2008). Hopefully, these efforts will also contribute to the urgent debate on the ‘publication culture’ needed in transdisciplinary research (Kueffer et al. 2007).
Abstract:
Partnership Actions for Mitigating Syndromes (PAMS) are small transdisciplinary projects which bring scientific research insights from the NCCR North-South into policy and practice. They are implemented by researchers from different disciplines in collaboration with non-scientific actors. PAMS aim to implement and test approaches, methods and tools developed in research, in order to identify promising strategies and potentials for sustainable development. In this sense, they are solution-oriented. This paper will provide insights into our experience with PAMS, with a special focus on the implementation of transdisciplinarity and its outcomes. From 2001 to 2010, 77 PAMS were implemented in Africa, Asia and Latin America. An internal evaluation of the first 55 projects was conducted in 2006. Results of this evaluation led to a refinement and improvement of the tool. A second internal evaluation is currently underway in the NCCR North-South. This evaluation will provide an overview of 22 new PAMS. We will look at partners involved, project beneficiaries, activities implemented, outcomes achieved, and lessons learnt. In the first evaluation, transdisciplinarity was considered as “a form of collaboration within scientific fields … and as a form of continuous dialogue between research and society” (Messerli et al., 2007). The evaluation report concluded that this understanding of transdisciplinarity was not satisfactorily applied in the 55 projects. Only about half of the PAMS addressed mutual exchange between researchers and society. Some involved only one specific field of research and clearly lacked interdisciplinary co-operation, and most often knowledge was transferred mainly unilaterally from the scientific community to society, without society having any effect on science. It was therefore recommended to address transdisciplinarity more carefully in Phase 2 PAMS. The second evaluation, which is currently under way, is analysing whether and how this recommendation has been met, based on criteria defined in the NCCR North-South’s Outcome Monitoring Strategy. The analysis is focusing on partners with whom researchers interact and investigating whether practices have changed both in research and society. We are also exploring the role of researchers in PAMS. Preliminary results show that researchers can assume different roles, from direct implementation, mediation, and promotion of social learning between different actors, to giving advice as neutral outsiders.
Abstract:
Background: Inappropriate cross-talk between mammals and their gut microbiota may trigger intestinal inflammation and drive extra-intestinal immune-mediated diseases. Epithelial cells constitute the interface between gut microbiota and host tissue, and may regulate host responses to commensal enteric bacteria. Gnotobiotic animals represent a powerful approach to studying bacterial-host interactions but are not readily accessible to the wider scientific community. We aimed at refining a protocol that would robustly deplete the cultivable intestinal microbiota of conventionally raised mice and that would prove to have significant biologic validity. Methodology/Principal Findings: Previously published protocols for depleting mice of their intestinal microbiota by administering broad-spectrum antibiotics in drinking water were difficult to reproduce. We show that twice-daily delivery of antibiotics by gavage depleted mice of their cultivable fecal microbiota and reduced the fecal bacterial DNA load 400-fold while ensuring the animals' health. Mice subjected to the protocol for 17 days displayed enlarged ceca, reduced Peyer's patches and small spleens. Antibiotic treatment significantly reduced the expression of antimicrobial factors to a level similar to that of germ-free mice and altered the expression of 517 genes in total in the colonic epithelium. Genes involved in the cell cycle were significantly altered, concomitant with reduced epithelial proliferative activity in situ as assessed by Ki-67 expression, suggesting that the commensal microbiota drives cellular proliferation in the colonic epithelium. Conclusion: We present a robust protocol for depleting conventionally raised mice of their cultivable intestinal microbiota with antibiotics by gavage and show that the biological effect of this depletion phenocopies physiological characteristics of germ-free mice.
Abstract:
Fuel cells are currently a topic of high interest in the scientific community because of their ability to convert chemical energy into electrical energy efficiently. This thesis is focused on solid oxide fuel cells (SOFCs) because of their fuel flexibility, and is specifically concerned with the anode properties of SOFCs. The anodes are composed of a ceramic material (yttria-stabilized zirconia, or YSZ) and a conducting material. Recent research has shown that an infiltrated anode may offer better performance at a lower cost. This thesis focuses on the creation of a model of an infiltrated anode that mimics the underlying physics of the production process. Using the model, several key parameters for anode performance are considered: the initial volume fraction of YSZ in the slurry before sintering, the final porosity of the composite anode after sintering, and the sizes of the YSZ and conducting particles in the composite. The performance measures of the anode, namely the percolation threshold and the effective conductivity, are analyzed as functions of these input parameters. Simple two- and three-dimensional percolation models are used to determine the conditions under which the full infiltrated anode should be investigated. These simpler models showed that the aspect ratio of the anode has no effect on the threshold or the effective conductivity, and that cell sizes of 30³ are needed to obtain accurate conductivity values. The full model of the infiltrated anode is able to predict the performance of SOFC anodes, and it shows that increasing the size of the YSZ particles decreases the percolation threshold and increases the effective conductivity at low conductor loadings. Similar trends are seen for a decrease in final porosity and a decrease in the initial volume fraction of YSZ.
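A minimal sketch of the kind of site-percolation experiment the simpler models describe (illustrative assumptions: a simple cubic lattice of randomly occupied conductor sites, with percolation defined as a face-connected path between two opposite faces; this is not the thesis's infiltration model):

```python
import numpy as np
from scipy.ndimage import label

def percolates(occupied: np.ndarray) -> bool:
    """True if occupied sites form a face-connected path from z=0 to z=L-1."""
    labels, _ = label(occupied)            # 6-connectivity by default in 3D
    top = set(np.unique(labels[:, :, 0])) - {0}
    bottom = set(np.unique(labels[:, :, -1])) - {0}
    return bool(top & bottom)

def spanning_fraction(p: float, size: int = 30, trials: int = 50,
                      rng=np.random.default_rng(0)) -> float:
    """Fraction of random lattices at occupation probability p that percolate."""
    hits = sum(percolates(rng.random((size, size, size)) < p)
               for _ in range(trials))
    return hits / trials

# Sweep the conductor loading; the spanning probability jumps near the
# site-percolation threshold of the simple cubic lattice (~0.3116).
for p in (0.25, 0.29, 0.31, 0.33, 0.37):
    print(f"p = {p:.2f}  spanning fraction = {spanning_fraction(p):.2f}")
```

Effective conductivity studies typically add a resistor-network solve on the spanning cluster; the sketch above only captures the threshold part of the analysis.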
Abstract:
With many different investigators studying the same disease and with a strong commitment to publish supporting data in the scientific community, there are often many different datasets available for any given disease. Hence there is substantial interest in finding methods for combining these datasets to provide better and more detailed understanding of the underlying biology. We consider the synthesis of different microarray data sets using a random effects paradigm and demonstrate how relatively standard statistical approaches yield good results. We identify a number of important and substantive areas which require further investigation.
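The random-effects paradigm mentioned here can be illustrated with the classical DerSimonian–Laird estimator, one standard instance of this approach (not necessarily the authors' exact method; the per-dataset effect sizes below are invented for illustration):

```python
import numpy as np

def dersimonian_laird(effects: np.ndarray, variances: np.ndarray):
    """Random-effects pooling of one gene's effect size across datasets.

    effects[i], variances[i]: estimated effect and within-study variance
    from dataset i. Returns the pooled effect and its standard error.
    """
    w = 1.0 / variances                           # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - mu_fe) ** 2)        # Cochran's Q heterogeneity
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_re = 1.0 / (variances + tau2)               # random-effects weights
    mu_re = np.sum(w_re * effects) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return mu_re, se_re

# Three hypothetical microarray datasets measuring the same gene:
effects = np.array([0.80, 0.35, 0.60])            # e.g. log2 fold changes
variances = np.array([0.04, 0.02, 0.05])
mu, se = dersimonian_laird(effects, variances)
print(f"pooled effect = {mu:.3f} +/- {se:.3f}")
```

The between-study variance tau2 is what distinguishes the random-effects synthesis from simply averaging the datasets: it down-weights the pooled estimate when the datasets disagree more than their within-study variances would predict.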