817 results for Web-based instruction -- Case studies


Relevance: 100.00%

Publisher:

Abstract:

Prenatal ultrasound can often reliably identify fetal anatomic anomalies, particularly in the hands of an experienced ultrasonographer. Given the large number of existing syndromes and the significant overlap in prenatal findings, antenatal differentiation for syndrome diagnosis is difficult. We constructed a hierarchical tree of 1140 sonographic markers and submarkers, organized by organ system. Subsequently, a database of prenatally diagnosable syndromes was built. An internet-based search engine was then designed to query the syndrome database by one or more sonographic markers. Future developments will include a database of magnetic resonance imaging findings as well as further refinements to the search engine to allow prioritization based on the incidence of syndromes and markers.
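As an illustration of the kind of marker-based lookup such a search engine performs, the following minimal Python sketch ranks syndromes by how many of the queried sonographic markers they match. The syndrome names, marker sets and ranking rule are invented placeholders, not the actual database or engine described above.

```python
# Hypothetical marker-indexed lookup; syndrome names and marker sets are invented
# placeholders, not entries from the thesis database.
SYNDROMES = {
    "Syndrome A": {"ventriculomegaly", "postaxial polydactyly", "echogenic kidneys"},
    "Syndrome B": {"ventriculomegaly", "cardiac defect", "cleft lip"},
    "Syndrome C": {"short long bones", "cardiac defect"},
}

def search(query_markers):
    """Return syndromes ranked by how many of the queried markers they contain."""
    query = set(query_markers)
    hits = [(name, len(query & markers)) for name, markers in SYNDROMES.items()]
    return sorted([h for h in hits if h[1] > 0], key=lambda h: h[1], reverse=True)

print(search(["ventriculomegaly", "cleft lip"]))
# -> [('Syndrome B', 2), ('Syndrome A', 1)]
```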

Relevance: 100.00%

Publisher:

Abstract:

A procedure was developed for determining Pu-241 activity in environmental samples. This beta-emitting plutonium isotope was measured by ultra-low-level liquid scintillation counting after several separation and purification steps involving a highly selective extraction chromatographic resin (Eichrom-TEVA). Because no certified reference material exists for Pu-241, the method was validated using four IAEA reference sediments with information values for Pu-241. The method was then used to determine Pu-241 activity in alpine soils of Switzerland and France. The Pu-241/Pu-239,Pu-240 and Pu-238/Pu-239,Pu-240 activity ratios confirmed that the Pu contamination in these alpine soils originated mainly from global fallout from the nuclear weapon tests of the 1950s and 1960s. Estimating the date of the contamination with the Pu-241/Am-241 age-dating method further confirmed this origin, although the method is limited to samples in which Pu-Am fractionation is insignificant. The contribution of the Chernobyl accident, if any, is negligible.
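A minimal sketch of the ingrowth calculation behind Pu-241/Am-241 age dating is shown below, assuming no Am-241 was present at the time of deposition and no Pu-Am fractionation since. The half-life values are standard literature figures, and the solver bracket and example ratio are illustrative choices, not part of the published procedure.

```python
import numpy as np
from scipy.optimize import brentq

# Standard half-lives (years); decay constants follow from lambda = ln(2) / T_half.
LAM_PU241 = np.log(2) / 14.35     # Pu-241
LAM_AM241 = np.log(2) / 432.6     # Am-241

def pu_am_activity_ratio(t):
    """Pu-241/Am-241 activity ratio t years after deposition, assuming the sample
    contained no Am-241 at t = 0 and that Pu and Am have not fractionated since."""
    n_pu = np.exp(-LAM_PU241 * t)
    n_am = LAM_PU241 / (LAM_AM241 - LAM_PU241) * (np.exp(-LAM_PU241 * t) - np.exp(-LAM_AM241 * t))
    return (LAM_PU241 * n_pu) / (LAM_AM241 * n_am)

def age_from_ratio(measured_ratio):
    """Years since deposition such that the predicted ratio matches the measurement."""
    return brentq(lambda t: pu_am_activity_ratio(t) - measured_ratio, 0.01, 200.0)

# Illustrative use: a measured Pu-241/Am-241 activity ratio of about 1.9
# corresponds to roughly 60 years since deposition.
print(round(age_from_ratio(1.9), 1))
```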

Relevance: 100.00%

Publisher:

Abstract:

Sustainable resource use is one of the most important environmental issues of our time. It is closely related to discussions on the 'peaking' of various natural resources that serve as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate the negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint through several case studies. Four of these case studies relate to resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of selected resource stocks and flows over the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource). The case of lithium (a critical metal) was also analysed briefly, in a qualitative manner and from an electric mobility perspective. In addition to the Geneva case studies, this thesis includes a case study on the sustainability of space life support systems, which provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource use perspective. In this case study, the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent physico-chemical and biological life support systems, respectively. Space life support systems could in fact be used as a kind of 'laboratory of sustainability', given that they are closed and relatively simple compared with complex and open terrestrial systems such as the Canton of Geneva. The analysis method used in the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. For the space life support systems, material flow analysis was also employed, but as the available data on the dynamic behaviour of the systems were insufficient, only static simulations could be performed. The results of the Geneva case studies show the following: were resource use to follow population growth, resource consumption would increase by a factor of nearly 1.2 by 2030 and 1.5 by 2080. A complete transition to electric mobility would be expected to increase copper consumption per capita only slightly (+5%), while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine, for example; however, the health and environmental impacts of these options have yet to be studied. Increasing wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of the space life support systems, BIORAT outperforms ARES in resource use but not in energy use. However, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright.
In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.
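The dynamic material flow models themselves are not reproduced here, but the following toy Python sketch shows the general bookkeeping such a model rests on: an in-use stock fed by an inflow and depleted by an end-of-life outflow, with a recycling scenario reducing the primary (imported) input. All parameter values are illustrative placeholders, not the Geneva data.

```python
def simulate_stock(inflow, outflow_rate, recycling_rate, years, stock0=0.0):
    """Toy dynamic material flow model: an in-use stock fed by a constant inflow and
    losing a fixed fraction per year; recycling of the outflow reduces primary inputs."""
    stock, history = stock0, []
    for year in range(years):
        outflow = outflow_rate * stock           # end-of-life flow leaving the stock
        recycled = recycling_rate * outflow      # secondary material recovered
        primary = inflow - recycled              # primary (imported) material required
        stock += inflow - outflow                # stock balance for the year
        history.append({"year": year, "stock": stock, "primary_input": primary})
    return history

# Baseline versus an increased-recycling scenario (placeholder parameter values).
baseline = simulate_stock(inflow=10.0, outflow_rate=0.03, recycling_rate=0.0, years=50)
high_recycling = simulate_stock(inflow=10.0, outflow_rate=0.03, recycling_rate=0.6, years=50)
print(baseline[-1], high_recycling[-1])
```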

Relevance: 100.00%

Publisher:

Abstract:

Résumé: This thesis is devoted to the analysis, modelling and visualisation of spatially referenced environmental data using machine learning algorithms. Machine learning can broadly be regarded as a subfield of artificial intelligence concerned with developing techniques and algorithms that allow a machine to learn from data. In this thesis, machine learning algorithms are adapted to environmental data and to spatial prediction. Why machine learning? Because most machine learning algorithms are universal, adaptive, non-linear, robust and efficient modelling tools. They can solve classification, regression and probability density modelling problems in high-dimensional spaces composed of spatially referenced explanatory variables ('geo-features') in addition to geographical coordinates. They are also well suited to implementation as decision-support tools for environmental questions ranging from pattern recognition to modelling, prediction and automatic mapping. Their efficiency is comparable to that of geostatistical models in the space of geographical coordinates, but they are indispensable for high-dimensional data that include geo-features. The most important and most popular machine learning algorithms are presented theoretically and implemented as software for the environmental sciences. The main algorithms described are the multilayer perceptron (MLP), the best-known algorithm in artificial intelligence; general regression neural networks (GRNN); probabilistic neural networks (PNN); self-organising maps (SOM); Gaussian mixture models (GMM); radial basis function networks (RBF); and mixture density networks (MDN). This range of algorithms covers varied tasks such as classification, regression and probability density estimation. Exploratory data analysis (EDA) is the first step of any data analysis. In this thesis, exploratory spatial data analysis (ESDA) is treated both with the traditional geostatistical approach, through experimental variography, and according to machine learning principles. Experimental variography, which studies the relationships between pairs of points, is a basic tool of the geostatistical analysis of anisotropic spatial correlations and detects spatial patterns describable by two-point statistics. The machine learning approach to ESDA is presented through the k-nearest neighbours method, which is very simple and has excellent interpretation and visualisation properties. An important part of the thesis deals with topical questions such as the automatic mapping of spatial data. The general regression neural network is proposed to solve this task efficiently.
The performance of the GRNN is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, on which the GRNN significantly outperformed all other methods, particularly in emergency situations. The thesis consists of four chapters: theory, applications, software tools and guided examples. An important part of the work is a collection of software tools, Machine Learning Office. This software collection has been developed over the last 15 years and has been used for teaching numerous courses, including international workshops in China, France, Italy, Ireland and Switzerland, as well as in fundamental and applied research projects. The case studies considered cover a wide spectrum of real low- and high-dimensional geo-environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, uncertainty mapping for decision support, and the assessment of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualisation were also developed, with care taken to provide a user-friendly, easy-to-use interface.
Machine Learning for geospatial data: algorithms, software tools and case studies
Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces composed of geographical space and additional relevant spatially referenced features. They are well suited to implementation as predictive engines in decision support systems for environmental data mining, including pattern recognition, modeling and prediction, as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for the geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to their software implementation. The main algorithms and models considered are the following: the multi-layer perceptron (a workhorse of machine learning), general regression neural networks, probabilistic neural networks, self-organising (Kohonen) maps, Gaussian mixture models, radial basis function networks, and mixture density networks. This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis.
In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both a traditional geostatistical approach, namely experimental variography, and machine learning. Experimental variography is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to reveal spatial patterns, at least those described by two-point statistics. A machine learning approach to ESDA is presented by applying the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties. An important part of the thesis deals with a highly topical issue, namely the automatic mapping of geospatial data. The general regression neural network (GRNN) is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on the Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. The Machine Learning Office tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, the classification of soil types and hydrogeological units, decision-oriented mapping with uncertainties, and the assessment and susceptibility mapping of natural hazards (landslides, avalanches). Complementary tools for exploratory data analysis and visualisation were developed as well. The software is user-friendly and easy to use.
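For readers unfamiliar with the GRNN, the model reduces to Nadaraya-Watson kernel regression: a prediction is a kernel-weighted average of the training targets, with a single bandwidth parameter to tune. The sketch below is a minimal NumPy illustration of that idea, not the Machine Learning Office implementation; the Gaussian kernel, the bandwidth value and the synthetic data are assumptions.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=1.0):
    """GRNN / Nadaraya-Watson estimate: kernel-weighted average of training targets."""
    X_train = np.atleast_2d(X_train)
    preds = []
    for x in np.atleast_2d(X_query):
        d2 = np.sum((X_train - x) ** 2, axis=1)      # squared Euclidean distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))         # Gaussian kernel weights
        preds.append(w @ y_train / (w.sum() + 1e-12))
    return np.array(preds)

# Illustrative use: interpolate a value at an unsampled location from scattered 2-D data.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(50, 2))           # placeholder monitoring locations
values = np.sin(coords[:, 0] / 20) + 0.1 * rng.normal(size=50)
print(grnn_predict(coords, values, [[40.0, 55.0]], sigma=10.0))
```

In practice the bandwidth sigma would be selected by cross-validation, which is what makes the approach suitable for automatic mapping.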

Relevance: 100.00%

Publisher:

Abstract:

BACKGROUND: Patients with rare diseases such as congenital hypogonadotropic hypogonadism (CHH) are dispersed, often challenged to find specialized care and face other health disparities. The internet has the potential to reach a wide audience of rare disease patients and can help connect patients and specialists. Therefore, this study aimed to: (i) determine if web-based platforms could be effectively used to conduct an online needs assessment of dispersed CHH patients; (ii) identify the unmet health and informational needs of CHH patients and (iii) assess patient acceptability regarding patient-centered, web-based interventions to bridge shortfalls in care. METHODS: A sequential mixed-methods design was used: first, an online survey was conducted to evaluate health promoting behavior and identify unmet health and informational needs of CHH men. Subsequently, patient focus groups were held to explore specific patient-identified targets for care and to examine the acceptability of possible online interventions. Descriptive statistics and thematic qualitative analyses were used. RESULTS: 105 male participants completed the online survey (mean age 37 ± 11, range 19-66 years) representing a spectrum of patients across a broad socioeconomic range and all but one subject had adequate healthcare literacy. The survey revealed periods of non-adherence to treatment (34/93, 37%) and gaps in healthcare (36/87, 41%) exceeding one year. Patient focus groups identified lasting psychological effects related to feelings of isolation, shame and body-image concerns. Survey respondents were active internet users, nearly all had sought CHH information online (101/105, 96%), and they rated the internet, healthcare providers, and online community as equally important CHH information sources. Focus group participants were overwhelmingly positive regarding online interventions/support with links to reach expert healthcare providers and for peer-to-peer support. CONCLUSION: The web-based needs assessment was an effective way to reach dispersed CHH patients. These individuals often have long gaps in care and struggle with the psychosocial sequelae of CHH. They are highly motivated internet users seeking information and tapping into online communities and are receptive to novel web-based interventions addressing their unmet needs.

Relevance: 100.00%

Publisher:

Abstract:

Industrial symbiosis (IS) has emerged as a self-organizing business strategy among firms willing to cooperate to improve their economic and environmental performance. The adoption of such cooperative strategies relates to the increasing costs of waste management, most of which are driven by policy and legislative requirements. The development of IS depends on an enabling context of social, informational, technological, economic and political factors. The power to influence this context varies among the agents involved, such as governments, businesses or coordinating entities. Governmental intervention, as manifested through policies, can influence a wider range of factors, and we believe this is an under-researched area. This paper critically appraises waste policy interventions from the supra-national to the sub-national level of government. A case study methodology is applied to four European countries (Denmark, the UK, Portugal and Switzerland) in which IS has emerged or is being fostered. The findings suggest that there are commonalities in the policy instruments that may have led to an IS-enabling context. The paper concludes with lessons learnt and recommendations on shaping the policy context for IS development.

Relevance: 100.00%

Publisher:

Abstract:

In an online environment, student tutoring plays a fundamental role. Its core is accompanying students throughout their academic programme, from the moment they express interest in enrolling until they graduate. The available literature on how to organise tutoring in virtual environments is scarce. This paper, based on the experience of the Universitat Oberta de Catalunya (UOC), analyses two levels of specialisation of online tutoring. At a first, organisational level, it presents and evaluates the specialisation of tutoring coordination according to the type of knowledge required to perform it, academic or administrative. This produces a matrix-type organisational structure that brings flexibility and specialised knowledge to the tutoring activity and is rated very positively by the parties involved. At a second level, it analyses the separation of two specialised types of tutoring, one for new students and one for continuing students, according to how long the tutored student has been enrolled. The results of the qualitative and quantitative analyses carried out do not allow us to conclude that this second form of specialisation clearly contributes to better achieving the goals of the tutoring function.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate web-based information on bipolar disorder and to assess particular content quality indicators. METHODS: Two keywords, "bipolar disorder" and "manic depressive illness", were entered into popular World Wide Web search engines. Websites were assessed with a standardized proforma designed to rate sites on the basis of accountability, presentation, interactivity, readability and content quality. The "Health on the Net" (HON) quality label and DISCERN scale scores were used to verify their efficiency as quality indicators. RESULTS: Of the 80 websites identified, 34 were included. Based on the outcome measures, the content quality of the sites turned out to be good. The content quality of websites dealing with bipolar disorder is significantly explained by readability, accountability and interactivity, as well as by a global score. CONCLUSIONS: The overall content quality of the studied bipolar disorder websites is good.

Relevance: 100.00%

Publisher:

Abstract:

This paper presents research on the conversion of non-accessible web pages containing mathematical formulae into accessible versions through an OCR (Optical Character Recognition) tool. The objective of this research is twofold. First, to establish criteria for evaluating the potential accessibility of mathematical web sites, i.e. the feasibility of converting non-accessible (non-MathML) math sites into accessible ones (MathML). Second, to propose a data model and a mechanism for publishing evaluation results, making them available to the educational community, which may use them as a quality measure when selecting learning material. Results show that conversion using OCR tools is not viable for math web pages, mainly for two reasons: many of these pages are designed to be interactive, which makes a correct conversion difficult, if not almost impossible; and formulae (whether images or text) have been written without taking into account standards of mathematical writing, so OCR tools do not properly recognize math symbols and expressions. In spite of these results, we think the proposed methodology for creating and publishing evaluation reports may be rather useful in other accessibility assessment scenarios.
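The paper's own data model is not reproduced in this abstract; the following hypothetical Python sketch merely illustrates the kind of record an evaluation report could publish (page URL, OCR tool used, per-formula recognition outcome, and an aggregate conversion rate). All field names, values and the example tool name are assumptions, not the proposed model.

```python
from dataclasses import dataclass, field, asdict
from typing import List
import json

@dataclass
class FormulaResult:
    formula_id: str
    source_format: str          # "image" or "text" in the original page
    recognized: bool            # whether the OCR tool produced usable MathML
    notes: str = ""

@dataclass
class PageEvaluation:
    url: str
    evaluated_on: str
    ocr_tool: str               # assumption: any math OCR tool could be recorded here
    formulas: List[FormulaResult] = field(default_factory=list)

    def conversion_rate(self) -> float:
        """Share of formulae that converted correctly (0.0 when none were found)."""
        return sum(f.recognized for f in self.formulas) / len(self.formulas) if self.formulas else 0.0

report = PageEvaluation(
    url="http://example.org/calculus/lesson1",
    evaluated_on="2011-05-01",
    ocr_tool="some-math-ocr-tool",
    formulas=[FormulaResult("f1", "image", False, "symbols not recognized")],
)
print(report.conversion_rate())
print(json.dumps(asdict(report), indent=2))
```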

Relevance: 100.00%

Publisher:

Abstract:

The objective of the dissertation is to increase understanding and knowledge in the field where group decision support system (GDSS) and technology selection research overlap in the strategic sense. The purpose is to develop pragmatic, unique and competent management practices and processes for strategic technology assessment and selection from the whole company's point of view. The combination of GDSS and technology selection is approached from the points of view of the core competence concept, the lead user method, and different technology types. This research aims to find out how the GDSS contributes to the technology selection process, what aspects should be considered when selecting technologies to be developed or acquired, and what advantages and restrictions the GDSS has in the selection processes. These research objectives are discussed on the basis of experiences and findings in real-life selection meetings. The research has mainly been carried out with constructive, case study research methods. The study contributes novel ideas to present knowledge and the prior literature on GDSS and technology selection. Academic and pragmatic research has been conducted in four areas: 1) the potential benefits of the group support system with the lead user method, where the needs assessment process is positioned as information gathering for the selection of wireless technology development projects; 2) integrated technology selection and core competencies management processes, both in theory and in practice; 3) the potential benefits of the group decision support system in the technology selection processes of different technology types; and 4) linkages between technology selection and R&D project selection in innovative product development networks. New knowledge and understanding have been created on the practical utilization of the GDSS in technology selection decisions. The study demonstrates that technology selection requires close cooperation between different departments, functions, and strategic business units in order to gather the best knowledge for decision making. The GDSS is proved to be an effective way to promote communication and cooperation between the selectors. The constructs developed in this study have been tested in many industry fields, for example the information and communication, forest, telecommunication, metal, software, and miscellaneous industries, as well as in non-profit organizations. The pragmatic results in these organizations are among the most relevant proofs that confirm the scientific contribution of the study, according to the principles of the constructive research approach.

Relevance: 100.00%

Publisher:

Abstract:

The objective of the thesis is to structure and model the factors that contribute to, and can be used in evaluating, project success. The purpose of this thesis is to enhance the understanding of three research topics. The goal setting process, success evaluation and the decision-making process are studied in the context of a project, a business unit and its business environment. To achieve the objective, three research questions are posed: 1) how to set measurable project goals, 2) how to evaluate project success and 3) how to affect project success with managerial decisions. The main theoretical contribution comes from deriving a synthesis of these research topics, which have mostly been discussed apart from each other in prior research. The research strategy of the study has features from at least the constructive, nomothetical, and decision-oriented research approaches. This strategy guides the theoretical and empirical parts of the study. Relevant concepts and a framework are composed on the basis of prior research contributions within the problem area. A literature review is used to derive constructs of factors within the framework. They are related to project goal setting, success evaluation, and decision making. On the basis of this, the case study method is applied to complement the framework. The empirical data include one product development program, three construction projects, as well as one organization development, hardware/software, and marketing project in their contexts. In two of the case studies the analytic hierarchy process is used to formulate a hierarchical model that returns a numerical evaluation of the degree of project success. It has its origin in the solution idea, which in turn has its foundation in the notion of project success. The achieved results are condensed in the form of a process model that integrates project goal setting, success evaluation and decision making. The process of project goal setting is analysed as part of an open system that includes a project, the business unit and its competitive environment. Four main constructs of factors are suggested. First, the project characteristics and requirements are clarified. The second and third constructs comprise the components of client/market segment attractiveness and the sources of competitive advantage. Together they determine the competitive position of a business unit. Fourth, the relevant goals and the situation of a business unit are clarified to stress their contribution to the project goals. Empirical evidence is gained on the exploitation of increased knowledge and on the reaction to changes in the business environment during a project to ensure project success. The relevance of a successful project to a company or a business unit tends to increase the higher the reference level of project goals is set. However, normal performance, or sometimes performance below this normal level, is intentionally accepted. Success measures make project success quantifiable. There are result-oriented, process-oriented and resource-oriented success measures. The study also links result measurements to enablers that portray the key processes. The success measures can be classified into success domains determining the areas on which success is assessed. Empirical evidence is gained on six success domains: strategy, project implementation, product, stakeholder relationships, learning situation and company functions.
However, some project goals, like safety, can be assessed using success measures that belong to two success domains. For example, a safety index is used for assessing occupational safety during a project, which is related to project implementation. Product safety requirements, in turn, are connected to the product characteristics and thus to the product-related success domain. Strategic success measures can be used to weave the project phases together. Empirical evidence on their static nature is gained. In order-oriented projects the project phases are often contractually divided among different suppliers or contractors. A project from the supplier's perspective can represent only a part of the "whole project" viewed from the client's perspective. Therefore, static success measures are mostly used within the contractually agreed project scope and duration. Proof is also acquired on the dynamic use of operational success measures. They help to focus on the key issues during each project phase. Furthermore, it is shown that the original success domains and success measures, their weights and target values can change dynamically. New success measures can replace the old ones to correspond better with the emphasis of the particular project phase. This adjustment concentrates on the key decision milestones. As a conclusion, the study suggests a combination of static and dynamic success measures. Their linkage to an incentive system can make project management proactive, enable fast feedback and enhance the motivation of the personnel. It is argued that the sequence of effective decisions is closely linked to the dynamic control of project success. According to the definition used, effective decisions aim at adequate decision quality and decision implementation. The findings support the view that project managers construct and use a chain of key decision milestones to evaluate and affect success during a project. These milestones can be seen as a part of the business processes. Different managers prioritise the key decision milestones to varying degrees. Divergent managerial perspectives, power, responsibilities and involvement during a project offer some explanation for this. Finally, the study introduces the use of Hard Gate and Soft Gate decision milestones. The managers may use the former milestones to provide decision support on result measurements and ad hoc critical conditions. At the latter milestones they may make intermediate success evaluations also on the basis of other types of success measures, like process and resource measures.
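As a minimal illustration of how the analytic hierarchy process can return a numerical evaluation of the degree of project success, the sketch below derives priority weights for three success domains from a reciprocal pairwise-comparison matrix (principal eigenvector) and combines them with per-domain ratings. The comparison matrix and ratings are invented examples, not data from the case studies.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix
    (principal right eigenvector, normalised to sum to one)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    w = np.abs(principal)
    return w / w.sum()

# Invented example: comparing three success domains
# (strategy, project implementation, product) pairwise on Saaty's 1-9 scale.
C = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights = ahp_weights(C)

# Degree of project success as a weighted sum of per-domain ratings (0-1, illustrative).
ratings = np.array([0.8, 0.6, 0.9])
print(weights, float(weights @ ratings))
```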

Relevance: 100.00%

Publisher:

Abstract:

In the last decade, an important debate has arisen about the characteristics of today's students due to their intensive experience as users of ICT. The main belief is that frequent use of technologies in everyday life implies that competent users are able to transfer their digital skills to learning activities. However, empirical studies developed in different countries reveal similar results suggesting that the "digital native" label does not provide evidence of a better use of technology to support learning. The debate has to go beyond the characteristics of the new generation and focus on the implications of being a learner in a digitalised world. This paper is based on the hypothesis that the use of technology to support learning is not related to whether a student belongs to the Net Generation, but that it is mainly influenced by the teaching model. The study compares behaviour and preferences towards ICT use in two groups of university students: face-to-face students and online students. A questionnaire was applied to a sample of students from five universities with different characteristics (one offers online education and four offer face-to-face education with LMS teaching support). Findings suggest that although access to and use of ICT is widespread, the influence of teaching methodology is very decisive. For academic purposes, students seem to respond to the requirements of their courses, programmes, and universities. There is a clear relationship between students' perception of usefulness regarding certain ICT resources and their teachers' suggested uses of technologies. The most highly rated technologies correspond with those proposed by teachers. The study shows that the educational model (face-to-face or online) has a stronger influence on students' perception of usefulness regarding ICT support for learning than the fact of being a digital native.

Relevance: 100.00%

Publisher:

Abstract:

In this study, a wrapper approach was applied to objectively select the most important variables related to two different anaerobic digestion imbalances, acidogenic states and foaming. This feature selection method, implemented with artificial neural networks (ANN), was performed using input and output data from a fully instrumented pilot plant (a 1 m³ upflow fixed-bed digester). Results for acidogenic states showed that pH, volatile fatty acids, and inflow rate were the most relevant variables. Results for foaming showed that inflow rate and total organic carbon were among the relevant variables, both of which are related to the feed loading of the digester. Because there is no complete agreement on the causes of foaming, these results highlight the role of digester feeding patterns in the development of foaming.
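The exact wrapper procedure used in the study is not detailed in the abstract; the sketch below shows one common variant, sequential forward selection wrapped around a small neural network classifier in scikit-learn. The variable matrix and labels are synthetic placeholders, not the pilot-plant data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the process variables (e.g. pH, VFA, inflow rate, TOC, ...).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)    # placeholder "imbalance" label

# Wrapper selection: candidate feature subsets are scored by cross-validating the ANN itself.
ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
selector = SequentialFeatureSelector(ann, n_features_to_select=3, direction="forward", cv=3)
selector.fit(X, y)
print(selector.get_support())                    # boolean mask of the retained variables
```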

Relevance: 100.00%

Publisher:

Abstract:

The article discusses the development of WEBDATANET, established in 2011, which aims to create a multidisciplinary network of web-based data collection experts in Europe. Topics include the presence of 190 experts in 30 European countries and abroad, and the establishment of web-based teaching and discussion platforms, working groups and task forces. Also discussed is the scope of the research carried out by WEBDATANET. In light of the growing importance of web-based data in the social and behavioral sciences, WEBDATANET was established in 2011 as a COST Action (IS 1004) to create a multidisciplinary network of web-based data collection experts: (web) survey methodologists, psychologists, sociologists, linguists, economists, Internet scientists, and media and public opinion researchers. The aim was to accumulate and synthesize knowledge regarding the methodological issues of web-based data collection (surveys, experiments, tests, non-reactive data, and mobile Internet research), and to foster its scientific usage in a broader community.

Relevance: 100.00%

Publisher:

Abstract:

We advocate an evolutionary approach to conservation biology that considers evolutionary history at various levels of biological organization. We review work on three plant taxa, spanning one to several decades, that illustrates extremes in metapopulation functioning. We show how the rare endemics Centaurea corymbosa (Clape Massif, France) and Brassica insularis in Corsica (France) may be caught in an evolutionary trap: disruption of metapopulation functioning due to a lack of colonization of new sites may have counterselected traits such as dispersal ability or self-compatibility, making these species particularly vulnerable to any disturbance. The third case study concerns the evolution of life history strategies in the highly diverse genus Leucadendron of the South African fynbos. There, fire disturbance and the recolonization phase after fires are so integral to the functioning of populations that the recruitment of new individuals is conditioned by fire. We show how past adaptation to different fire regimes and climatic constraints makes species with different life history syndromes more or less vulnerable to global changes. These case studies suggest that management strategies should promote evolutionary potential and evolutionary processes to better protect extant biodiversity and biodiversification.