976 results for scientific methodology
Abstract:
Microbial fuel cell (MFC) research is a rapidly evolving field that lacks established terminology and methods for the analysis of system performance. This makes it difficult for researchers to compare devices on an equivalent basis. The construction and analysis of MFCs requires knowledge of different scientific and engineering fields, ranging from microbiology and electrochemistry to materials and environmental engineering. Describing MFC systems therefore involves an understanding of these different scientific and engineering principles. In this paper, we provide a review of the different materials and methods used to construct MFCs, techniques used to analyze system performance, and recommendations on what information to include in MFC studies and the most useful ways to present results.
Abstract:
A comparative study analyzing the science communication (DC) practiced by the magazines Ciência Hoje (CH), Scientific American Brasil (SAB) and Superinteressante (SI), pointing out convergences and divergences among the three publications. The objective is to analyze how the forms of textual construction and the use of illustrations in the cover stories and cover articles of CH, SAB and SI can effectively contribute to, or interfere with, the science communication they practice. To this end, five basic assumptions are outlined: (1) the publications under analysis prioritize on their covers topics from the Basic Sciences (CB) to the detriment of the Human and Social Sciences (CHS); (2) within the broad field of the Basic Sciences, the covers show a preference for health-related topics; (3) the topics addressed on the covers of CH, SAB and SI do not, in general, coincide with one another, since they do not follow a "hot topic" logic; (4) the frequent use of explanatory elements, figurative terms, diversified information sources and direct quotations in the textual construction of the cover stories and articles, as well as the use of properly contextualized illustrations, help make the texts more intelligible; (5) although all three are considered science communication magazines, CH, SAB and SI exhibit different levels of popularization, depending on the profile of their readers. In line with these assumptions, the specific objectives are: (a) to identify, within the two broad categories (CB and CHS), the most explored topics, grouping them into subcategories to identify affinity/proximity among them; (b) to examine, through the forms of textual construction and the use of illustrations in the cover stories and articles, the criteria used by the publications to communicate science and technology (S&T). To address the parameters established in these objectives, the methodology includes a questionnaire administered to the editors of the publications investigated and content analysis (AC) of the selected sample, which comprises 19 cover stories/articles from CH, SAB and SI published between July 2009 and June 2010. The data collected and duly discussed confirm the stated assumptions, since, in general terms, it is evident that the science communication practiced by the three magazines presents more divergences than convergences. This makes it possible to establish distinct levels of popularization, manifested in how they construct their texts and use illustrations, with greater difficulty in SAB and CH and with more simplified content in SI.
Abstract:
The thesis deals with the background, development and description of a mathematical stock control methodology for use within an oil and chemical blending company, where demand and replenishment lead-times are generally non-stationary. The stock control model proper relies on, as input, adaptive forecasts of demand determined for an economical forecast/replenishment period precalculated on an individual stock-item basis. The control procedure is principally that of the continuous review, reorder level type, where the reorder level and reorder quantity 'float', that is, each changes in accordance with changes in demand. Two versions of the Methodology are presented: a cost minimisation version and a service level version. Recognising the importance of demand forecasts, four recognised variations of the Trigg and Leach adaptive forecasting routine are examined. A fifth variation, developed by the author, is proposed as part of the stock control methodology. The results of testing the cost minimisation version of the Methodology with historical data, by means of a computerised simulation, are presented together with a description of the simulation used. In addition, the performance of the Methodology compares favourably with a rule-of-thumb approach considered by the Company as an interim solution for reducing stock levels. The contribution of the work to the field of scientific stock control is felt to be significant for the following reasons: (1) the Methodology is designed specifically for use with non-stationary demand and for this reason alone appears to be unique; (2) the Methodology is unique in its approach, and the cost-minimisation version is shown to work successfully with the demand data presented; (3) the Methodology and the thesis as a whole fill an important gap between complex mathematical stock control theory and practical application. A brief description of a computerised order processing/stock monitoring system, designed and implemented as a prerequisite for the Methodology's practical operation, is presented as an appendix.
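The Trigg and Leach routine referred to above is simple exponential smoothing whose smoothing constant is reset, at each period, to the absolute value of a tracking signal, so the forecast adapts quickly when demand shifts. The following is a minimal sketch of the classic formulation only (not of the four published variations or of the fifth developed in the thesis); the function name and the gamma constant are illustrative assumptions:

```python
# Minimal sketch of Trigg-Leach adaptive exponential smoothing.
# The gamma smoothing constant and function name are illustrative assumptions.

def trigg_leach_forecast(demand, f0, gamma=0.2):
    """One-step-ahead adaptive forecasts for a demand series."""
    forecast = f0          # current one-step-ahead forecast
    e_smooth = 0.0         # smoothed forecast error
    m_smooth = 1e-9        # smoothed absolute error (non-zero to avoid division by zero)
    forecasts = []
    for d in demand:
        forecasts.append(forecast)
        error = d - forecast
        e_smooth = gamma * error + (1 - gamma) * e_smooth
        m_smooth = gamma * abs(error) + (1 - gamma) * m_smooth
        alpha = abs(e_smooth / m_smooth)   # tracking signal doubles as smoothing constant
        forecast += alpha * error          # adapt faster when errors are biased
    return forecasts

# Example: the forecast responds quickly to a level shift in demand.
print(trigg_leach_forecast([20, 22, 21, 40, 42, 41, 43], f0=20.0))
```

From such forecasts and their error estimates, a floating reorder level and reorder quantity can then be recomputed at each review.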
Abstract:
Intermittent photic stimulation (IPS) is a common procedure performed in the electroencephalography (EEG) laboratory in children and adults to detect abnormal epileptogenic sensitivity to flickering light (i.e., photosensitivity). In practice, substantial variability in outcome is anecdotally found due to the many different methods used per laboratory and country. We believe that standardization of procedure, based on scientific and clinical data, should permit reproducible identification and quantification of photosensitivity. We hope that the use of our new algorithm will help in standardizing the IPS procedure, which in turn may more clearly identify and assist monitoring of patients with epilepsy and photosensitivity. Our algorithm goes far beyond that published in 1999 (Epilepsia, 1999a, 40, 75; Neurophysiol Clin, 1999b, 29, 318): it has substantially increased content, detailing technical and logistical aspects of IPS testing and the rationale for many of the steps in the IPS procedure. Furthermore, our latest algorithm incorporates the consensus of repeated scientific meetings of European experts in this field over a period of 6 years with feedback from general neurologists and epileptologists to improve its validity and utility. Accordingly, our European group has provided herein updated algorithms for two different levels of methodology: (1) requirements for defining photosensitivity in patients and in family members of known photosensitive patients and (2) requirements for tailored studies in patients with a clear history of visually induced seizures or complaints, and in those already known to be photosensitive.
Abstract:
Humans consciously and subconsciously establish various links, form semantic images and reason in the mind, learn linking effects and rules, select linked individuals to interact with, and form closed loops through links while co-experiencing multiple spaces over a lifetime. Machines are limited in these abilities, although various graph-based models have been used to link resources in the cyber space. The following are fundamental limitations of machine intelligence: (1) machines know few links and rules in the physical, physiological, psychological, socio and mental spaces, so it is not realistic to expect machines to discover laws and solve problems in these spaces; and (2) machines can only process pre-designed algorithms and data structures in the cyber space. They are limited in their ability to go beyond the cyber space, to learn linking rules, to know the effect of linking, and to explain computing results according to physical, physiological, psychological and socio laws. Linking the various spaces will create a complex space: the Cyber-Physical-Physiological-Psychological-Socio-Mental Environment, CP3SME. Diverse spaces will emerge, evolve, compete and cooperate with each other to extend machine intelligence and human intelligence. From a multi-disciplinary perspective, this paper reviews previous ideas on various links, introduces the concept of the cyber-physical society, proposes the ideal of the CP3SME including its definition, characteristics and multi-disciplinary revolution, and explores the methodology of linking through spaces for cyber-physical-socio intelligence. The methodology includes new models, principles, mechanisms, scientific issues and philosophical explanation. The CP3SME aims at an ideal environment for humans to live and work in. Exploration will go beyond previous ideals of intelligence and computing.
Abstract:
Purpose - The purpose of this paper is to assess high-dimensional visualisation, combined with pattern matching, as an approach to observing dynamic changes in the ways people tweet about science topics. Design/methodology/approach - The high-dimensional visualisation approach was applied to three scientific topics to test its effectiveness for longitudinal analysis of message framing on Twitter over two disjoint periods in time. The paper uses coding frames to drive categorisation and visual analytics of tweets discussing the science topics. Findings - The findings point to the potential of this mixed-methods approach, as it offers sufficiently high sensitivity to recognise and support the analysis of non-trending as well as trending topics on Twitter. Research limitations/implications - Three topics are studied and these illustrate a range of frames, but the results may not be representative of all scientific topics. Social implications - Funding bodies increasingly encourage scientists to participate in public engagement. As social media provides an avenue actively utilised for public communication, understanding the nature of the dialogue on this medium is important for the scientific community and the public at large. Originality/value - This study differs from standard approaches to the analysis of microblog data, which tend to focus on machine-driven analysis of large-scale datasets. It provides evidence that this approach enables practical and effective analysis of the content of mid-size to large collections of microposts.
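To make the categorisation step concrete (the high-dimensional visualisation itself is not reproduced here), a coding frame can be expressed as a set of category patterns matched against each tweet, with category counts from two disjoint periods compared side by side. In the sketch below, the frame names and keyword patterns are illustrative assumptions, not the frames used in the study:

```python
import re
from collections import Counter

# Hypothetical coding frame: category -> keyword pattern. The study derives
# its frames from manual coding; these patterns are illustrative only.
CODING_FRAME = {
    "health_risk": re.compile(r"\b(risk|danger|harm)\w*", re.I),
    "discovery":   re.compile(r"\b(discover|breakthrough|found)\w*", re.I),
    "policy":      re.compile(r"\b(funding|regulat|policy)\w*", re.I),
}

def categorise(tweets):
    """Count how often each frame appears in a collection of tweets."""
    counts = Counter()
    for text in tweets:
        for frame, pattern in CODING_FRAME.items():
            if pattern.search(text):
                counts[frame] += 1
    return counts

# Comparing two disjoint periods exposes shifts in framing over time.
period_a = ["New breakthrough in vaccine research", "Funding cuts hit labs"]
period_b = ["Experts warn of health risks", "Regulation debate continues"]
print(categorise(period_a), categorise(period_b))
```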
Abstract:
How can applications be deployed on the cloud to achieve maximum performance? This question is challenging to address given the availability of a wide variety of cloud Virtual Machines (VMs) with different performance capabilities. The research reported in this paper addresses the above question by proposing a six-step benchmarking methodology in which a user provides a set of weights that indicate how important memory, local communication, computation and storage related operations are to an application. The user can provide either a set of four abstract weights or eight fine-grain weights, based on knowledge of the application. The weights, along with benchmarking data collected from the cloud, are used to generate two rankings: one based only on the performance of the VMs, and the other taking both performance and cost into account. The rankings are validated on three case study applications using two validation techniques. The case studies on a set of experimental VMs highlight that maximum performance is achieved by the three top-ranked VMs, and that maximum performance in a cost-effective manner is achieved by at least one of the top three VMs ranked by the methodology.
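As an illustration of how such weights and benchmark data could be combined, the sketch below normalises each benchmark dimension across candidate VMs, computes a weighted score, and derives the two rankings. The VM names, benchmark numbers, prices and the linear scoring rule are assumptions for illustration, not the paper's exact procedure:

```python
# Sketch of the weighted-ranking idea: combine user-supplied weights for
# memory, local communication, computation and storage with normalised
# benchmark scores, then rank by performance and by performance per cost.

def rank_vms(vms, weights):
    """Return (performance ranking, performance-per-cost ranking)."""
    dims = list(weights)
    # Normalise each benchmark dimension to [0, 1] across the candidate VMs.
    hi = {d: max(vm[d] for vm in vms.values()) for d in dims}
    score = {
        name: sum(weights[d] * vm[d] / hi[d] for d in dims)
        for name, vm in vms.items()
    }
    by_perf = sorted(score, key=score.get, reverse=True)
    by_value = sorted(score, key=lambda n: score[n] / vms[n]["price"], reverse=True)
    return by_perf, by_value

# Invented benchmark figures and hourly prices for two hypothetical VM types.
vms = {
    "vm.small": {"memory": 7.1, "comm": 5.2, "compute": 6.0, "storage": 4.8, "price": 0.10},
    "vm.large": {"memory": 6.5, "comm": 6.8, "compute": 9.1, "storage": 5.0, "price": 0.20},
}
weights = {"memory": 0.2, "comm": 0.1, "compute": 0.6, "storage": 0.1}  # four abstract weights
print(rank_vms(vms, weights))
```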
Abstract:
Objectives: To explore the content and methodology of predoctoral Geriatric Dentistry teaching amongst European dental schools.
Methods: The study was conducted by the European College of Gerodontology (ECG) Education Committee. An electronic questionnaire was developed with closed- and open-ended items, including information on the prevalence and institutional anchorage of Gerodontology programs, the educators, and the content and methodology of teaching. An electronic mail including a hyperlink to the questionnaire was sent to 216 dental schools in 39 European countries (Winter/Spring 2016). The Deans were asked either to answer themselves or to forward the link to faculty members with knowledge of Gerodontology teaching at their respective schools. Repeated reminders or telephone calls were used for non-respondents, and personal networks were exploited to identify potential contact persons.
Results: By August 2016, 121 dental schools from 29 countries had responded to the survey (response rate 56%; EU response rate 60%). Gerodontology was included in the predoctoral curricula of 86% of the respondents and was compulsory in 68%. The course was mainly offered to senior students and was interdisciplinary in 30% of the schools, delivered mainly by dentists (79%), physicians (21%), psychologists (10%) and nurses (5%). It was conducted as an independent lecture series in 40% of the schools, and a course director was assigned in 44% of the respondent schools. When embedded in other disciplines, these were mainly Prosthodontics (31%). The content included a large number of items, such as epidemiology of oral health, medical problems in old age, prosthodontic management, xerostomia, and caries risk assessment. Lectures were the most common teaching format (69%), followed by small-group seminars (27%). The most common types of educational material used were scientific articles (48%), printed textbooks (44%), lecture notes (40%) and e-learning material (21%). Clinical training was offered by 64% of the respondents, within the dental school clinics (49%) and/or in outreach locations (40%).
Conclusion: An increasing number of the respondent European dental schools (66%) teach Gerodontology at the pre-doctoral level, with significant variations in content and methodology. Official guidelines and the dissemination of the ECG pre-doctoral curriculum guidelines might help to increase the prevalence and improve the status of Gerodontology teaching in Europe.
Abstract:
There is scientific evidence demonstrating the benefits of mushroom ingestion due to their richness in bioactive compounds such as mycosterols, in particular ergosterol [1]. Agaricus bisporus L. is the most consumed mushroom worldwide, with ergosterol accounting for 90% of its sterol fraction [2]. It is therefore an interesting matrix from which to obtain ergosterol, a molecule with a high commercial value. According to the literature, ergosterol concentration can vary between 3 and 9 mg per g of dried mushroom. Nowadays, traditional methods such as maceration and Soxhlet extraction are being replaced by emerging methodologies such as ultrasound-assisted (UAE) and microwave-assisted extraction (MAE) in order to decrease the amount of solvent used and the extraction time while increasing the extraction yield [2]. In the present work, A. bisporus was extracted varying several parameters relevant to UAE and MAE. UAE: solvent type (hexane and ethanol), ultrasound amplitude (50-100%) and sonication time (5-15 min); MAE: solvent fixed as ethanol, time (0-20 min), temperature (60-210 °C) and solid-liquid ratio (1-20 g/L). Moreover, in order to decrease process complexity, the pertinence of applying a saponification step was evaluated. Response surface methodology was applied to generate mathematical models that allow maximizing and optimizing the response variables that influence the extraction of ergosterol. Concerning UAE, ethanol proved to be the best solvent to achieve higher levels of ergosterol (671.5 ± 0.5 mg/100 g dw, at 75% amplitude for 15 min), whereas hexane was only able to extract 152.2 ± 0.2 mg/100 g dw under the same conditions. Nevertheless, the hexane extract showed higher purity (11%) when compared with its ethanol counterpart (4%). Furthermore, in the case of the ethanolic extract, the saponification step increased its purity to 21%, while for the hexane extract the purity was similar; in fact, hexane presents higher selectivity for lipophilic compounds than ethanol. Regarding the MAE technique, the results showed that the optimal conditions (19 ± 3 min, 133 ± 12 °C and 1.6 ± 0.5 g/L) allowed higher ergosterol extraction levels (556 ± 26 mg/100 g dw). The values obtained with MAE are close to those obtained with conventional Soxhlet extraction (676 ± 3 mg/100 g dw) and UAE. Overall, UAE and MAE proved to be efficient technologies for maximizing ergosterol extraction yields.
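The response surface methodology step used here (and in the following study) amounts to fitting a second-order model to yields measured over a designed set of factor settings and locating its predicted maximum. Below is a minimal two-factor sketch; the design points, yields and the restricted factor set are invented for illustration and do not reproduce the study's data:

```python
import numpy as np

# Sketch of response surface methodology: fit a full quadratic model to
# extraction yields from a factorial design, then locate the factor settings
# that maximise the predicted yield. All numbers below are invented.

# Factors: time (min) and temperature (C); response: ergosterol (mg/100 g dw).
X = np.array([[5, 60], [5, 150], [15, 60], [15, 150], [10, 105],
              [10, 60], [10, 150], [5, 105], [15, 105]], dtype=float)
y = np.array([300, 420, 380, 520, 560, 410, 530, 450, 540], dtype=float)

def design_matrix(X):
    t, T = X[:, 0], X[:, 1]
    # Full second-order model: intercept, linear, interaction, squared terms.
    return np.column_stack([np.ones_like(t), t, T, t * T, t**2, T**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# Scan the experimental region for the predicted optimum.
tt, TT = np.meshgrid(np.linspace(5, 15, 101), np.linspace(60, 150, 101))
grid = np.column_stack([tt.ravel(), TT.ravel()])
pred = design_matrix(grid) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum: {best[0]:.1f} min, {best[1]:.0f} C, "
      f"yield ~ {pred.max():.0f} mg/100 g dw")
```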
Abstract:
Betacyanins are betalain pigments that display a red-violet colour reported to be three times stronger than the red-violet dye produced by anthocyanins [1]. The applications of betacyanins cover a wide range of matrices, mainly as additives or ingredients in the food industry, cosmetics, pharmaceuticals and livestock feed. Although less commonly used than anthocyanins and carotenoids, betacyanins are stable between pH 3 and 7 and suitable for colouring low-acid matrices. In addition, betacyanins have been reported to display interesting medicinal character as powerful antioxidant and chemopreventive compounds in both in vitro and in vivo models [2]. Betacyanins are obtained mainly from the red beet of the Beta vulgaris plant (between 10 and 20 mg per 100 g of pulp), but alternative primary sources are needed [3]. In addition, independently of the source used, the effects of the variables that affect the extraction of betacyanins have not been properly described and quantified. Therefore, the aim of this study was to identify and optimize the conditions that maximize betacyanin extraction using the tepals of Gomphrena globosa L. flowers as an alternative source. Assisted by the statistical technique of response surface methodology, an experimental design was developed for testing the significant explanatory variables of the extraction (time, temperature, solid-liquid ratio and ethanol-water ratio). Identification was performed using high-performance liquid chromatography coupled with a photodiode array detector and mass spectrometry with electrospray ionization (HPLC-PDA-MS/ESI), and the response was measured by the quantification of these compounds using HPLC-PDA. Afterwards, a response surface analysis was performed to evaluate the results. The major betacyanin compounds identified were gomphrenin II and III and isogomphrenin II and III. The highest total betacyanin content was obtained using the following conditions: 45 min of extraction time, 35 °C, 35 g/L of solid-liquid ratio and 25% of ethanol. These values would not have been found without optimizing the extraction conditions, which, moreover, showed trends contrary to those described in the scientific literature. More specifically, concerning the time and temperature variables, increasing both values beyond those commonly used in the literature considerably improved the betacyanin extraction yield without displaying any degradation patterns.
Abstract:
Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox covering this methodology has been developed and is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
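The transformation at the core of the approach can be illustrated in a few lines: standardise the series with a running mean and running standard deviation, fit a stationary GEV to the annual maxima of the transformed signal, and map return levels back to the original, time-varying scale. The sketch below uses Python with a synthetic series; the 5-year window and all parameters are illustrative choices, not the tsEva toolbox's defaults:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic daily series with a slow trend standing in for non-stationarity.
rng = np.random.default_rng(0)
t = pd.date_range("1980-01-01", periods=40 * 365, freq="D")
trend = 0.00005 * np.arange(t.size)
x = pd.Series(trend + rng.gumbel(1.0, 0.3, t.size), index=t)

# (i) Transform: time-varying normalization via running mean and std.
win = 365 * 5
mu_t = x.rolling(win, center=True, min_periods=win // 2).mean()
sd_t = x.rolling(win, center=True, min_periods=win // 2).std()
z = (x - mu_t) / sd_t                            # approximately stationary

# Stationary GEV fit on annual maxima of the transformed series.
z_max = z.groupby(z.index.year).max().dropna()
shape, loc, scale = stats.genextreme.fit(z_max)

# (ii) Reverse transform: 100-year return level back on the original scale,
# which now varies in time with the normalization.
z100 = stats.genextreme.ppf(1 - 1 / 100, shape, loc, scale)
x100_t = mu_t + sd_t * z100
print(x100_t.dropna().iloc[[0, -1]])
```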
Abstract:
Knee osteoarthritis is the most common type of arthritis and a major cause of impaired mobility and disability in ageing populations. Given the increasing prevalence of the disease, clinical and scientific practices need to be established to detect the problem in its early stages. This work therefore focuses on improving problem-solving methodologies, aiming at the development of an Artificial Intelligence based decision support system to detect knee osteoarthritis. The framework is built on top of a Logic Programming approach to Knowledge Representation and Reasoning, complemented with a Case Based approach to computing that caters for the handling of incomplete, unknown, or even self-contradictory information.
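As a minimal sketch of the case-based component, the snippet below retrieves the most similar past case while treating unknown feature values as carrying no evidence. The features, cases and similarity rule are invented for illustration, and the Logic Programming knowledge base the framework couples this with is not reproduced here:

```python
# Case-based retrieval tolerating incomplete information (None = unknown).

def similarity(query, case):
    """Average similarity over the features known in both records."""
    scores = []
    for feature, q_val in query.items():
        c_val = case.get(feature)
        if q_val is None or c_val is None:
            continue                             # unknown values carry no evidence
        scores.append(1.0 - abs(q_val - c_val))  # features assumed scaled to [0, 1]
    return sum(scores) / len(scores) if scores else 0.0

# Invented case base with features scaled to [0, 1].
case_base = [
    {"age": 0.8, "pain": 0.9, "stiffness": 0.7, "diagnosis": "osteoarthritis"},
    {"age": 0.3, "pain": 0.2, "stiffness": None, "diagnosis": "healthy"},
]
query = {"age": 0.75, "pain": None, "stiffness": 0.8}   # incomplete new case

best = max(case_base, key=lambda c: similarity(query, c))
print(best["diagnosis"], round(similarity(query, best), 2))
```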