972 results for Science methodology
Abstract:
A multi-residue methodology based on solid phase extraction followed by gas chromatography–tandem mass spectrometry was developed for trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was performed mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the disproportionate influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness.
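For readers unfamiliar with the weighting step, a minimal sketch of weighted least squares calibration is given below. The concentrations, peak areas and the 1/x² weighting scheme are illustrative assumptions, not the authors' data or exact weights.

```python
import numpy as np

# Hypothetical calibration data (concentration vs. peak area);
# illustrative only, not the published calibration standards.
x = np.array([0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
y = np.array([120.0, 250.0, 1180.0, 2400.0, 12500.0, 24300.0])

# Design matrix for a straight-line model y = b0 + b1*x
X = np.column_stack([np.ones_like(x), x])

# A common empirical weighting for heteroscedastic chromatographic data:
# 1/x**2 down-weights the high-concentration standards.
w = 1.0 / x**2
W = np.diag(w)

# Weighted least squares estimate: beta = (X'WX)^-1 X'Wy
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print("WLS intercept = %.2f, slope = %.2f" % (beta[0], beta[1]))

# Ordinary least squares for comparison
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print("OLS intercept = %.2f, slope = %.2f" % (beta_ols[0], beta_ols[1]))
```

The weighted fit typically yields a slightly different intercept and slope than the ordinary fit, which is what improves accuracy at the low end of the calibration range.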
Abstract:
Serious games are starting to attain a higher profile as tools for learning in various contexts, particularly in areas such as education and training. Due to their characteristics, such as rules, behaviour simulation and feedback to the player's actions, serious games provide a favourable learning environment where errors can occur without real-life penalty and students get instant feedback on challenges. These challenges are in accordance with the intended objectives and self-adapt and repeat according to the student's difficulty level. Through motivating and engaging environments, which serve as a base for problem solving and for simulating different situations and contexts, serious games have great potential to help players develop professional skills. But how do we certify the acquired knowledge and skills? With this work we propose a methodology to establish a relationship between the game mechanics of serious games and an array of competences for certification, evaluating the applicability of various aspects of game design and development, such as the user interface and the gameplay, and obtaining learning outcomes within the game itself. Through the definition of game mechanics combined with the necessary pedagogical elements, the game will ensure the certification. This paper presents a matrix of generic skills, based on the European Qualifications Framework, and the definition of the game mechanics necessary for certification in a tour guide training context. The certification matrix has as reference axes skills, knowledge and competences, which describe what students should learn, understand and be able to do after they complete the learning process. Guide-interpreters welcome and accompany tourists on trips and visits to places of tourist interest and cultural heritage, such as museums, palaces and national monuments, where they provide various kinds of information. Tour guide certification requirements include specific skills and knowledge of foreign languages and of the History, Ethnology, Politics, Religion, Geography and Art of the territory in which the guide operates; the relevant skills are communication, interpersonal relationships, motivation, organization and management. This certification process aims to validate the ability to plan and conduct guided tours in the territory, to demonstrate knowledge appropriate to the context and, finally, to act as a good group leader. After defining which competences are to be certified, the next step is to delineate the expected learning outcomes and to identify the game mechanics associated with them. Game mechanics, as the methods invoked by agents to interact with the game world, in combination with game elements/objects, allow multiple paths through which to explore the game environment and its educational process. Mechanics such as achievements, appointments, progression, reward schedules or status describe how a game can be designed to affect players in unprecedented ways. In order for the game to be able to certify tour guides, the design of the training game will incorporate a set of theoretical and practical tasks for acquiring skills and knowledge across various transversal themes. To this end, patterns of skills and abilities in acquiring different knowledge will be identified.
Abstract:
Learning and teaching processes, like all human activities, can be mediated through the use of tools. Information and communication technologies are now widespread within education. Their use in the daily life of teachers and learners affords engagement with educational activities at any place and time, not necessarily linked to an institution or a certificate. In the absence of formal certification, learning under these circumstances is known as informal learning. Despite the lack of certification, learning with technology in this way presents opportunities to gather information about, and new ways of exploiting, an individual's learning. Cloud technologies provide ways to achieve this through new architectures, methodologies, and workflows that facilitate semantic tagging, recognition, and acknowledgment of informal learning activities. The transparency and accessibility of cloud services mean that institutions and learners can exploit existing knowledge to their mutual benefit. The TRAILER project facilitates this aim by providing a technological framework using cloud services, a workflow, and a methodology. The services facilitate the exchange of information and knowledge associated with informal learning activities, ranging from the use of social software through widgets, computer gaming, and remote laboratory experiments. Data from these activities are shared among institutions, learners, and workers. The project demonstrates the possibility of gathering information related to informal learning activities independently of the context or tools used to carry them out.
Abstract:
This study focused on the development of a sensitive enzymatic biosensor for the determination of the pesticide pirimicarb, based on the immobilization of laccase on composite carbon paste electrodes. A multi-walled carbon nanotube (MWCNT) paste electrode modified by dispersion of laccase (3%, w/w) within the optimum composite matrix (60:40%, w/w, MWCNTs and paraffin binder) showed the best performance, with excellent electron transfer kinetics and catalytic effects related to the redox process of the substrate 4-aminophenol. No metal or anti-interference membrane was added. Based on the inhibition of laccase activity, pirimicarb can be determined in the range 9.90 × 10⁻⁷ to 1.15 × 10⁻⁵ mol L⁻¹ using 4-aminophenol as substrate at the optimum pH of 5.0, with acceptable repeatability and reproducibility (relative standard deviations lower than 5%). The limit of detection obtained was 1.8 × 10⁻⁷ mol L⁻¹ (0.04 mg kg⁻¹ on a fresh-weight vegetable basis). The high activity and catalytic properties of the laccase-based biosensor are retained for ca. one month. The optimized electroanalytical protocol coupled to the QuEChERS methodology was applied to tomato and lettuce samples spiked at three levels; recoveries ranging from 91.0 ± 0.1% to 101.0 ± 0.3% were attained. No significant effects on the pirimicarb electroanalysis were observed in the presence of pro-vitamin A, vitamins B1 and C, and glucose in the vegetable extracts. The proposed biosensor-based pesticide residue methodology fulfils all requisites for use in the implementation of food safety programs.
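The inhibition-based determination can be illustrated with a short sketch: the relative loss of enzyme response is taken as the analytical signal and calibrated against the logarithm of pesticide concentration. The current values and concentrations below are hypothetical, not the reported measurements.

```python
import numpy as np

# Hypothetical steady-state currents (uA) for the 4-aminophenol substrate
# before (i0) and after incubation with increasing pirimicarb levels.
i0 = 12.4                                            # response without pesticide
i = np.array([11.1, 9.6, 7.8, 5.9, 4.2])             # inhibited responses
conc = np.array([1e-6, 2e-6, 4e-6, 8e-6, 1.15e-5])   # mol/L, illustrative

# Degree of inhibition, the usual analytical signal for enzyme biosensors
inhibition = 100.0 * (i0 - i) / i0

# Simple calibration of inhibition against log10(concentration)
slope, intercept = np.polyfit(np.log10(conc), inhibition, 1)
print("I%% = %.1f * log10(C) + %.1f" % (slope, intercept))
```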
Abstract:
A novel enzymatic biosensor for carbamate pesticide detection was developed through the direct immobilization of Trametes versicolor laccase on a graphene-doped carbon paste electrode functionalized with Prussian blue films (LACC/PB/GPE). Graphene was prepared by sonication-assisted exfoliation of graphite and characterized by transmission electron microscopy and X-ray photoelectron spectroscopy. The Prussian blue film electrodeposited onto the graphene-doped carbon paste electrode allowed a considerable reduction of the charge transfer resistance and of the capacitance of the device. The combined effects of pH, enzyme concentration and incubation time on the biosensor response were optimized using a 2³ full-factorial statistical design and response surface methodology. Based on the inhibition of laccase activity and using 4-aminophenol as redox mediator at pH 5.0, LACC/PB/GPE exhibited suitable characteristics in terms of sensitivity, intra- and inter-day repeatability (1.8–3.8% RSD), reproducibility (4.1 and 6.3% RSD), selectivity (13.2% bias at the highest interference-to-substrate ratios tested), accuracy and stability (ca. twenty days) for quantification of five carbamates widely applied on tomato and potato crops. The attained detection limits ranged between 5.2 × 10⁻⁹ mol L⁻¹ (0.002 mg kg⁻¹ w/w for ziram) and 1.0 × 10⁻⁷ mol L⁻¹ (0.022 mg kg⁻¹ w/w for carbofuran). Recovery values for the two tested spiking levels ranged from 90.2 ± 0.1% (carbofuran) to 101.1 ± 0.3% (ziram) for tomato samples and from 91.0 ± 0.1% (formetanate) to 100.8 ± 0.1% (ziram) for potato samples. The proposed methodology is appropriate for testing pesticide levels in food samples against regulations and for food inspection.
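As an illustration of the optimization step, the sketch below enumerates the runs of a 2³ full-factorial design for the three factors mentioned (pH, enzyme loading, incubation time). The low/high levels are assumed for the example and are not the levels used in the study.

```python
from itertools import product

# Hypothetical low/high levels for the three optimized factors
# (values are illustrative, not the authors' exact ranges).
levels = {
    "pH": (4.0, 6.0),
    "laccase_units": (0.5, 2.0),      # enzyme loading
    "incubation_min": (5.0, 20.0),    # incubation time
}

# 2^3 full-factorial design: every combination of low (-1) and high (+1)
coded_runs = list(product([-1, 1], repeat=3))
for run, coded in enumerate(coded_runs, start=1):
    actual = [lo if c == -1 else hi
              for c, (lo, hi) in zip(coded, levels.values())]
    print(run, coded, actual)
```

The responses measured at these eight runs (plus centre points) are what feed the response surface model used to locate the optimum.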
Abstract:
An analytical method using microwave-assisted extraction (MAE) and liquid chromatography (LC) with fluorescence detection (FD) for the determination of ochratoxin A (OTA) in bread samples is described. A 2⁴ orthogonal composite design coupled with response surface methodology was used to study the influence of the MAE parameters (extraction time, temperature, solvent volume, and stirring speed) in order to maximize OTA recovery. The optimized MAE conditions were 25 mL of acetonitrile, 10 min of extraction at 80 °C, and maximum stirring speed. Validation of the overall methodology was performed by spiking assays at five levels (0.1–3.00 ng/g). The quantification limit was 0.005 ng/g. The established method was then applied to 64 bread samples (wheat, maize, and wheat/maize bread) collected in the Oporto region (Northern Portugal). OTA was detected in 84% of the samples, with a maximum value of 2.87 ng/g, below the European maximum limit of 3 ng/g established for OTA in cereal products.
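The response surface step can be sketched as fitting a second-order model to the measured recoveries and locating its maximum. The coded design points and recovery values below are hypothetical, used only to show the mechanics of the fit for two of the four factors.

```python
import numpy as np

# Hypothetical coded settings for two MAE factors (extraction time and
# temperature) and the OTA recovery (%) observed at each run.
# Illustrative values only; not the published design points.
X1 = np.array([-1, -1,  1,  1, -1, 1,  0, 0, 0])   # time (coded)
X2 = np.array([-1,  1, -1,  1,  0, 0, -1, 1, 0])   # temperature (coded)
y  = np.array([62, 71, 68, 80, 70, 78, 66, 82, 85])  # recovery (%)

# Second-order response surface model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1**2, X2**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the fitted surface on a grid to locate the predicted optimum
g1, g2 = np.meshgrid(np.linspace(-1, 1, 41), np.linspace(-1, 1, 41))
pred = (coeffs[0] + coeffs[1] * g1 + coeffs[2] * g2
        + coeffs[3] * g1 * g2 + coeffs[4] * g1**2 + coeffs[5] * g2**2)
best = np.unravel_index(np.argmax(pred), pred.shape)
print("predicted optimum (coded time, temperature):", g1[best], g2[best])
```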
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its adequacy for the mentioned problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics than the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with about a one percent difference in the objective function cost value.
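For context, the sketch below shows a generic PSO loop with velocity clamping on a toy cost function; the ASMPSO described in the paper adds adaptive velocity limits, self-parameterization and problem-specific constraint handling on top of this skeleton. All parameter values and the cost function are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(x):
    # Toy quadratic stand-in for the day-ahead scheduling cost;
    # the real problem evaluates energy resource schedules.
    return np.sum((x - 3.0) ** 2, axis=-1)

dim, n_particles, iters = 10, 30, 200
lo, hi = -10.0, 10.0
v_max = 0.2 * (hi - lo)          # fixed velocity limit (adapted in ASMPSO)

pos = rng.uniform(lo, hi, (n_particles, dim))
vel = rng.uniform(-v_max, v_max, (n_particles, dim))
pbest, pbest_cost = pos.copy(), cost(pos)
gbest = pbest[np.argmin(pbest_cost)].copy()

w, c1, c2 = 0.7, 1.5, 1.5        # inertia and acceleration coefficients
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    vel = np.clip(vel, -v_max, v_max)        # velocity clamping
    pos = np.clip(pos + vel, lo, hi)         # keep particles inside bounds
    c = cost(pos)
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
    gbest = pbest[np.argmin(pbest_cost)].copy()

print("best cost found:", pbest_cost.min())
```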
Abstract:
In the last few years, the number of systems and devices that use voice-based interaction has grown significantly. For continued use of these systems, the interface must be reliable and pleasant in order to provide an optimal user experience. However, there are currently very few studies that try to evaluate how pleasant a voice is, from a perceptual point of view, when the final application is a speech-based interface. In this paper we present an objective definition of voice pleasantness based on the composition of a representative feature subset, and a new automatic system for voice pleasantness classification and intensity estimation. Our study is based on a database of European Portuguese female voices, but the methodology can be extended to male voices or to other languages. In the objective performance evaluation, the system achieved a 9.1% error rate for voice pleasantness classification and a 15.7% error rate for voice pleasantness intensity estimation.
Abstract:
In this study, efforts were made to put forward an integrated recycling approach for the thermoset-based glass fibre reinforced polymer (GFRP) rejects derived from the pultrusion manufacturing industry. Both the recycling process and the development of a new cost-effective end-use application for the recyclates were considered. For this purpose, (i) among the several available recycling techniques for thermoset-based composite materials, the most suitable one for the envisaged application was selected (mechanical recycling); and (ii) an experimental programme was carried out to assess the added value of the obtained recyclates as aggregate and reinforcement replacements in concrete-polymer composite materials. The potential recycling solution was assessed through the mechanical behaviour of the resultant GFRP-waste-modified concrete-polymer composites with regard to unmodified materials. In the mix design process of the new GFRP-waste-based composite material, the recyclate content and size grade, and the effect of incorporating an adhesion promoter, were considered as material factors and systematically tested within reasonable ranges. The optimization of the modified formulations was supported by the Fuzzy Boolean Nets methodology, which allowed finding the best balance between material parameters that maximizes both the flexural and compressive strengths of the final composite. Compared with related end-use applications of GFRP waste in cementitious concrete materials, the proposed solution overcomes some of the problems found, namely the possible incompatibilities arising from the alkali-silica reaction and the decrease in mechanical properties due to the high water-cement ratio required to achieve the desired workability. The results obtained are very promising towards a global cost-effective waste management solution for GFRP industrial waste and end-of-life products, leading to a more sustainable composite materials industry.
Abstract:
Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, which is a significant advantage over traditional methods used to assess pollution. In this work, universal block kriging is newly applied to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge and estimating dilution. The results demonstrate that the geostatistical methodology can provide good estimates of the dispersion of effluents, which are very valuable in assessing the environmental impact and managing sea outfalls. Moreover, since accurate measurements of the plume's dilution are rare, these studies might be very helpful in the future for validating dispersion models.
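The study applies universal block kriging; as a simplified illustration of the underlying idea, the sketch below solves the ordinary point-kriging system for a single location using an assumed spherical semivariogram. The sample positions, salinity values and variogram parameters are hypothetical.

```python
import numpy as np

def spherical(h, nugget=0.0, sill=1.0, rng_=200.0):
    # Spherical semivariogram model (parameters assumed for the example).
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h >= rng_, sill, g)

# Hypothetical AUV salinity samples: (x, y) positions in metres and values (psu)
pts = np.array([[0.0, 0.0], [50.0, 10.0], [30.0, 80.0], [90.0, 60.0]])
vals = np.array([35.1, 34.2, 33.5, 34.8])
target = np.array([40.0, 40.0])

# Ordinary kriging system: [Gamma 1; 1' 0] [w; mu] = [gamma0; 1]
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
n = len(pts)
A = np.ones((n + 1, n + 1))
A[:n, :n] = spherical(d)
A[n, n] = 0.0
b = np.ones(n + 1)
b[:n] = spherical(np.linalg.norm(pts - target, axis=1))

sol = np.linalg.solve(A, b)
weights, mu = sol[:n], sol[n]
estimate = weights @ vals                 # kriged salinity at the target
variance = weights @ b[:n] + mu           # kriging variance (accuracy measure)
print("kriged salinity: %.2f psu, kriging variance: %.3f" % (estimate, variance))
```

The kriging variance is the accuracy measure the abstract refers to; universal block kriging extends this scheme with a spatial trend model and averaging over blocks rather than points.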
Abstract:
This study deals with the problem of how to collect genuine and useful data about science classroom practices while preserving the complex and holistic nature of teaching and learning. Additionally, we were looking for an instrument that would allow comparability and verifiability for teaching and research purposes. Given the multimodality of teaching and learning processes, we developed the multimodal narrative (MN), which describes what happens during a task and incorporates data such as examples of students' work.
Abstract:
Aims: Obesity and asthma are widely prevalent and associated disorders. Recent studies by our group revealed that Substance P (SP) is involved in the pathophysiology of the obese-asthma phenotype in mice through its selective NK1 receptor (NK1-R). Lymphangiogenesis is impaired in asthma and obesity, and SP activates contractile and inflammatory pathways in lymphatics. Our aim was to study whether NK1-R expression was involved in lymphangiogenesis in visceral (VAT) and subcutaneous (SAT) adipose tissues and in the lungs of obese, allergen-sensitized mice. Main methods: Diet-induced obese and ovalbumin (OVA)-sensitized Balb/c mice were treated with a selective NK1-R antagonist (CJ 12,255, Pfizer Inc., USA) or placebo. Lymphatic structures (LYVE-1+) and NK1-R expression were analyzed by immunohistochemistry. A semi-quantitative scoring methodology was used for NK1-R expression. Key findings: Obesity and allergen sensitization together increased the number of LYVE-1+ lymphatics in VAT and decreased it in SAT and the lungs. NK1-R was mainly expressed on adipocyte membranes in VAT, in blood vessel areas of SAT, and in the lung epithelium. Obesity and allergen sensitization combined increased the expression of NK1-R in VAT, SAT and the lungs. NK1-R antagonist treatment reversed the effects observed in lymphangiogenesis in those tissues. Significance: The obese-asthma phenotype in mice is accompanied by increased expression of NK1-R in adipose tissues and lung epithelium, reflecting that SP released during inflammation may act directly on these tissues. Blocking NK1-R affects lymphangiogenesis, implying a role for SP, with opposite physiological consequences in VAT versus SAT and lungs. Our results provide a clue to a novel role of SP in the obese-asthma phenotype.
RadiaLE: A framework for designing and assessing link quality estimators in wireless sensor networks
Abstract:
Stringent cost and energy constraints impose the use of low-cost and low-power radio transceivers in large-scale wireless sensor networks (WSNs). This fact, together with the harsh characteristics of the physical environment, requires a rigorous WSN design. Mechanisms for WSN deployment and topology control, MAC and routing, and resource and mobility management greatly depend on reliable link quality estimators (LQEs). This paper describes the RadiaLE framework, which enables the experimental assessment, design and optimization of LQEs. RadiaLE comprises (i) the hardware components of the WSN testbed and (ii) a software tool for setting up and controlling the experiments, automating the gathering of link measurements through packet-statistics collection, and analyzing the collected data, allowing for LQE evaluation. We also propose a methodology that allows (i) properly setting up different types of links and different types of traffic, (ii) collecting rich link measurements, and (iii) validating LQEs using a holistic and unified approach. To demonstrate the validity and usefulness of RadiaLE, we present two case studies: the characterization of low-power links and a comparison of six representative LQEs. We also extend the second study to evaluate the accuracy of the TOSSIM 2 channel model.
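As an example of the kind of estimator RadiaLE is designed to evaluate, the sketch below implements a window-mean EWMA over the packet reception ratio, one of the classical software-based LQEs. The window size, smoothing factor and packet trace are illustrative assumptions, not RadiaLE's configuration.

```python
from typing import Iterable

def wmewma(received: Iterable[bool], window: int = 10, alpha: float = 0.6) -> float:
    """Window-mean EWMA link quality estimate from a packet reception trace.

    `received` is a sequence of booleans (True = packet received); the packet
    reception ratio (PRR) of each window is smoothed with an exponentially
    weighted moving average. Defaults are illustrative, not a standard setting.
    """
    received = list(received)
    estimate = None
    for start in range(0, len(received), window):
        chunk = received[start:start + window]
        prr = sum(chunk) / len(chunk)
        estimate = prr if estimate is None else alpha * estimate + (1 - alpha) * prr
    return 0.0 if estimate is None else estimate

# Hypothetical trace with bursty losses, typical of low-power links
trace = [True] * 8 + [False] * 5 + [True] * 12 + [False] * 3 + [True] * 2
print("WMEWMA estimate: %.2f" % wmewma(trace))
```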
Abstract:
Desertification is a critical issue for Mediterranean drylands. Climate change is expected to aggravate its extension and severity by reinforcing the biophysical driving forces behind desertification processes: hydrology, vegetation cover and soil erosion. The main objective of this thesis is to assess the vulnerability of Mediterranean watersheds to climate change, by estimating the impacts on desertification drivers and the watersheds' resilience to them. To achieve this objective, a modeling framework capable of analyzing the processes linking climate and the main drivers is developed. The framework couples different models adapted to different spatial and temporal scales. A new model for the event scale, the MEFIDIS model, is developed, with a focus on the particular processes governing Mediterranean watersheds. Model results are compared with desertification thresholds to estimate resilience. This methodology is applied to two contrasting study areas, the Guadiana and the Tejo, which currently present semi-arid and humid climates, respectively. The main conclusions of this work can be summarized as follows:
• hydrological processes show a high sensitivity to climate change, leading to a significant decrease in runoff and an increase in temporal variability;
• vegetation processes appear to be less sensitive, with negative impacts for agricultural species and forests, and positive impacts for Mediterranean species;
• changes to soil erosion processes appear to depend on the balance between changes to surface runoff and vegetation cover, itself governed by the relationship between changes to temperature and rainfall;
• as the magnitude of climate change increases, desertification thresholds are surpassed sequentially, starting with the watersheds' ability to sustain current water demands and followed by the vegetation support capacity;
• the most important thresholds appear to be a temperature increase of +3.5 to +4.5 °C and a rainfall decrease of 10 to 20%;
• rainfall changes beyond this threshold could lead to severe water stress even if current water uses are moderated, with droughts occurring in 1 out of 4 years;
• temperature changes beyond this threshold could lead to a decrease in agricultural yield accompanied by an increase in soil erosion for croplands;
• combined changes of temperature and rainfall beyond the thresholds could shift both systems towards a more arid state, leading to severe water stress and significant changes to the support capacity for current agriculture and natural vegetation in both study areas.
Abstract:
This paper presents part of a study that aimed to understand how the emergence of algebraic thinking takes place in a group of four-year-old children, as well as its relationship to the exploration of children's literature. To further deepen and guide this study, the following research questions were formulated: (1) How can children's literature help preschoolers identify patterns? (2) What strategies and thinking processes do children use to create, analyze and generalize repeating and growing patterns? (3) What strategies do children use to identify the unit of repeat of a pattern? (4) What factors influence the identification of patterns? The paper focuses only on the strategies and thinking processes that children use to create, analyze and generalize repeating patterns. The study was developed with a group of 14 preschoolers in a private school in Lisbon and was carried out with all the children. A qualitative research methodology under the interpretive paradigm was chosen, emphasizing meanings and processes. The researcher took the dual role of teacher-researcher, conducting the study with her own group in its natural environment. Participant observation and document analysis (audio and video recordings, photos and children's productions) were used as data collection methods. Data collection took place from October 2013 to April 2014. The results indicate that the children master the concept of repeating patterns and are able to identify the unit of repeat and to create and analyze various repeating patterns, evolving from simpler to more complex forms.