961 results for Scientific Community
Abstract:
This study analyses the educational practices carried out by a public pre-primary and primary school with regard to strategies for teaching and learning science. To that end, we present what current educational research proposes about the teaching and learning of science, which emphasises the need to place the child at the centre of his or her own educational process, starting from the child's intuitive knowledge, so that the child thinks, acts and communicates in a way similar to that followed by the scientific community. According to current studies, this way of understanding science education is a sine qua non condition for pupils to develop scientific competence, that is, to learn science by learning how science works and learning about science. Building on this theoretical framework, direct observations were carried out in different year groups and recorded in an observation diary, in order to analyse how the school develops its day-to-day educational practice in science.
Abstract:
The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions can be studied rigorously and in great detail using a considerable variety of Quantum Dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that both strategies are, to a great extent, not competing but rather complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches provide much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies or the computational effort required to account for the Coriolis couplings, are also analyzed in this paper.
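For context, both types of code ultimately yield reaction probabilities resolved in total angular momentum J, from which cross sections are assembled. Schematically, for a single initial state (generic notation, not taken from this abstract):

    \sigma_r(E_c) = \frac{\pi}{k^2} \sum_{J=0}^{J_{\max}} (2J+1)\, P_r^{J}(E_c), \qquad k^2 = \frac{2\mu E_c}{\hbar^2}

where E_c is the collision energy, \mu the reduced mass of the reactants, and P_r^{J} the reaction probability at total angular momentum J as obtained from the scattering calculation.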
Abstract:
We describe the use of hedging strategies in 40 review articles (RAs) published in Spanish in Ibero-American journals between 1994 and 2004. We identified the strategies employed in the rhetorical sections by means of a contextual genre analysis and classified the hedges into five categories: impersonal constructions, temporal deictics, approximators, shields, and compound hedges. The results show abundant and varied hedging strategies in the three rhetorical sections of the RA, although they are more frequent in the body and in the introduction. The epistemic modal "poder" (can/may), adverbs and adjectives of possibility and probability, and epistemic verbs predominate. We also recorded several impersonal constructions used for hedging. Approximators are used to express honesty and various degrees of certainty in the propositions; temporal deictics, to express provisionality and to suggest several interpretations. We conclude that hedging may be related to the authors' position within the scientific community and to the characteristics of the RA as a discourse genre. Likewise, the level of expectation in the writing of this genre may condition how propositions are presented.
Abstract:
At a time when disciplined inference and decision making under uncertainty represent common aims for participants in legal proceedings, the scientific community is remarkably heterogeneous in its attitudes as to how these goals ought to be achieved. Probability and decision theory exert a considerable influence, and we think rightly so, but they run against a mainstream of thinking that does not embrace, or is not aware of, the 'normative' character of this body of theory. It is normative, in the sense understood in this article, in that it prescribes particular properties, typically (logical) coherence, to which reasoning and decision making ought to conform. Disregarding these properties can result in diverging views which are occasionally used as an argument against the theory, or as a pretext for not following it. Typical examples are objections according to which people, both in everyday life and at various levels of the judicial process, find the theory difficult to understand and to apply. A further objection is that the theory does not reflect how people actually behave. This article aims to point out in what sense these examples misinterpret the analytical framework in its normative perspective. Through examples borrowed mostly from forensic science contexts, it is argued that so-called intuitive scientific attitudes are particularly liable to such misconceptions. These attitudes are contrasted with a statement of the actual liberties and constraints of probability and decision theory and the view according to which this theory is normative.
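As one illustration of the coherence such a normative framework prescribes (a standard formulation in forensic evidence evaluation, offered here as background rather than quoted from the article), posterior odds on competing propositions H_p and H_d follow from the prior odds and the likelihood ratio of the evidence E:

    \frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)} \times \frac{\Pr(H_p)}{\Pr(H_d)}

Reasoning that violates this relationship is incoherent in the sense discussed above, however intuitive it may feel.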
Abstract:
BACKGROUND: Until recently, neurosurgeons eagerly removed cerebellar lesions without consideration of future cognitive impairment that might be caused by the resection. In children, transient cerebellar mutism after resection has led to a diminished use of midline approaches and vermis transection, as well as reduced retraction of the cerebellar hemispheres. The role of the cerebellum in higher cognitive functions beyond coordination and motor control has recently attracted significant interest in the scientific community, and might change the neurosurgical approach to these lesions. The aim of this study was to investigate the specific effects of cerebellar lesions on memory, and to assess a possible lateralisation effect. METHODS: We studied 16 patients diagnosed with a cerebellar lesion, from January 1997 to April 2005, at the Centre Hospitalier Universitaire Vaudois (CHUV), Lausanne, Switzerland. Different neuropsychological tests assessing short-term and anterograde memory, in verbal and visuo-spatial modalities, were performed pre-operatively. RESULTS: Severe memory deficits in at least one modality were identified in a majority (81%) of patients with cerebellar lesions. Only 1 patient (6%) had no memory deficit. In our series, lateralisation of the lesion did not lead to a significant difference in verbal or visuo-spatial memory deficits. FINDINGS: These findings are consistent with the literature concerning memory deficits in isolated cerebellar lesions, and can be explained by anatomical pathways. However, the cross-lateralisation theory could not be demonstrated in our series. The high percentage of patients with a cerebellar lesion who demonstrate memory deficits should lead us to assess memory in all patients with cerebellar lesions.
Abstract:
The European Space Agency's Gaia mission will create the largest and most precise three-dimensional chart of our galaxy (the Milky Way), by providing unprecedented position, parallax, proper motion, and radial velocity measurements for about one billion stars. The resulting catalogue will be made available to the scientific community and will be analyzed in many different ways, including the production of a variety of statistics. The latter will often entail the generation of multidimensional histograms and hypercubes as part of the precomputed statistics for each data release, or for scientific analysis involving either the final data products or the raw data coming from the satellite instruments. In this paper we present and analyze a generic framework that allows the hypercube generation to be easily done within a MapReduce infrastructure, providing all the advantages of the new Big Data analysis paradigm but without dealing with any specific interface to the lower-level distributed system implementation (Hadoop). Furthermore, we show how executing the framework for different data storage model configurations (i.e. row or column oriented) and compression techniques can considerably improve the response time of this type of workload for the currently available simulated data of the mission. In addition, we put forward the advantages and shortcomings of deploying the framework on a public cloud provider, benchmark it against other popular solutions available (which are not always the best for such ad-hoc applications), and describe some user experiences with the framework, which was employed for a number of dedicated workshops on astronomical data analysis techniques.
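To make the map/reduce idea behind hypercube generation concrete, here is a minimal, self-contained sketch in plain Python; the column names, bin widths and record format are illustrative assumptions and do not reflect the framework's actual interface or the Gaia data model.

    # Sketch of a multidimensional histogram ("hypercube") built in a map/reduce
    # style: map each catalogue row to a (cell, 1) pair, then reduce by summing
    # counts per cell. Column names and bin widths are hypothetical.
    from collections import defaultdict
    from functools import reduce

    BIN_WIDTH = {"mag": 0.5, "parallax_mas": 1.0}

    def map_record(record):
        # Map step: assign the row to a hypercube cell identified by bin indices.
        cell = tuple((dim, int(record[dim] // width)) for dim, width in BIN_WIDTH.items())
        return (cell, 1)

    def reduce_counts(acc, pair):
        # Reduce step: accumulate the per-cell counts.
        cell, count = pair
        acc[cell] += count
        return acc

    # Toy "catalogue" standing in for simulated mission data.
    catalogue = [
        {"mag": 12.3, "parallax_mas": 2.1},
        {"mag": 12.4, "parallax_mas": 2.9},
        {"mag": 15.0, "parallax_mas": 0.4},
    ]

    hypercube = reduce(reduce_counts, map(map_record, catalogue), defaultdict(int))
    for cell, count in sorted(hypercube.items()):
        print(cell, count)

In an actual MapReduce deployment the reduce step would run distributed across a Hadoop cluster and the cells would span more dimensions, but the map/reduce decomposition of the workload is the same.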
Abstract:
Crops and forests are already responding to rising atmospheric carbon dioxide and air temperatures. Increasing atmospheric CO2 concentrations are expected to enhance plant photosynthesis. Nevertheless, after long-term exposure, plants acclimate and show a reduction in photosynthetic activity (i.e. down-regulation). If in the future the Earth's temperature is allowed to rise further, plant ecosystems and food security will both face significant threats. The scientific community has recognized that an increase in global temperatures should remain below 2°C in order to combat climate change. All this evidence suggests that, in parallel with reductions in CO2 emissions, a more direct approach to mitigate global warming should be considered. We propose here that global warming could be partially mitigated directly through local bio-geoengineering approaches. For example, this could be done through the management of solar radiation at surface level, i.e. by increasing global albedo. Such an effect has been documented in the south-eastern part of Spain, where a significant surface air temperature trend of -0.3°C per decade has been observed due to a dramatic expansion of greenhouse horticulture.
Abstract:
Although approximately 50% of Down Syndrome (DS) patients have heart abnormalities, they exhibit an overprotection against cardiac abnormalities related to the connective tissue, for example a lower risk of coronary artery disease. A recent study reported the case of a person affected by DS who carried mutations in FBN1, the causative gene for a connective tissue disorder called Marfan Syndrome (MFS). The fact that the person did not have any cardiac alterations suggested compensation effects due to DS. This observation is supported by a previous DS meta-analysis at the molecular level in which we found an overall upregulation of FBN1 (which is usually downregulated in MFS). Additionally, that result was cross-validated with independent expression data from DS heart tissue. The aim of this work is to elucidate the role of FBN1 in DS and to establish a molecular link to MFS and MFS-related syndromes using a computational approach. To this end, we conducted different analytical approaches over two DS studies (our previous meta-analysis and independent expression data from DS heart tissue) and revealed expression alterations in the FBN1 interaction network, in FBN1 co-expressed genes, and in FBN1-related pathways. After merging the significant results from the different datasets with a Bayesian approach, we prioritized 85 genes that were able to distinguish control from DS cases. We further found evidence that several of these genes (47%), such as FBN1, DCN, and COL1A2, are dysregulated in MFS and MFS-related diseases. Consequently, we encourage the scientific community to take FBN1 and its related network into account in the study of DS cardiovascular characteristics.
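Purely as an illustration of what merging per-gene evidence across datasets can look like, the sketch below combines two p-values per gene with Fisher's method. This is a stand-in for exposition only, not the Bayesian merging procedure actually used in the study (whose details are not given in the abstract), and the p-values are invented.

    # Illustrative only: combine per-gene evidence from two datasets with
    # Fisher's method. This is NOT the study's Bayesian approach; the p-values
    # below are invented for the example.
    from scipy.stats import combine_pvalues

    evidence = {
        # gene: (p-value in the meta-analysis, p-value in DS heart-tissue data)
        "FBN1":   (0.004, 0.020),
        "DCN":    (0.010, 0.050),
        "COL1A2": (0.030, 0.008),
    }

    combined = {gene: combine_pvalues(ps, method="fisher")[1] for gene, ps in evidence.items()}
    for gene, p in sorted(combined.items(), key=lambda kv: kv[1]):
        print(f"{gene}: combined p = {p:.3g}")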
Abstract:
The possible connection between chronic oral inflammatory processes, such as apical periodontitis and periodontal disease (PD), and systemic health is one of the most interesting questions facing the medical and dental scientific community. Chronic apical periodontitis shares important characteristics with PD: 1) both are chronic infections of the oral cavity, 2) the Gram-negative anaerobic microbiota found in both diseases is comparable, and 3) in both infectious processes, increased local levels of inflammatory mediators may have an impact on systemic levels. One of the systemic disorders linked to PD is diabetes mellitus (DM); it is therefore plausible to assume that chronic apical periodontitis and endodontic treatment are also associated with DM. The status of knowledge regarding the relationship between DM and endodontics is reviewed here. Upon review, we conclude that there are data in the literature that associate DM with a higher prevalence of periapical lesions, greater size of the osteolytic lesions, a greater likelihood of asymptomatic infections, and a worse prognosis for root-filled teeth. The results of some studies suggest that periapical disease may contribute to diabetic metabolic dyscontrol.
Abstract:
Although global environmental governance has traditionally couched global warming in terms of annual CO2 emissions (a flow), global mean temperature is actually determined by cumulative CO2 emissions in the atmosphere (a stock). Thanks to advances made by the scientific community, it is nowadays possible to quantify the "global carbon budget", that is, the amount of cumulative CO2 emissions still available before crossing the 2°C threshold (Meinshausen et al., 2009). The present approach proposes to analyze the allocation of this global carbon budget among countries as a classical conflicting claims problem (O'Neill, 1982). Based on some appealing principles, an efficient and sustainable allocation of the available carbon budget from 2000 to 2050 is proposed, taking into account different environmental risk scenarios. Keywords: Carbon budget, Conflicting claims problem, Distribution, Climate change. JEL classification: C79, D71, D74, H41, H87, Q50, Q54, Q58.
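For readers unfamiliar with the formalism, a conflicting claims problem can be stated compactly as follows (generic notation; the specific rule and principles adopted in the paper are not spelled out in the abstract). It is a pair (E, c), where E >= 0 is the endowment to be divided (here, the remaining carbon budget) and c = (c_1, ..., c_n) are the agents' claims (here, countries' claimed emission entitlements), with \sum_i c_i >= E. A rule selects awards x = (x_1, ..., x_n) satisfying 0 <= x_i <= c_i and \sum_i x_i = E. One standard example is the proportional rule:

    x_i = \frac{c_i}{\sum_j c_j}\, E

Other classical rules (constrained equal awards, constrained equal losses, the Talmud rule) satisfy different fairness principles; which of these the paper adopts is not stated in the abstract.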
Abstract:
From its beginning, the field of questioned documents has been concerned with dating. Proposed methods usually rely on complex processes, and controversy within the scientific community remains high. Every document dating method intended for application in forensic casework must fulfill validation requirements. Moreover, source inference must also be taken into account in the interpretation of the dating evidence. To date, most methods still fail to be adequately validated and should be applied with extreme caution. The limitations of the methods used must be adequately disclosed and documented.
Abstract:
For more than a decade, scientists have tried to develop methods capable of dating ink by monitoring the loss of phenoxyethanol (PE) over time. While many methods have been proposed in the literature, few have really been used to solve practical cases, and they still raise much concern within the scientific community. In fact, due to the complexity of ink drying processes, it is particularly difficult to find a reliable ageing parameter that reproducibly follows ink ageing. Moreover, systematic experiments are required in order to evaluate how different factors actually influence the results over time. Therefore, this work aimed at evaluating the capacity of four different ageing parameters to reliably follow ink ageing over time: (1) the quantity of the solvent PE in an ink line, (2) the relative peak area (RPA), normalising the PE results using stable volatile compounds present in the ink formulation, (3) the solvent loss ratio (R%), calculated from PE results obtained by the analyses of naturally and artificially aged samples, and (4) a modified solvent loss ratio (R%*), calculated from RPA results. After determining the limits of reliable measurement of the analytical method, the repeatability of the different ageing parameters was evaluated over time, as well as the influence of ink composition, writing pressure and storage conditions on the results. Surprisingly, our results showed that R% was not the most reliable parameter, as it showed the highest standard deviation. Discussion of the results from an ink dating perspective suggests that other proposed parameters, such as RPA values, may be more adequate for following ink ageing over time.
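As background, a form of the solvent loss ratio commonly cited in the ink-dating literature compares the PE content of a sample before and after artificial (heat-induced) ageing; the exact definition adopted in this work may differ, so the expression below is only indicative:

    R\% = \frac{PE_{\text{natural}} - PE_{\text{artificially aged}}}{PE_{\text{natural}}} \times 100

with R%* obtained analogously from RPA values rather than absolute PE quantities, as stated in the abstract.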
Abstract:
SEPServer is a three-year collaborative project funded by the Seventh Framework Programme (FP7-SPACE) of the European Union. The objective of the project is to provide the scientific community with access to state-of-the-art observations and analysis tools for solar energetic particle (SEP) events and related electromagnetic (EM) emissions. The project will eventually lead to a better understanding of the particle acceleration and transport processes at the Sun and in the inner heliosphere. These processes lead to SEP events that form one of the key elements of space weather. In this paper we present the first results from the systematic analysis work performed on the following datasets: SOHO/ERNE, SOHO/EPHIN, ACE/EPAM, Wind/WAVES and GOES X-rays. A catalogue of SEP events at 1 AU, with complete coverage over solar cycle 23, based on high-energy (~68 MeV) protons from SOHO/ERNE and electron recordings of the events by SOHO/EPHIN and ACE/EPAM, is presented. A total of 115 energetic particle events have been identified and analysed using velocity dispersion analysis (VDA) for protons and time-shifting analysis (TSA) for electrons and protons, in order to infer the SEP release times at the Sun. EM observations around the times of the SEP event onsets have been gathered and compared to the particle release time estimates. Data from those events that occurred during European day-time, i.e., those that also have observations from the ground-based observatories included in SEPServer, are listed and a preliminary analysis of their associations is presented. We find that VDA results for protons can be a useful tool for the analysis of proton release times, but if the derived proton path length falls outside the range 1 AU < s < 3 AU, the result of the analysis may be compromised, as indicated by the anti-correlation of the derived path length and the release time delay from the associated X-ray flare. The average path length derived from VDA is about 1.9 times the nominal length of the spiral magnetic field line. This implies that the path length of first-arriving MeV to deka-MeV protons is affected by interplanetary scattering. TSA of near-relativistic electrons results in a release time that shows significant scatter with respect to the EM emissions, but with a trend of being delayed more with increasing distance between the flare and the nominal footpoint of the Earth-connected field line.
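For reference, the velocity dispersion analysis mentioned above rests on a simple kinematic relation (generic notation, not the paper's own): if particles of speed v(E) are released simultaneously at time t_rel and travel a common path length s, their onset time at the observer is

    t_{\text{onset}}(E) = t_{\text{rel}} + \frac{s}{v(E)}

so a linear fit of the observed onset times against 1/v(E) gives the release time as the intercept and the apparent path length as the slope. TSA instead fixes s at a nominal value (e.g. the length of the spiral field line) and shifts each observed onset back by s/v.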