977 results for science standards
Abstract:
Teacher commitment has been found to be a critical predictor of teachers’ work performance, absenteeism, retention, burnout and turnover, as well as having an important influence on students’ motivation, achievement, attitudes towards learning and being at school (Firestone (1996). Educational Administration Quarterly, 32(2), 209–235; Graham (1996). Journal of Physical Education, Recreation and Dance, 67(1), 45–47; Louis (1998). School Effectiveness and School Improvement, 9(1), 1–27; Tsui & Cheng (1999). Educational Research and Evaluation, 5(3), 249–268). It is also a necessary ingredient in the successful implementation of, adaptation of, or resistance to reform agendas. Surprisingly, however, the relationships between teachers’ motivation, efficacy, job satisfaction and commitment, and between commitment and the quality of their work, have not been the subject of extensive research. Some literature presents commitment as a feature of being and behaving as a professional (Helsby, Knight, McCulloch, Saunders, & Warburton (1997). A report to participants on the professional cultures of Teachers Research Project, Lancaster University, January). Others suggest that it fluctuates according to personal, institutional and policy contexts (Louis (1998). School Effectiveness and School Improvement, 9(1), 1–27) and identify different dimensions of commitment which interact and fluctuate (Tyree (1996). Journal of Educational Research, 89(5), 295–304). Still others claim that teachers’ commitment tends to decrease progressively over the course of the teaching career (Fraser, Draper, & Taylor (1998). Evaluation and Research in Education, 12(2), 61–71; Huberman (1993). The lives of teachers. London: Cassell). In this research, experienced teachers in England and Australia were interviewed about their understandings of commitment.
The data suggest that commitment may be better understood as a nested phenomenon, at the centre of which is a set of core, relatively permanent values based upon personal beliefs and images of self, role and identity, which are subject to challenge by socio-politically constructed change.
Abstract:
Doctoral thesis, Environmental Sciences, Universidade de Lisboa, Faculdade de Ciências, Universidade Nova de Lisboa, 2015
Abstract:
Sediment is a major sink for heavy metals in rivers, and poses significant risks not only to river quality but also to aquatic and benthic organisms. At present in the UK there are no mandatory sediment quality standards, partly due to insufficient toxicity data but also due to problems with identifying appropriate sediment monitoring and analytical techniques. The aim of this research was to examine the sampling of different river sediment compartments in order to monitor compliance with any future UK sediment environmental quality standards (EQS). The significance of sediment physical and chemical characteristics for sampling and analysis was also determined. The Ravensbourne River, a tributary of the River Thames located in the highly urbanised south-east of London, was used for this study. Between July 2010 and December 2011, sediment was collected from the bed using a Van Veen grab, from the bank using a hand trowel, and from the water column (suspended sediment) using a time-integrated suspended sediment tube sampler. Total metal extraction carried out using aqua regia found no significant differences between compartments in the metal concentrations retained by the <63 μm sediment fraction, but there were differences between the 63 μm–2 mm fractions of the bed and bank. The metal concentrations in the bed, bank and suspended sediment exceeded the draft UK sediment quality guidelines. Sequential extraction was also carried out to determine metal speciation in each sediment compartment, using the Maiz et al. (1997) and Tessier et al. (1979) methods. The Maiz et al. (1997) method found that over 80% of the metals in each sediment compartment were not bioavailable, while the Tessier et al. (1979) method found most of the metals to be associated with the Fe/Mn oxide and residual phases.
This study suggests that the bed sediment compartment and the <2 mm (<63 μm + 63 μm–2 mm) fraction are the most suitable sediment samples for sediment monitoring.
Abstract:
Plasma membrane-derived vesicles (PMVs) or microparticles are vesicles (0.1–1 μm in diameter) released from the plasma membrane of all blood cell types under a variety of biochemical and pathological conditions. PMVs contain cytoskeletal elements and some surface markers from the parent cell but lack a nucleus and are unable to synthesise macromolecules. They are also defined on the basis that, in most cases, PMVs express varying amounts of the cytosolic leaflet lipid phosphatidylserine, which is externalised to their surface during activation. This marks the PMV as a biologically distinct entity from its parent cell, despite containing surface markers from the original cell, and also explains its role in events such as phagocytosis and thrombosis. There is currently a large amount of variation between investigators with regard to the pre-analytical steps employed in isolating red cell PMVs, or RPMVs (which are slightly smaller than most PMVs), with key differences being centrifugation and sample storage conditions; this often leads to variability in results. Unfortunately, standardization of preparation and detection methods has not yet been achieved. This review highlights and critically discusses the variables contributing to the differences in results obtained by investigators, bringing to light numerous studies in which RPMVs have been analysed but which have not yet been the subject of a review.
Abstract:
As e-learning gradually evolved, many specialized and disparate systems appeared to fulfil the needs of teachers and students, such as repositories of learning objects, authoring tools, intelligent tutors and automatic evaluators. This heterogeneity raises interoperability issues, giving the standardization of content an important role in e-learning. This article presents a survey of current e-learning content aggregation standards, focusing on their internal organization and packaging. This study is part of an effort to choose the most suitable specifications and standards for an e-learning framework called Ensemble, defined as a conceptual tool to organize a network of e-learning systems and services for domains with complex evaluation.
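To make "internal organization and packaging" concrete, the sketch below generates a stripped-down manifest in the spirit of IMS Content Packaging, one of the aggregation standards such surveys cover. The identifiers and the `build_manifest` helper are illustrative assumptions, not taken from the article or the full specification.

```python
# Hypothetical sketch: a minimal IMS-CP-style manifest separating the course
# structure (<organizations>) from the physical files (<resources>).
# Identifiers (MANIFEST-1, RES-1, ...) are made up for illustration.
import xml.etree.ElementTree as ET

def build_manifest(resource_href: str) -> str:
    manifest = ET.Element("manifest", identifier="MANIFEST-1")
    orgs = ET.SubElement(manifest, "organizations")
    org = ET.SubElement(orgs, "organization", identifier="ORG-1")
    item = ET.SubElement(org, "item", identifier="ITEM-1", identifierref="RES-1")
    ET.SubElement(item, "title").text = "Lesson 1"
    resources = ET.SubElement(manifest, "resources")
    res = ET.SubElement(resources, "resource", identifier="RES-1",
                        type="webcontent", href=resource_href)
    ET.SubElement(res, "file", href=resource_href)
    return ET.tostring(manifest, encoding="unicode")

manifest_xml = build_manifest("index.html")
```

The split between a logical organization tree and a flat resource list is exactly the kind of packaging decision that differs between the standards the survey compares.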
Abstract:
The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in the ASTM 1422-05 and 1789-04 standards on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repeated analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
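The likelihood-ratio framework mentioned above can be shown with a tiny worked example. The probabilities here are hypothetical illustrations, not values from the report; in practice P(E|Hd) would be estimated from the statistics of an ink reference database.

```python
# Hedged illustration (hypothetical numbers): the likelihood ratio from
# Bayes' theorem applied to ink evidence E under two competing hypotheses.
p_e_given_hp = 0.90  # P(E | Hp): chance of the observed ink match if the same ink was used
p_e_given_hd = 0.02  # P(E | Hd): chance of the match arising from a different ink,
                     # estimated from an ink reference database (illustrative value)

lr = p_e_given_hp / p_e_given_hd
print(round(lr, 2))  # 45.0: the evidence supports Hp over Hd by a factor of 45
```

This is why the report stresses reproducibility: the numerator and denominator are only meaningful if repeated analyses of the same ink give consistent measurements.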
Abstract:
In "Nietzsche, Genealogy, History", Foucault suggests that genealogy is a sort of "curative science." The genealogist must be a physiologist and a pathologist as well as a historian, for his task is to decipher the marks that power relations and historical events leave on the subjugated body; "he must be able to diagnose the illnesses of the body, its conditions of weakness and strength, its breakdowns and resistances, to be in a position to judge philosophical discourse." But this claim seems to be incongruent with another major task of genealogy. After all, genealogy is supposed to show us that the things we take to be absolute are in fact discontinuous and historically situated: "Nothing in man – not even his body – is sufficiently stable to serve as the basis for self-recognition or for understanding other men." If this is true, then the subjugated body can never be restored to a healthy state, because it has no essential or original nature. There are no universal standards by which we can even distinguish between healthy and unhealthy bodies. So in what sense is genealogy to be a "curative science"? In my thesis, I try to elucidate the complex relationship between genealogy and the body. I argue that genealogy can be a curative science even while it "multiplies our body and sets it against itself." If we place a special emphasis on the role that transgression plays in Foucault's genealogical works, then the healthy body is precisely the body that resists universal standards and classifications. If genealogy is to be a curative science, then it must restore to the subjugated body an "identity" that transgresses its own limits and that constitutes itself, paradoxically, in the very effacement of identity. In the first chapter of my thesis, I examine the body's role as a "surface of the inscription of events."
Power relations inscribe on and around the body an identity or subjectivity that appears to be unified and universal, but which is in fact disparate and historically situated. The "subjected" body is the sick and pathologically weak body. In Chapters 2 and 3, I describe how it is possible for the unhealthy body to become healthy by resisting the subjectivity that has been inscribed upon it. Chapter 4 explains how Foucault's later works fit into this characterization of genealogy.
On Implementing Joins, Aggregates and Universal Quantifier in Temporal Databases using SQL Standards
Abstract:
A feasible way of implementing a temporal database is by mapping a temporal data model onto a conventional data model supported by a commercial database management system. Even though extensions to standard SQL have been proposed for supporting temporal databases, such proposals have not yet made it through the standardization process. This paper attempts to implement database operators such as joins, aggregates and the universal quantifier for temporal databases built on top of relational database systems, using currently available SQL standards.
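The mapping the abstract describes can be sketched in a few lines: temporal data stored as ordinary rows with validity-period columns, and a temporal operator expressed in plain standard SQL. The schema and data below are illustrative assumptions, not the paper's own examples.

```python
# Hedged sketch: a temporal join in standard SQL on a conventional DBMS
# (SQLite here). Each row carries a half-open validity period [vstart, vend).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE emp (name TEXT, dept TEXT, vstart INTEGER, vend INTEGER)")
cur.executemany("INSERT INTO emp VALUES (?, ?, ?, ?)", [
    ("ann", "toys",  1, 6),
    ("bob", "toys",  3, 9),
    ("cat", "books", 2, 5),
])
# Temporal join: colleagues in the same department, restricted to the
# intersection of their validity periods -- no temporal SQL extensions needed.
cur.execute("""
    SELECT a.name, b.name,
           MAX(a.vstart, b.vstart) AS s,   -- start of overlap
           MIN(a.vend,   b.vend)   AS e    -- end of overlap
    FROM emp a JOIN emp b
      ON a.dept = b.dept AND a.name < b.name
    WHERE MAX(a.vstart, b.vstart) < MIN(a.vend, b.vend)
""")
rows = cur.fetchall()
print(rows)  # [('ann', 'bob', 3, 6)]
```

The universal quantifier the title mentions has no direct SQL operator; in standard SQL it is typically rewritten as a double `NOT EXISTS`, which is the kind of workaround the paper explores.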
Abstract:
This work presents a triple-mode sigma-delta modulator for three wireless standards, namely GSM, WCDMA and Bluetooth. A reconfigurable ADC is used to meet the wide bandwidth and high dynamic range requirements of multi-standard receivers at low power consumption. A highly linear sigma-delta ADC with reduced sensitivity to circuit imperfections was chosen for the design; this is particularly suitable for wideband applications where the oversampling ratio is low. Simulation results indicate that the modulator achieves a peak SNDR of 84/68/68 dB over a bandwidth of 0.2/3.84/1.5 MHz with an oversampling ratio of 128/8/8 in the GSM/WCDMA/Bluetooth modes respectively.
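The reported SNDR figures can be translated into effective resolution using the standard ENOB formula (this conversion is a textbook relation, not a claim from the paper):

```python
# Back-of-envelope check: effective number of bits implied by the
# reported peak SNDR in each mode, via ENOB = (SNDR - 1.76) / 6.02.
def enob(sndr_db: float) -> float:
    return (sndr_db - 1.76) / 6.02

peak_sndr = {"GSM": 84.0, "WCDMA": 68.0, "Bluetooth": 68.0}  # dB, from the abstract
bits = {mode: round(enob(s), 1) for mode, s in peak_sndr.items()}
print(bits)  # {'GSM': 13.7, 'WCDMA': 11.0, 'Bluetooth': 11.0}
```

So the modulator trades resolution for bandwidth across modes: roughly 13.7 effective bits over the narrow GSM band versus about 11 bits over the much wider WCDMA and Bluetooth bands.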
Abstract:
The article examines the commodity chain trap of marine fishery in Kerala, in both material and value terms, and its ramifications in globalised fishery chains. The marketing chains, in both material and value terms, are very complex in nature, since they involve many types of markets and a large number of intermediaries and participants. The article also scrutinizes the sensitivity of consumer and country responses in terms of the dietary and hygienic standards relating to the seafood trade. In addition, it discusses the devastating effect of recent stipulations such as the US Bioterrorism Act and the shrimp anti-dumping duty on Kerala fishery products.
Abstract:
Each player in the financial industry, each bank, stock exchange, government agency, or insurance company operates its own financial information system or systems. By its very nature, financial information, like the money that it represents, changes hands. Therefore the interoperation of financial information systems is the cornerstone of the financial services they support. E-services frameworks such as web services are an unprecedented opportunity for the flexible interoperation of financial systems. Naturally, the critical economic role and the complexity of financial information led to the development of various standards. Yet standards alone are not a panacea: different groups of players use different standards, or different interpretations of the same standard. We believe that the solution lies in the convergence of flexible e-services such as web services and semantically rich metadata as promised by the semantic web; a mediation architecture can then be used for the documentation, identification, and resolution of semantic conflicts arising from the interoperation of heterogeneous financial services. In this paper we illustrate the nature of the problem in the Electronic Bill Presentment and Payment (EBPP) industry and the viability of the solution we propose. We describe and analyze the integration of services using four different formats: the IFX, OFX and SWIFT standards, and an example proprietary format. To accomplish this integration we use the COntext INterchange (COIN) framework. The COIN architecture leverages a model of sources' and receivers' contexts in reference to a rich domain model, or ontology, for the description and resolution of semantic heterogeneity.
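The flavour of context mediation can be shown with a toy example. The context names, scale factors and exchange rates below are invented for illustration; this is not the COIN implementation, only the idea that the same numeric field means different things in different source contexts and that a mediator converts between them.

```python
# Hedged sketch of context mediation: an "amount" field carries an implicit
# currency and scale that differ by source; the mediator reconciles them.
# Context names and rates are illustrative assumptions.
CONTEXTS = {
    "ofx_us":   {"currency": "USD", "scale": 1},     # OFX feed, plain US dollars
    "swift_eu": {"currency": "EUR", "scale": 1000},  # SWIFT feed, thousands of euros
}
FX_TO_USD = {"USD": 1.0, "EUR": 1.25}  # illustrative exchange rates

def mediate(amount: float, src: str, dst: str) -> float:
    """Convert an amount from the source context into the receiver's context."""
    s, d = CONTEXTS[src], CONTEXTS[dst]
    in_usd = amount * s["scale"] * FX_TO_USD[s["currency"]]
    return in_usd / (FX_TO_USD[d["currency"]] * d["scale"])

print(mediate(2, "swift_eu", "ofx_us"))  # 2500.0
```

The point of a mediation architecture is that such conversions are derived automatically from declared contexts and a shared ontology, rather than hand-coded pairwise between every two formats.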
Abstract:
The semantic web represents a current research effort to increase the capability of machines to make sense of content on the web. In this class, Peter Scheir will give a guest lecture on the basic principles underlying the semantic web vision, including RDF, OWL and other standards.
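The core idea behind the standards named above is the RDF data model: everything is expressed as subject–predicate–object triples. The identifiers below are illustrative, and plain Python tuples stand in for a real RDF store.

```python
# Minimal illustration of the RDF triple model underlying the semantic web:
# a graph is just a set of (subject, predicate, object) statements.
# The "ex:" identifiers are invented for this example.
triples = {
    ("ex:PeterScheir", "rdf:type",     "ex:Lecturer"),
    ("ex:PeterScheir", "ex:talksAbout", "ex:SemanticWeb"),
    ("ex:SemanticWeb", "ex:uses",      "ex:RDF"),
    ("ex:SemanticWeb", "ex:uses",      "ex:OWL"),
}

def objects(subject, predicate):
    """All objects linked to `subject` by `predicate` -- a one-pattern query."""
    return {o for s, p, o in triples if s == subject and p == predicate}

print(sorted(objects("ex:SemanticWeb", "ex:uses")))  # ['ex:OWL', 'ex:RDF']
```

OWL then adds a vocabulary for stating classes, properties and constraints over such triples, which is what lets machines draw inferences from web content.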
Abstract:
In a vault on the outskirts of Paris, a cylinder of platinum-iridium sits in a safe under three layers of glass. It is the kilogram, kept by the Bureau International des Poids et Mesures (BIPM), which is the international home of metrology. Metrology is the science of measurement, and it is of fundamental importance to us all. It is essential for trade, commerce, navigation, transport, communication, surveying, engineering, and construction. It is essential for medical diagnosis and treatment, health and safety, food and consumer protection, and for preserving the environment—e.g., measuring ozone in the atmosphere. Many of these applications are of particular relevance to chemistry and thus to IUPAC. In all these activities we need to make measurements reliably—to an appropriate and known level of uncertainty. The financial implications of metrology are enormous. In the United States, for example, some 15% of the gross domestic product is spent on healthcare, involving reliable quantitative measurements for both diagnosis and treatment.