8 results for 617.1026
in Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
This study aims to validate a writing assessment test for children from the 2nd to the 4th year of schooling. We hypothesised that: (1) children first master the basic phonological rules and only later the contextual and the morphological rules, from the simple to the complex, so it is possible to build a test that assesses this progression; (2) there would be a relationship between the test items and two fundamental factors: a phonological factor and a morphological factor. The study covered 610 children from 13 schools in the districts of Lisboa and Évora. The children completed a writing test and were rated by their teachers according to their competence. The results show that at age 7 children master sound-letter correspondence. Up to age 8.5 they represent less complex words on the basis of a phonetic strategy. They show greater knowledge of morphological rules between ages 9.5 and 10. The analysis of the factorial validity of the test items confirms that writing draws on two sources of linguistic knowledge: phonology and morphology. The approaches developed to validate the test are described: content validity, construct validity, and criterion validity through the correlation (a) between the scores obtained on the writing test and on a reading test, and (b) between the scores on the writing test and the teachers' global assessment of the students' spelling competence. The correlation coefficients obtained suggest that the students' reading and writing abilities were measured consistently and that the writing test reflects the students' spelling competence in school activities.
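As a purely illustrative sketch of the criterion-validity step described above (correlating writing-test scores with reading-test scores and with teacher ratings), the following Python snippet computes Pearson correlation coefficients; all scores and effect sizes are invented placeholders, not data from the study.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 610  # sample size reported in the abstract

    # Invented standardized scores for each child (placeholders only).
    writing = rng.normal(size=n)
    reading = 0.7 * writing + rng.normal(scale=0.7, size=n)  # correlated by construction
    teacher = 0.6 * writing + rng.normal(scale=0.8, size=n)

    r_read, p_read = stats.pearsonr(writing, reading)    # criterion validity (a)
    r_teach, p_teach = stats.pearsonr(writing, teacher)  # criterion validity (b)
    print(f"writing vs reading: r = {r_read:.2f} (p = {p_read:.3g})")
    print(f"writing vs teacher rating: r = {r_teach:.2f} (p = {p_teach:.3g})")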
Abstract:
Study objectives: a) to assess the sustained effect of implantable cardioverter-defibrillator (ICD) therapy on the quality of life of patients with heart failure; b) to assess the relationship between hope and quality of life.
Abstract:
Beaches worldwide provide recreational opportunities to hundreds of millions of people and serve as important components of coastal economies. Beach water is often monitored for microbiological quality to detect indicators of human sewage contamination and thereby prevent public health outbreaks associated with water contact. However, growing evidence suggests that beach sand can harbor microbes harmful to human health, often in concentrations greater than in the adjacent beach water. Currently, there are no standards for monitoring, sampling, analyzing, or managing beach sand quality. In addition to indicator microbes, studies have identified pathogenic bacteria, viruses, and fungi in a variety of beach sands worldwide. The public health threat these populations pose through direct and indirect contact is unknown, because very little research has addressed health outcomes associated with sand quality. In this manuscript, we present the consensus findings of a workshop of experts convened in Lisbon, Portugal, to discuss the current state of knowledge on beach sand microbiological quality and to develop suggestions for standardizing the evaluation of sand at coastal beaches. The expert group at the "Microareias 2012" workshop recommends that 1) beach sand should be screened for a variety of pathogens harmful to human health, and sand monitoring should then be initiated alongside regular water monitoring; 2) sampling and analysis protocols should be standardized to allow proper comparisons among beach locations; and 3) further studies are needed to estimate the human health risk associated with exposure to contaminated beach sand. Much of the manuscript focuses on research specific to Portugal, but similar results have been found elsewhere, and the findings have worldwide implications.
Abstract:
By identifying the rupture in the development of science between the context of discovery and the context of justification, we believe we can better understand the dominance of transmissive science teaching in schools and argue more effectively for overcoming it. Moreover, the argument we defend here for bringing back into the classroom the powerful cultural flame that science carries in its concepts, laws, and theories, as well as in its own process of development, has a tradition rooted in Portuguese culture, which we explore.
Abstract:
Undoped InOx and commercial ITO thin films are integrated into laboratory-assembled light shutter devices. Undoped transparent conductive InOx thin films, about 100 nm thick, are deposited by radiofrequency plasma-enhanced reactive thermal evaporation (rf-PERTE) of indium teardrops with no intentional heating of the glass substrates. Deposition proceeds at very low rates (0.1-0.3 nm/s) to establish an optimized reaction between the oxygen plasma and the metal vapor. These films show the following main characteristics: transparency of 87% (wavelength λ = 632.8 nm) and sheet resistance of 52 Ω/sq; the commercial ITO films show a transparency of 92% and a sheet resistance of 83 Ω/sq. The InOx thin-film surface characterized by AFM shows a uniform grain texture with a root mean square surface roughness of Rq ≈ 2.276 nm. In contrast, the commercial ITO topography is characterized by two regions: a smoother one with Rq ≈ 0.973 nm and one with large grains (Rq ≈ 3.617 nm). For the shutters assembled using commercial ITO, the light transmission coefficient (Tr) reaches a highest value (Tr-max) of 89% and a lowest (Tr-min) of 1.3% [13], while for the InOx shutters these values are 80.1% and 3.2%, respectively. Regarding the electric field required to achieve 90% of the maximum transmission in the ON state (E-on), the devices assembled with commercial ITO-coated glasses require 2.41 V/μm, while those assembled with InOx-coated glasses require less, 1.77 V/μm. These results corroborate that device quality depends on the base materials and the fabrication process used.
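For a quick sense of the figures of merit quoted above, this small Python sketch computes the light-shutter contrast ratio Tr-max/Tr-min for both assemblies, using only the transmission values stated in the abstract.

    # Contrast ratio Tr_max / Tr_min, using the values quoted above.
    shutters = {
        "commercial ITO": (89.0, 1.3),  # (Tr_max %, Tr_min %)
        "undoped InOx": (80.1, 3.2),
    }
    for name, (tr_max, tr_min) in shutters.items():
        print(f"{name}: contrast ratio = {tr_max / tr_min:.1f}")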
Abstract:
Behavioral biometrics is one of the areas of growing interest within the biosignal research community. A recent trend in the field is ECG-based biometrics, where electrocardiographic (ECG) signals are used as input to the biometric system. Previous work has shown this to be a promising trait, with the potential to serve as a good complement to other, more established modalities, due to its intrinsic characteristics. In this paper, we propose a system for ECG biometrics centered on signals acquired at the subject's hand. Our work builds on a previously developed custom, non-intrusive sensing apparatus for data acquisition at the hands, and involves pre-processing of the ECG signals and the evaluation of two classification approaches targeted at real-time or near real-time applications. Preliminary results show that this system yields competitive results for both authentication and identification, and they further validate the potential of ECG signals as a complementary modality in the toolbox of the biometric system designer.
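The abstract does not name the two classification approaches; as a hedged, minimal sketch of a pipeline with the same general shape (heartbeat windows used as feature vectors, nearest-neighbour identification), the Python below runs on entirely synthetic data with invented parameters.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)

    # Synthetic stand-in data: 20 subjects, 30 heartbeats each,
    # every heartbeat a 200-sample window centred on the R-peak.
    n_subjects, n_beats, win = 20, 30, 200
    templates = rng.normal(size=(n_subjects, win))  # one "signature" per subject
    beats = (np.repeat(templates, n_beats, axis=0)
             + 0.3 * rng.normal(size=(n_subjects * n_beats, win)))
    labels = np.repeat(np.arange(n_subjects), n_beats)

    X_tr, X_te, y_tr, y_te = train_test_split(
        beats, labels, test_size=0.25, random_state=0, stratify=labels)

    # Identification: 1-NN on raw heartbeat windows (Euclidean distance).
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_tr, y_tr)
    print("identification accuracy:", clf.score(X_te, y_te))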
Abstract:
The main goals of the present work are to evaluate the influence of several variables and test parameters on the melt flow index (MFI) of thermoplastics, and to determine the uncertainty associated with the measurements. To evaluate the influence of test parameters on the measurement of MFI, the design of experiments (DOE) approach was used. The uncertainty was calculated using the "bottom-up" approach given in the "Guide to the Expression of Uncertainty in Measurement" (GUM). Since no analytical expression relates the output response (MFI) to the input parameters, it was necessary to build mathematical models by fitting the experimental observations of the response variable against each input parameter. Subsequently, the uncertainty associated with the measurement of MFI was determined by applying the law of propagation of uncertainty to the uncertainties of the input parameters. Finally, the activation energy (Ea) of the melt flow at around 200 °C and its uncertainty were also determined.
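As a minimal illustration of the final step, assuming the usual Arrhenius-type model MFI ∝ exp(-Ea/(R·T)), the sketch below estimates Ea from two hypothetical MFI measurements around 200 °C and propagates their standard uncertainties with the GUM law of propagation; all numerical values are invented.

    import numpy as np

    R = 8.314  # gas constant, J/(mol·K)

    # Hypothetical MFI measurements (g/10 min) at two temperatures near 200 °C.
    T1, T2 = 463.15, 483.15  # K (190 °C and 210 °C)
    mfi1, u1 = 5.0, 0.2      # measured value and standard uncertainty
    mfi2, u2 = 9.0, 0.3

    # The Arrhenius-type model gives, from the two points:
    k = R / (1.0 / T1 - 1.0 / T2)
    Ea = k * np.log(mfi2 / mfi1)

    # GUM law of propagation of uncertainty (inputs assumed uncorrelated):
    # u(Ea)^2 = (dEa/dmfi1 * u1)^2 + (dEa/dmfi2 * u2)^2
    u_Ea = np.hypot(k / mfi1 * u1, k / mfi2 * u2)
    print(f"Ea = {Ea / 1000:.1f} ± {u_Ea / 1000:.1f} kJ/mol")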
Abstract:
This paper introduces a new hyperspectral unmixing method called Dependent Component Analysis (DECA). The method decomposes a hyperspectral image into a collection of reflectance (or radiance) spectra of the materials present in the scene (endmember signatures) and the corresponding abundance fractions at each pixel. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The method overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and of geometry-based approaches. DECA's performance is illustrated using simulated and real data.
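DECA's GEM inference is beyond a short snippet, but the generative model it assumes, linear mixing with abundances on the simplex (non-negative and summing to one), can be sketched in a few lines of Python; the dimensions, endmember signatures, and Dirichlet parameters below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(2)

    # Linear mixing model: each pixel spectrum = endmembers @ abundances + noise.
    n_bands, n_endmembers, n_pixels = 50, 3, 1000
    M = rng.uniform(0.0, 1.0, size=(n_bands, n_endmembers))  # endmember signatures

    # Dirichlet-distributed abundances satisfy the non-negativity and
    # constant-sum constraints that DECA enforces via its Dirichlet mixtures.
    A = rng.dirichlet(alpha=(2.0, 2.0, 2.0), size=n_pixels).T  # (n_endmembers, n_pixels)
    Y = M @ A + 0.01 * rng.normal(size=(n_bands, n_pixels))    # observed data

    print("abundance columns sum to ~1:", A.sum(axis=0)[:5])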