Abstract:
The development of renewable energy sources and Distributed Generation (DG) of electricity is of major importance on the path towards sustainable development. However, the large-scale management of these technologies is complicated by the intermittency of primary resources (wind, sunshine, etc.) and the small scale of some plants. The aggregation of DG plants gives rise to a new concept: the Virtual Power Producer (VPP). VPPs can reinforce the importance of these generation technologies, making them valuable in electricity markets. VPPs can ensure secure, environmentally friendly generation and optimal management of heat, electricity and cold, as well as optimal operation and maintenance of electrical equipment, including the sale of electricity in the energy market. To attain these goals, there are important issues to deal with, such as reserve management strategies, strategies for bid formulation, the producers' remuneration, and the producers' characterization for coalition formation. This chapter presents the most important concepts related to renewable-based generation integration in electricity markets, using the VPP paradigm. The presented case studies make use of two main computer applications: ViProd and MASCEM. ViProd simulates VPP operation, including the management of plants in operation. MASCEM is a multi-agent electricity market simulator that supports the inclusion of VPPs in the set of players.
Abstract:
Introduction: Hearing loss has a marked impact on a child's development and academic progress. In several developed countries, early detection is part of the national health plan, through universal neonatal hearing screening (UNHS) and also school hearing screening programs (SHSP), but only a few have published national data and revised protocols. Currently in Portugal, UNHS is implemented in the main district hospitals but SHSP are not, and there is still no concrete national data nor published studies on the national situation. Objectives: The incidence of hearing loss and of otological problems was studied in school communities in the north of the country, with 2550 participants between 3 and 17 years old. Methods: Statistical data were collected within the schools with a standard hearing screening protocol. All participants were evaluated with the same protocol: an audiological anamnesis, otoscopy and a screening audiometric exam (500, 1000, 2000 and 4000 Hz). Results: Different otological problems were identified, and the audiometric screening exam revealed auditory thresholds indicating unilateral and bilateral hearing loss in about 5.7% of the cases. Conclusions: The study demonstrated that school hearing screening should take place as early as possible and be part of primary health care, to identify and direct children to appropriate rehabilitation, education and follow-up, thus reducing the high costs of late treatment.
Abstract:
Engineering education includes not only teaching fundamental theoretical concepts but also their verification during practical lessons in laboratories. The usual strategies to carry this out are frequently based on Problem-Based Learning, starting from a given state and proceeding forward to a target state. The possibility or effectiveness of this procedure depends on previous states and on whether the present state was caused by, or resulted from, earlier ones. This often happens in engineering education when the achieved results do not match the desired ones, e.g. when programming code is being developed or when the cause of the wrong behavior of an electronic circuit is being identified. It is thus important to also prepare students to proceed in the reverse way, i.e. given a start state, generate the explanation or even the principles that underlie it. Later on, this sort of skill will be important, for instance, to a doctor taking a patient's history or to an engineer discovering the source of a malfunction. This learning methodology presents pedagogical advantages besides the enhanced preparation of students for their future work. The work presented in this document describes an automation project developed by a group of students in an engineering polytechnic school laboratory. The main objective was to improve the performance of a Braille machine. However, in a scenario of Reverse Problem-Based Learning, students first had to discover and characterize the entire machine's function before being allowed (and being able) to propose a solution for the existing problem.
Abstract:
Report on the Supervised Professional Practice, Master's Degree in Preschool Education
Abstract:
Concepts like E-learning and M-learning are changing the traditional learning place. No longer restricted to well-defined physical places, education in Automation and other Engineering areas is entering the so-called ubiquitous learning place, into which even the more practical knowledge (acquired in lab classes) is now moving, due to emergent concepts such as Remote Experimentation or Mobile Experimentation. While Remote Experimentation is traditionally regarded as the remote access to real-world experiments through a simple web browser running on a PC connected to the Internet, Mobile Experimentation may be seen as the access to those same (or other) experiments through mobile devices, used in M-learning contexts. These two distinct client types (PCs versus mobile devices) pose specific requirements for the remote lab infrastructure, namely the ability to tune the experiment interface according to the characteristics (e.g. display size) of the accessing device. This paper addresses those requirements, namely by proposing a new architecture for a remote lab infrastructure able to accommodate both Remote and Mobile Experimentation scenarios.
Abstract:
Video poker machines, a former symbol of fraud and gambling in Brazil, are now being converted into computer-based educational tools for Brazilian public primary schools and also for governmental and non-governmental institutions dealing with communities in poverty and social exclusion, in an attempt to reduce poverty risks (decreasing money spent on gambling) and promote social inclusion (increasing access and motivation to education). Thousands of illegal gambling machines are seized by federal authorities in Brazil every year, and usually destroyed at the end of the criminal apprehension process. This paper describes a project developed by the University of Southern Santa Catarina, Brazil, responsible for the conversion process of gambling machines, and the social inclusion opportunities derived from it. All project members worked on a volunteer basis, seeking to promote the social inclusion of Brazilian young boys and girls, namely through digital inclusion. So far, the project has been able to convert over 200 gambling machines and install them in over 40 public primary schools, thus directly benefiting more than 12,000 schoolchildren. The initial motivation behind this project was technology-based; however, the different options arising from the conversion process of the gambling machines have also motivated a rather innovative and unique experience in allowing schoolchildren and young people with special (educational) needs to access computer-based pedagogical applications. The availability of these converted machines also helps to place Information and Communication Technologies (ICT) in the daily educational environment of these children and youngsters, thus serving social and cultural inclusion, by establishing a dialogue with the community and their technological expectations, and also directly contributing to their digital literacy.
Abstract:
Between October 1988 and April 1989 a cross-sectional survey was carried out in six of the eight blood banks of Goiânia, Central Brazil. Subjects attending for first-time blood donation in the mornings of the study period (n = 1358) were interviewed and screened for T. cruzi infection as part of a larger study among blood donors. Tests for anti-T. cruzi antibodies were performed simultaneously by the indirect haemagglutination test (IHA) and the complement fixation test (CFT). A subject was considered seropositive when either of the two tests showed a positive result. Information on age, sex, place of birth, migration and socio-economic level was recorded. Results from this survey were compared with seroprevalence rates obtained in previous studies in an attempt to analyse the trend of T. cruzi infection in an endemic urban area. The overall seroprevalence of T. cruzi infection among first-time donors was found to be 3.5% (95% confidence interval 2.5%-4.5%). The seroprevalence rate increased with age up to 45 years and then decreased. Migrants from rural areas had higher seroprevalence rates than subjects from urban counties (1.8%-16.2% vs. 0%-3.6%). A fourfold decrease in prevalence rates was observed when these rates were compared with those of fifteen years earlier. Two possible hypotheses were suggested to explain this difference: 1. a cohort effect related to the decrease of transmission in rural areas and/or 2. a differential proportion of people of rural origin among blood donors between the two periods. The potential usefulness of blood banks as a source of epidemiological information to monitor trends of T. cruzi infection in an urban adult population was stressed.
Abstract:
Abstract I (Pedagogical Practice) - This internship report was written within the scope of the curricular unit Internship in Specialised Teaching, of the Master's in Music Teaching at the Escola Superior de Música de Lisboa. The document is based on the pedagogical practice carried out at the Conservatório de Música David de Sousa – Polo Pombal during the 2014-2015 school year, covering three students at different levels of instruction. This report characterises the teaching institution where the internship took place, as well as each student's performance during the school year, highlighting aspects of motor, aural and expressive competence. This work consisted of an evaluation of my own performance as a trumpet teacher, allowing me to reflect on the stronger and weaker points of my work, so that in the future I may reach a higher level in my teaching activity.
Abstract:
Background: Cardiovascular diseases and other non-communicable diseases are major causes of morbidity and mortality, responsible for 38 million deaths in 2012, 75% occurring in low- and middle-income countries. Most of these countries are facing a period of epidemiological transition, being confronted with an increased burden of non-communicable diseases, which challenge health systems mainly designed to deal with infectious diseases. With the adoption of the World Health Organization "Global Action Plan for the Prevention and Control of Non-communicable Diseases, 2013–2020", the national dimension of risk factors for non-communicable diseases must be reported on a regular basis. Angola has no national surveillance system for non-communicable diseases, and periodic population-based studies can help to overcome this lack of information. CardioBengo will collect information on risk factors, awareness rates and prevalence of symptoms relevant to cardiovascular diseases, to assist decision makers in the implementation of prevention and treatment policies and programs. Methods: CardioBengo is designed as a research structure that comprises a cross-sectional component, providing baseline information, and the assembly of a cohort to follow up the dynamics of cardiovascular disease risk factors in the catchment area of the Dande Health and Demographic Surveillance System of the Health Research Centre of Angola, in Bengo Province, Angola. The World Health Organization STEPwise approach to surveillance questionnaires and procedures will be used to collect information on a representative sex- and age-stratified sample, aged between 15 and 64 years old. Discussion: CardioBengo will recruit the first population cohort in Angola designed to evaluate cardiovascular disease risk factors.
Using the structures already in place at the Dande Health and Demographic Surveillance System and a reliable methodology that generates results comparable with other regions and countries, this study will constitute a useful tool for the surveillance of cardiovascular diseases. As in all longitudinal studies, a strong concern exists regarding dropouts, but strategies such as regular visits to selected participants and strong community involvement are in place to minimize these occurrences.
Abstract:
Dissertation submitted to the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, for the degree of Doctor in Electrical Engineering, speciality in Digital Systems
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions.
Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors by an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref. [37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures.
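As a minimal illustration of the linear mixing model just described, the sketch below builds a synthetic mixed pixel y = M a + noise (columns of M are endmember signatures, a the abundance fractions) and recovers the abundances by plain least squares; all numbers are synthetic, and this unconstrained solve is only a stand-in for the constrained estimators cited in the text.

```python
import numpy as np

rng = np.random.default_rng(0)
bands, p = 50, 3                       # spectral bands, endmembers
M = rng.uniform(0.1, 0.9, (bands, p))  # synthetic endmember spectra
a_true = np.array([0.6, 0.3, 0.1])     # true abundances (sum to one)
y = M @ a_true + rng.normal(0, 1e-3, bands)  # mixed pixel with noise

# With M known, unmixing reduces to a linear least-squares problem.
a_hat, *_ = np.linalg.lstsq(M, y, rcond=None)
print(np.round(a_hat, 2))              # close to [0.6, 0.3, 0.1]
```

A fully constrained solver would additionally enforce nonnegativity and the sum-to-one constraint on the estimated abundances.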
The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data.
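The skewer-projection idea behind PPI can be sketched in a few lines: project every spectral vector onto many random directions and count how often each pixel is an extreme. The data below is purely synthetic, and the MNF preprocessing step mentioned in the text is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pixels, bands, n_skewers = 500, 20, 1000
X = rng.random((n_pixels, bands))       # rows are pixel spectra

scores = np.zeros(n_pixels, dtype=int)
for _ in range(n_skewers):
    skewer = rng.normal(size=bands)     # one random direction
    proj = X @ skewer
    scores[np.argmin(proj)] += 1        # extremes along this skewer
    scores[np.argmax(proj)] += 1

purest = np.argsort(scores)[::-1][:5]   # highest-scoring (purest) pixels
```

In a real pipeline the number of skewers is large precisely because a single random direction can miss an endmember entirely; the cumulative score is what makes the estimate robust.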
ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory, consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices. The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]. We note, however, that VCA works with both projected and unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data.
The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
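The iterative step just described can be sketched as follows, in the spirit of VCA (the function name and all details are ours, not the chapter's): at each step a random direction is projected onto the orthogonal complement of the endmembers found so far, and the pixel with the extreme projection becomes the next candidate endmember.

```python
import numpy as np

def extract_endmembers(X, p, seed=0):
    """Pick p candidate endmember pixels from X (n_pixels x bands)."""
    rng = np.random.default_rng(seed)
    _, bands = X.shape
    E = np.zeros((bands, 0))               # signatures found so far
    idx = []
    for _ in range(p):
        f = rng.normal(size=bands)         # random direction
        if E.shape[1] > 0:
            # keep only the component orthogonal to span(E)
            f = f - E @ np.linalg.pinv(E) @ f
        k = int(np.argmax(np.abs(X @ f)))  # extreme of the projection
        idx.append(k)
        E = np.column_stack([E, X[k]])
    return idx, E
```

Because each new direction is orthogonal to the span of the pixels already selected, a previously chosen pixel projects to (numerically) zero and cannot be picked again; this is what drives the iteration toward distinct simplex vertices.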
Abstract:
Power systems have been experiencing huge changes, mainly due to the substantial increase of distributed generation (DG) and to operation in competitive environments. Virtual Power Players (VPP) can aggregate several players, namely a diversity of energy resources, including distributed generation based on several technologies, electric storage systems (ESS) and demand response (DR). Energy resources management gains increasing relevance in this competitive context. This makes the use of DR more interesting and flexible, giving rise to a wide range of new opportunities. This paper proposes a methodology to support VPPs in the management of DR programs, considering all the existing energy resources (generation and storage units) and the distribution network. The proposed method is based on locational marginal price (LMP) values. The evaluation of the impact of using specific DR programs on the LMP values supports the manager's decision concerning the use of DR. The proposed method has been computationally implemented and its application is illustrated in this paper using a 33-bus network with intensive use of DG.
Abstract:
This chapter appears in Encyclopaedia of Human Resources Information Systems: Challenges in e-HRM edited by Torres-Coronas, T. and Arias-Oliva, M. Copyright 2009, IGI Global, www.igi-global.com. Posted by permission of the publisher. URL: http://www.igi-pub.com/reference/details.asp?id=7737
Abstract:
Taking into account an experiential communicative approach (Fernández-Corbacho, 2014) and an emancipatory critical pedagogy (Jiménez Raya, Lamb & Vieira, 2007), enriched by multisensory approaches (Arslan, 2009), this project aims to contribute to the implementation of practices that reflect the linguistic and cultural varieties of Hispanic America (Liceras, 1995; Beave, 2000) in the Spanish as a foreign language classroom in Portuguese secondary education. In this study, following a qualitative methodological perspective, we set out to analyse: a) Portuguese students' representations of the place of Hispanic America in the teaching and learning of Spanish as a foreign language (Altmann & Vences, 2004; Pérez, 2003), through questionnaire surveys; and b) the treatment of the linguistic and cultural varieties of Spanish in the textbooks used in Portuguese secondary education. In addition, through a case study (Benson, Chik, Gao, Huang & Wang, 2009), we sought to present a sample of possible good didactic-pedagogical practices and materials, aimed at systematic and proactive work with the linguistic and cultural varieties of Spanish, based on a critical (hyper)pedagogy and viewing language as a malleable object capable of fostering citizens who are truly aware of the world. To this end, we created physical and digital materials, which were subsequently implemented with 11th-grade students of beginner-level Spanish in a school cluster in the Aveiro region.
The results show that practices and materials of this kind can favour experiential communicative learning with regard to the creation of critical and active future citizens, fostering the development of students' plurilingual and pluricultural communicative competences and of a critical cultural awareness (Byram, Gribkova & Starkey, 2002), in the teaching and learning context of secondary education.
Abstract:
Scientific literature has underlined the perpetuation of gender-based inequality factors in the labour market, despite the ongoing endeavour of various political bodies and legal norms against the vertical and horizontal segregation of women. National and European statistical data show the continued relevance of the theories of labour market segmentation dating back to the 1970s. Hence, the European Community considers as a priority in the Europe 2020 strategy the definition of "policies to promote gender equality [...] to increase labour force participation thus adding to growth and social cohesion". If we consider that, on the one hand, social economy is fairly recognised to be equated with market actors and the State for its economic and social role in tackling the current crisis, and, on the other hand, that the ideals of the sector, systematised in the "Framework Law of Social Economy" (Law no. 30/2013, of 8 May), particularly in article 5, propose "the respect for the values [...] of equality and non-discrimination [...], justice and equity [...]", we aim to reflect on indicators that uncover vertical and horizontal segregation in the labour market. Departing from a mixed methodological approach (extensive and intensive) on the topic of "Social Entrepreneurship in Portugal" in social economy organisations, we detect very high rates of feminisation of employment, with a ratio of 1 man (23%) for every 3 women (77%). Women are mainly assigned to technical and operational activities, arising from the privileged intervention areas, namely education, training, health, the elderly, families and poverty, and are ultimately underrepresented in statutory boards and, as such, far removed from deliberations and strategic resolutions. This is particularly visible in the existing hierarchy of functions and in management practices that are the responsibility of male members.
Thus, it seems easily verified that the sector is travelling away from the ideals of justice and social equity, which can crystallise the "non-place" of women in the definition of the strategic direction of social economy and in the most invisible/private "place" of the organisational setting.