931 results for Service Programming Environment
Abstract:
Total particulate matter (TPM) was passively collected inside two classrooms of each of five elementary schools in Lisbon, Portugal. TPM was collected on polycarbonate filters with a 47 mm diameter, placed inside uncovered plastic Petri dishes. The sampling period was from 19 May to 22 June 2009 (35 days of exposure) and the collected TPM masses varied between 0.2 mg and 0.8 mg. The major elements were Ca, Fe, Na, K, and Zn at the μg level, while others were at the ng level. Pearson's correlation coefficients above 0.75 (a high degree of correlation) were found between several elements. Soil-related, traffic-induced soil re-suspension, and anthropogenic emission sources could be identified. Blackboard chalk was also identified through the large presence of Ca. Some of the determined chemical elements are potentially carcinogenic. Quality control of the results showed good agreement, as confirmed by the application of the u-score test.
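For reference (the abstract does not spell it out), the correlation measure used is the standard Pearson coefficient; for two elemental concentration series x_i and y_i over the n filters,

r_{xy} = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i-\bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i-\bar{y})^2}},

where \bar{x} and \bar{y} are the sample means, and values above 0.75 were taken as indicating a high degree of correlation.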
Abstract:
This project aims to provide a service platform for managing and accounting for billable time, through the recording of working hours, vacations and absences (justified or not). Reports based on this information are to be made available, along with the possibility of automatic data analysis, for example to detect excessive absences or overlapping vacations among workers. The emphasis of the project is on providing an architecture that facilitates the inclusion of these functionalities. The project is implemented on the Google App Engine (GAE) platform, in order to provide a solution under the Software as a Service paradigm, with guaranteed availability and data replication. The platform was chosen after an analysis of the main existing cloud platforms: Google App Engine, Windows Azure and Amazon Web Services. The characteristics of each platform were analyzed, namely the programming models, the data models offered, the available services and their respective costs. The platform was selected based on its characteristics at the start date of this project. The solution is structured in layers, with the following components: platform interface, business logic and data access logic. The provided interface is designed in accordance with REST architectural principles, supporting data in JSON and XML formats. An authorization component, based on Spring-Security, was added to this base architecture, with authentication delegated to the Google Accounts services. To allow decoupling between the various layers, the Dependency Injection pattern was used. The use of this pattern reduces the dependence on the technologies used in the different layers. A prototype was implemented to demonstrate the work carried out, allowing interaction with the implemented service functionalities via AJAX requests. This prototype took advantage of several JavaScript libraries and patterns that simplified its development, such as model-view-viewmodel through data binding. To support the development of the project, an agile development approach based on Scrum was adopted to implement the system requirements, expressed as user stories. To ensure the quality of the service implementation, unit tests were carried out; the functionality was also analyzed beforehand, and documentation was later produced using UML diagrams.
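The abstract includes no code; the following Python sketch (hypothetical names; the actual service is a Java application on Google App Engine with Spring-Security) only illustrates the layered structure and the constructor-based Dependency Injection that decouples the layers.

# Minimal sketch of the layered structure described above (hypothetical names).
class TimeEntryRepository:
    """Data-access layer: persistence of worked-hours records."""
    def __init__(self):
        self._entries = []          # stand-in for the GAE datastore

    def save(self, entry):
        self._entries.append(entry)

    def find_by_worker(self, worker_id):
        return [e for e in self._entries if e["worker_id"] == worker_id]


class TimeTrackingService:
    """Business-logic layer: rules about hours, vacations and absences."""
    def __init__(self, repository):
        self._repository = repository   # dependency injected, not created here

    def register_hours(self, worker_id, date, hours):
        if hours <= 0 or hours > 24:
            raise ValueError("invalid number of hours")
        self._repository.save({"worker_id": worker_id, "date": date, "hours": hours})

    def report(self, worker_id):
        entries = self._repository.find_by_worker(worker_id)
        return {"worker_id": worker_id, "total_hours": sum(e["hours"] for e in entries)}


# Interface layer: a REST-style resource would expose these operations as
# JSON/XML over HTTP; here we only show the wiring.
service = TimeTrackingService(TimeEntryRepository())
service.register_hours("w-001", "2011-05-02", 8)
print(service.report("w-001"))   # {'worker_id': 'w-001', 'total_hours': 8}

Because the business-logic layer receives the repository through its constructor, the data-access technology (here an in-memory list standing in for the datastore) can be swapped without touching the business rules.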
Abstract:
The main objective of this work was the development of a two-channel, computer-based vibration analyzer for diagnosis within the scope of machine condition monitoring. An application was developed on an ordinary computer, in LabVIEW, which, through MEMS-type acceleration transducers connected via USB, collects vibration data and processes and presents it to the user. The tools used for data processing are common tools found in several vibration analyzers available on the market, such as frequency spectrum plots, time-domain signals, waterfall plots and overall vibration level values, among others. Although the developed analyzer does not innovate in the analysis tools adopted, it is intended to stand out for its low cost, simplicity and didactic character. This work highlights the advantages, disadvantages and potential of an analyzer of this nature. Some conclusions are drawn regarding its fault diagnosis capability, its capabilities as a didactic tool, the sensors used and the programming language chosen. As its main conclusions, the work shows that the chosen sensors are not suitable for fault diagnosis in an industrial environment, yet they are ideal for making this analyzer a good didactic and training tool.
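As an illustration of the kind of processing such an analyzer performs (the actual tool is implemented in LabVIEW with MEMS accelerometers over USB; all values below are synthetic), a minimal NumPy sketch of a single-sided amplitude spectrum and overall RMS level:

# Sketch of a frequency-spectrum and overall-level computation (synthetic data).
import numpy as np

fs = 1000.0                          # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)      # one second of samples
# synthetic acceleration signal: 50 Hz component plus noise
accel = 1.0 * np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)

# single-sided amplitude spectrum via the FFT
spectrum = np.fft.rfft(accel)
freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
amplitude = 2.0 * np.abs(spectrum) / accel.size

# overall vibration level (RMS of the time signal)
rms = np.sqrt(np.mean(accel ** 2))
print(f"dominant frequency: {freqs[np.argmax(amplitude)]:.1f} Hz, RMS: {rms:.3f}")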
Abstract:
This paper is on the problem of short-term hydro scheduling, particularly concerning head-dependent cascaded hydro systems. We propose a novel mixed-integer quadratic programming approach, considering not only head-dependency, but also discontinuous operating regions and discharge ramping constraints. Thus, an enhanced short-term hydro scheduling is provided due to the more realistic modeling presented in this paper. Numerical results from two case studies, based on Portuguese cascaded hydro systems, illustrate the proficiency of the proposed approach.
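The abstract does not give the model; a minimal sketch of a head-dependent short-term hydro scheduling formulation, with all symbols assumed here rather than taken from the paper, is:

maximize \sum_{t=1}^{T} \lambda_t \sum_{i \in I} p_{i,t}, with p_{i,t} = \eta_i (\alpha_i + \beta_i v_{i,t}) \, q_{i,t},

subject to the water balance v_{i,t} = v_{i,t-1} + a_{i,t} - q_{i,t} - s_{i,t} + \sum_{m \in M_i} (q_{m,t} + s_{m,t}), discharge bounds activated by binary commitment variables u_{i,t} \in \{0,1\}, i.e. q_i^{min} u_{i,t} \le q_{i,t} \le q_i^{max} u_{i,t} (modeling the discontinuous operating regions), and ramping limits |q_{i,t} - q_{i,t-1}| \le \Delta q_i. Here \lambda_t is the market price, v_{i,t} the reservoir volume, q_{i,t} the discharge, s_{i,t} the spillage and M_i the set of upstream plants; the power term is quadratic (bilinear) because the head, and hence the conversion efficiency of plant i, depends on the stored volume.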
Abstract:
This paper is on the unit commitment problem, considering not only the economic perspective, but also the environmental perspective. We propose a bi-objective approach to handle the problem with conflicting profit and emission objectives. Numerical results based on the standard IEEE 30-bus test system illustrate the proficiency of the proposed approach.
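A minimal sketch of such a bi-objective unit commitment formulation (symbols assumed here, not taken from the paper) is:

maximize f_1 = \sum_{t=1}^{T} \sum_{i \in I} [\lambda_t p_{i,t} - C_i(p_{i,t})\, u_{i,t} - SU_i\, y_{i,t}] (profit) and minimize f_2 = \sum_{t=1}^{T} \sum_{i \in I} E_i(p_{i,t})\, u_{i,t} (emissions),

subject to the power balance \sum_{i} p_{i,t} = D_t, generation limits p_i^{min} u_{i,t} \le p_{i,t} \le p_i^{max} u_{i,t}, and minimum up/down-time constraints on the binary commitment variables u_{i,t}. Since the two objectives conflict, a scalarization such as the weighted sum or the \varepsilon-constraint method can be used to trace the set of non-dominated (Pareto-optimal) commitment schedules.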
Abstract:
As teachers, we are challenged every day to solve pedagogical problems and we have to fight for our students' attention in a media-rich world. I will talk about how we use ICT in Initial Teacher Training and give you some insight into what we are doing. The most important benefit of using ICT in education is that it makes us reflect on our practice. There is no doubt that our classrooms need to be updated, but we need to be critical about every piece of hardware, software or service that we bring into them. It is not only because our budgets are short, but also because e-learning is primarily about learning, not technology. Therefore, we need to have the knowledge and skills required to act in different situations, and choose the best tool for the job. Not all subjects are suitable for e-learning, nor do all students have the skills to organize their own study time. Also, not all teachers want to spend time programming or learning about instructional design and metadata. The promised land of easy-to-use authoring tools (e.g. eXe and Reload) that would lead all teachers to become Learning Object authors and share these LOs in repositories failed, as HyperCard, Toolbook and others did before. We need to know a little bit of many different technologies so we can mobilize this knowledge when a situation requires it: integrate e-learning technologies in the classroom, not a flipped classroom, just simple tools. Lecture capture, mobile phones and smartphones, pocket-size camcorders, VoIP, VLE, live video broadcast, screen sharing, free services for collaborative work, saving, sharing and syncing your files. Do not feel pressured to use everything, every time. Just because we have a whiteboard does not mean we have to make it the centre of the classroom. Start from where you are, with your preferred subject and the tools you master. Then go slowly and try some new tool in a non-formal situation and with just one or two students. And you do not need to be alone: subscribe to a mailing list and share your thoughts with other teachers in a dedicated forum, even better if both are part of a community of practice, and share resources. We did that for music teachers and it was a success, reaching 1,000 members in two years. Just do it.
Abstract:
In this paper we describe a case study of an experiment on how reflexivity and technology can enhance learning, by using ePortfolios as a training environment to develop translation skills. Translation is today a multi-skilled job, and translators need to assure their clients of good performance and quality, both in the language and technology domains. To accomplish this, all the tasks and processes the translator carries out are crucial, with pre-translation and post-translation processes being as important as the translation itself, namely as far as autonomy and reflexive and critical skills are concerned. Finally, the need for and relevance of collaborative tasks and networks among virtual translation communities led us to the decision to implement ePortfolios as a tool to develop the required skills and extend the use of the Internet in translation, namely in terminology management phases, for the completion of each task, by helping students manage project deadlines, improving their knowledge of the construction and management of translation resources and deepening their awareness of the concepts related to the development and usability of ePortfolios.
Abstract:
In the context of the Bologna Declaration, a change is taking place in the teaching/learning paradigm. Instead of teaching-centered education, which emphasizes the acquisition and transmission of knowledge, we now speak of learning-centered education, which is more demanding for students. This paradigm promotes a continuum of lifelong learning, where the individual needs to be able to handle knowledge, to select what is appropriate for a particular context, to learn permanently and to understand how to learn in new and rapidly changing situations. One attempt to face these challenges has been the experience of ISCAP regarding the teaching/learning of accounting in the course Managerial Simulation. This paper describes the process of teaching, learning and assessment in an action-based learning environment. After a brief general framework that focuses on education objectives, we report the strengths and limitations of this teaching/learning tool. We conclude with some lessons from the implementation of the project.
Abstract:
Master's in Socio-Organizational Intervention in Health - Specialization area: Health Services Administration and Management Policies.
Abstract:
Project LIHE: the Portuguese Case. ESREA Fourth Access Network Conference – “Equity, Access and Participation: Research, Policy and Practice”. Edinburgh (Scotland), 11-13 December 2003.
Abstract:
The aim of this study was the assessment of exposure to ultrafine particles in the urban environment of Lisbon, Portugal, due to automobile traffic, and consisted of the determination of the alveolar-deposited surface area in an avenue leading to the town center during late spring. This study revealed differentiated patterns for weekdays and weekends, which could be related to the fluxes of automobile traffic. During a typical week, the alveolar-deposited surface area of ultrafine particles varied between 35.0 and 89.2 μm²/cm³, which is comparable with levels reported for other towns, such as in Germany and the United States. These measurements were also complemented by measuring the electrical mobility diameter (varying from 18.3 to 128.3 nm) and number of particles, which showed higher values than those previously reported for Madrid and Brisbane. Also, electron microscopy showed that the collected particles were composed of carbonaceous agglomerates, typical of particles emitted by the exhaust of diesel vehicles. Implications: The approach of this study considers the measurement of the alveolar-deposited surface area of particles in the outdoor urban environment of Lisbon, Portugal. This type of measurement has not been performed so far. Only particulate matter with aerodynamic diameters <2.5 μm (PM2.5) and <10 μm (PM10) has been measured in outdoor environments, and the levels found cannot be held responsible for all the observed health effects. Therefore, exposure to nano- and ultrafine particles has not been assessed systematically, and several authors consider this a real knowledge gap and call for data such as these, which will allow better and more comprehensive epidemiologic studies to be derived. Nanoparticle surface area monitor (NSAM) instruments are recent, and their use has been limited to indoor atmospheres. However, as this study shows, the NSAM is a very powerful tool for outdoor environments as well. As most lung diseases are, in fact, related to deposition in the alveolar region of the lung, the metric used in this study is the ideal one.
Abstract:
Master's in Socio-Organizational Intervention in Health - Specialization area: Health Services Management and Administration Policies.
Abstract:
Master's in Socio-Organizational Intervention in Health - Specialization area: Health Services Administration and Management Policies.
Abstract:
OBJECTIVE: To identify factors that lead people to visit a doctor in Brazil and to assess differences between socioeconomic groups. METHODS: A cross-sectional study comprising 1,260 subjects aged 15 or more was carried out in southern Brazil. Demographic, socioeconomic, health-needs and regular-source-of-care data were analyzed with regard to visits to a doctor within two months of the interview. Adjusted prevalence ratios (PR) and 95% confidence intervals were calculated using Poisson regression. RESULTS: Adjusted PRs showed that being a woman, having stressful life events, having health insurance and having a regular doctor increased the outcome. A dose-related response was found with self-reported health, and the probability of visiting a doctor increased with health needs. Analysis in the chronic disease group revealed that uneducated lower-income subjects had a 62% reduction in the chance of visiting a doctor compared to uneducated higher-income ones. However, as a significant interaction between income and education was seen, years of schooling increased utilization in this group. CONCLUSIONS: Results suggest the existence of health inequity in the poorest group that could be overcome with education. Specific measures reinforcing the importance of having a regular doctor may also improve access in the underserved group.
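For context (the abstract does not spell this out), prevalence ratios from a Poisson regression with a log link are obtained, in the usual formulation, as exponentiated coefficients:

\log E[Y_i \mid x_i] = \beta_0 + \beta^{\top} x_i, \qquad PR_k = e^{\beta_k}, \qquad 95\%\ \text{CI} = e^{\beta_k \pm 1.96\, SE(\beta_k)},

where Y_i indicates a doctor visit in the two-month recall period and x_i collects the demographic, socioeconomic, health-need and regular-source-of-care covariates.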
Abstract:
In recent years, power systems have experienced many changes in their paradigm. The introduction of new players in the management of distributed generation leads to the decentralization of control and decision-making, so that each player is able to act in the market environment. In this new context, it will be very relevant that aggregator players allow medium, small and micro players to act in a competitive environment. In order to achieve their objectives, virtual power players and single players are required to optimize their energy resource management process. To achieve this, it is essential to have financial resources capable of providing access to appropriate decision support tools. As small players have difficulty accessing such tools, it is necessary that these players can benefit from alternative methodologies to support their decisions. This paper presents a methodology based on Artificial Neural Networks (ANN), intended to support smaller players. The methodology uses a training set created from energy resource scheduling solutions obtained with a mixed-integer linear programming (MIP) approach, taken as the reference optimization methodology. The trained network is used to obtain locational marginal prices in a distribution network. The main goal of the paper is to verify the accuracy of the ANN-based approach. Moreover, the use of a single ANN is compared with the use of two or more ANNs to forecast the locational marginal price.
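A minimal sketch of the ANN step described above, using scikit-learn and entirely hypothetical features and targets (in the paper the targets come from the MIP reference scheduling solutions), could look like this:

# Sketch of an ANN trained on MIP-labelled cases to estimate locational marginal prices.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# hypothetical features per scenario: load, wind, PV and fuel-price levels
X = rng.uniform(size=(500, 4))
# hypothetical LMP targets standing in for the MIP-based scheduling results
y = 40 + 30 * X[:, 0] - 10 * X[:, 1] - 5 * X[:, 2] + 8 * X[:, 3] + rng.normal(0, 1, 500)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0),
)
model.fit(X[:400], y[:400])                       # train on the labelled scenarios
mae = np.mean(np.abs(model.predict(X[400:]) - y[400:]))
print(f"mean absolute error on held-out scenarios: {mae:.2f} $/MWh")

Comparing this single network against two or more specialized networks (for example, one per bus or per load level) would follow the same pattern, with the training set split accordingly.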