14 results for Control the use of pesticides
at Instituto Politécnico do Porto, Portugal
Abstract:
This abstract presents an energy management system included in a SCADA system existing in an intelligent home. The system controls the home's energy resources according to the players' definitions (electricity consumption and comfort levels), the real-time variation of electricity prices, and the demand response (DR) events proposed by the aggregators.
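As a rough illustration of the control behaviour described, here is a toy sketch, entirely hypothetical and not the paper's system: non-essential loads are curtailed when the real-time price exceeds the user's limit, and further load is shed to honour a DR curtailment request. All class names, fields and values are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    power_kw: float
    essential: bool          # essential loads are never curtailed

def manage(loads, price_eur_kwh, price_limit, dr_curtail_kw):
    """Decide which loads stay on for the next period.

    Curtails non-essential loads when the real-time price exceeds the
    user-defined limit, then sheds further load to honour a DR event.
    """
    active = [l for l in loads if l.essential or price_eur_kwh <= price_limit]
    shed = 0.0
    for l in sorted(active, key=lambda l: l.power_kw, reverse=True):
        if shed >= dr_curtail_kw or l.essential:
            continue
        active.remove(l)
        shed += l.power_kw
    return active

loads = [Load("HVAC", 2.0, False), Load("fridge", 0.2, True),
         Load("EV charger", 3.5, False)]
# price below the limit, but a 3 kW DR curtailment is requested
print([l.name for l in manage(loads, price_eur_kwh=0.20,
                              price_limit=0.25, dr_curtail_kw=3.0)])
```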
Abstract:
In this paper we describe a case study of an experiment on how reflexivity and technology can enhance learning, using ePortfolios as a training environment to develop translation skills. Translation is today a multi-skilled job, and translators need to assure their clients of good performance and quality, both in the language and technology domains. To accomplish this, all the tasks and processes the translator carries out are crucial, with pre-translation and post-translation processes being just as important as the translation itself, namely as far as autonomy, reflexive and critical skills are concerned. Finally, the need for and relevance of collaborative tasks and networks among virtual translation communities led us to the decision to implement ePortfolios as a tool to develop the requested skills and to extend the use of the Internet in translation, namely in terminology management phases, for the completion of each task: helping students manage project deadlines, improving their knowledge of the construction and management of translation resources, and deepening their awareness of the concepts related to the development and usability of ePortfolios.
Abstract:
This paper focuses on some aspects of translation based on blending distinct linguistic domains, such as English and Portuguese, in the use of false friends in English classes with tertiary-level students, reflecting namely on: 1. the choice of a word suitable to the context in the L2; 2. the difficulties raised by the choice of a word that could be misleading, by relying on a false L1 reality that will distort reality in the L2 domain; 3. the difficulty in making such distinctions due to the lack of linguistic and lexical knowledge; 4. the need to study the cause of these difficulties by working not only with their peers but also with their language teacher, to develop strategies to diminish and, if possible, eradicate this type of linguistic and, above all, translation problem by making an inventory of those types of mistakes. In relation to the first point, it is necessary to know that translation tasks involve much more than literal concepts (Ladmiral, 1975); furthermore, it is necessary and suitable to realise that the lexicon relies on significant contexts (Coseriu, 1966), which connect both domains that, at first sight, do not seem compatible. In other words, although students have the impression that they master the lexicon, because they have had at least seven years of foreign-language exposure, that does not mean they master the particularities engaged in a task as delicate as translation. There are some chromaticisms in the words (false friends) that need to be researched and analysed later on by both students and language teachers. The reason for this state of affairs lies in the students' academic background, of a mainly general stream, which has prepared them only for knowledge of the foreign language, not for translation as a tool, which is required only when they reach the tertiary level. Besides, for their translations they rely, most of the time, on glossaries whose dominant language is Brazilian Portuguese, which is obviously quite different from the students' European Portuguese mother tongue, and even more so from English. So the working tools (glossaries), which serve as aids but can bring translation problems, as we will see, must be used with caution.
Abstract:
Master's degree in Electrical and Computer Engineering
Abstract:
Introduction: Although relative uptake values are not the main objective of a 99mTc-DMSA scan, they are important quantitative information. In most dynamic renal scintigraphies, attenuation correction is essential to obtain a reliable result from the quantification process. In DMSA scans, however, the absence of significant background and the lower attenuation in pediatric patients mean that these attenuation correction techniques are usually not applied. The geometric mean is the most common method, but it requires the acquisition of an (extra) anterior projection, which is not acquired by a large number of NM departments. This method and the attenuation factors proposed by Tonnesen were correlated with the absence of attenuation correction procedures. Material and Methods: Images from 20 individuals (aged 3 years +/- 2) were used, and the two attenuation correction methods were applied. The mean acquisition time (time post DMSA administration) was 3.5 hours +/- 0.8 h. Results: The absence of attenuation correction showed a good correlation with both attenuation correction methods (r = 0.73 +/- 0.11), and the mean difference in the uptake values between the methods was 4 +/- 3. The correlation was higher for younger patients. The two attenuation correction methods correlated more strongly with each other than with the "no attenuation correction" method (r = 0.82 +/- 0.8), and the mean differences in the uptake values were 2 +/- 2. Conclusion: The decision not to apply any attenuation correction method can be justified by the minor differences verified in the relative kidney uptake values. Nevertheless, if an accurate value of the relative kidney uptake is required, then an attenuation correction method should be used. The attenuation correction factors proposed by Tonnesen can be easily implemented and thus become a practical alternative, namely when the anterior projection needed for the geometric mean methodology is not acquired.
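To make the two quantification approaches concrete, here is a minimal sketch, not from the paper: relative uptake from the geometric mean of posterior/anterior counts versus posterior-only counts corrected with Tonnesen-type depth factors. The depth coefficients and the linear attenuation coefficient are the commonly cited adult values and are assumptions here; all counts are illustrative.

```python
import math

MU = 0.153  # cm^-1, ~140 keV in soft tissue (assumed, scatter-free value)

def kidney_depth_tonnesen(weight_kg, height_cm):
    """Commonly cited Tonnesen estimates of kidney depth in cm (assumed)."""
    ratio = weight_kg / height_cm
    return 13.2 * ratio + 0.7, 13.3 * ratio + 0.7   # (left, right)

def relative_uptake_geometric_mean(post, ant):
    """Left-kidney relative uptake from the geometric mean of counts."""
    gm = {k: math.sqrt(post[k] * ant[k]) for k in ("left", "right")}
    return gm["left"] / (gm["left"] + gm["right"])

def relative_uptake_depth_corrected(post, depth_left, depth_right):
    """Posterior-only counts corrected with exp(mu * depth)."""
    left = post["left"] * math.exp(MU * depth_left)
    right = post["right"] * math.exp(MU * depth_right)
    return left / (left + right)

post = {"left": 52000, "right": 48000}   # illustrative ROI counts
ant = {"left": 40000, "right": 43000}
dl, dr = kidney_depth_tonnesen(weight_kg=15, height_cm=95)
print(relative_uptake_geometric_mean(post, ant))
print(relative_uptake_depth_corrected(post, dl, dr))
```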
Abstract:
Applications with soft real-time requirements can benefit from code mobility mechanisms, as long as those mechanisms support the timing and Quality of Service requirements of applications. In this paper, a generic model for code mobility mechanisms is presented. The proposed model gives system designers the necessary tools to perform a statistical timing analysis of the execution of the mobility mechanisms, which can be used to determine the impact of code mobility on distributed real-time applications.
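A minimal sketch of what such a statistical timing analysis could look like (the stage distributions and the deadline below are assumptions, not the paper's model): sample the end-to-end time of a hypothetical code-mobility operation and estimate the probability of meeting a soft deadline.

```python
import random

random.seed(1)

def mobility_time():
    """End-to-end time (ms) of one hypothetical mobility operation."""
    serialize = random.gauss(2.0, 0.3)      # ms, assumed stage distributions
    transfer = random.expovariate(1 / 5.0)  # ms, mean 5 ms network transfer
    load_and_link = random.gauss(1.5, 0.2)  # ms, loading the migrated code
    return serialize + transfer + load_and_link

samples = sorted(mobility_time() for _ in range(10_000))
deadline_ms = 15.0
p_meet = sum(t <= deadline_ms for t in samples) / len(samples)
p95 = samples[int(0.95 * len(samples))]
print(f"P(meet {deadline_ms} ms deadline) = {p_meet:.3f}, "
      f"95th percentile = {p95:.1f} ms")
```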
Abstract:
This project was developed within the ART-WiSe framework of the IPP-HURRAY group (http://www.hurray.isep.ipp.pt), at the Polytechnic Institute of Porto (http://www.ipp.pt). The ART-WiSe (Architecture for Real-Time communications in Wireless Sensor networks) framework (http://www.hurray.isep.ipp.pt/art-wise) aims at providing new communication architectures and mechanisms to improve the timing performance of Wireless Sensor Networks (WSNs). The architecture is based on a two-tiered protocol structure relying on existing standard communication protocols, namely IEEE 802.15.4 (Physical and Data Link Layers) and ZigBee (Network and Application Layers) for Tier 1, and IEEE 802.11 for Tier 2, which serves as a high-speed backbone for Tier 1 without energy consumption restrictions. In this context, an application test-bed is being developed with the objectives of implementing, assessing and validating the ART-WiSe architecture. This is particularly relevant for the ZigBee protocol: even though there is a strong commercial lobby from the ZigBee Alliance (http://www.zigbee.org), at the moment there is neither an open-source implementation available to the community nor published work on its adequacy for larger-scale WSN applications. This project aims at filling these gaps by providing: a deep analysis of the ZigBee Specification, mainly addressing the Network Layer and particularly its routing mechanisms; an identification of the ambiguities and open issues in the ZigBee protocol standard; proposals of solutions to those problems; an implementation of a subset of the ZigBee Network Layer, namely the association procedure and the tree routing, on our technological platform (MICAz motes, TinyOS operating system and nesC programming language); and an experimental evaluation of that routing mechanism for WSNs.
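For reference, the tree-routing mechanism the project implements can be sketched from the Cskip distributed address assignment in the ZigBee Specification. The sketch below (Python, not the project's nesC code) computes the per-depth address block size and the next-hop rule; the network parameters are illustrative, and end-device children and error handling are omitted.

```python
def cskip(d, cm, rm, lm):
    """Address sub-block size for each router child at depth d (ZigBee spec).

    cm = max children per router, rm = max router children, lm = max depth.
    """
    if rm == 1:
        return 1 + cm * (lm - d - 1)
    return (1 + cm - rm - cm * rm ** (lm - d - 1)) // (1 - rm)

def next_hop(a, d, dest, cm, rm, lm):
    """Tree-routing rule at a router with address a and depth d.

    Routes up to the parent when dest is not in this router's address
    block; otherwise routes down to the router child whose sub-block
    contains dest.
    """
    if d > 0 and not (a < dest < a + cskip(d - 1, cm, rm, lm)):
        return "parent"
    skip = cskip(d, cm, rm, lm)
    return a + 1 + ((dest - (a + 1)) // skip) * skip

cm, rm, lm = 6, 4, 3                    # illustrative network parameters
print(cskip(0, cm, rm, lm))             # 31: coordinator's per-router block
print(next_hop(0, 0, 40, cm, rm, lm))   # coordinator forwards to router 32
print(next_hop(32, 1, 40, cm, rm, lm))  # router 32 forwards to router 40
```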
Abstract:
The most common techniques for stress analysis/strength prediction of adhesive joints involve analytical or numerical methods such as the Finite Element Method (FEM). However, the Boundary Element Method (BEM) is an alternative numerical technique that has been successfully applied to the solution of a wide variety of engineering problems. This work evaluates the applicability of the boundary element code BEASY as a design tool to analyze adhesive joints. The linearity of peak shear and peel stresses with the applied displacement is studied and compared between BEASY and the analytical model of Frostig et al., considering a bonded single-lap joint under tensile loading. The BEM results are also compared with FEM results in terms of stress distributions. To evaluate the mesh convergence of BEASY, the influence of mesh refinement on the peak shear and peel stress distributions is assessed. Joint stress predictions are carried out numerically in BEASY and ABAQUS®, and analytically with the models of Volkersen, Goland and Reissner, and Frostig et al. The failure loads for each model are compared with experimental results. The preparation, processing, and mesh creation times are also compared for all models. The BEASY results show good agreement with the conventional methods.
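Of the analytical models mentioned, Volkersen's shear-lag solution has a compact closed form for the balanced case (equal adherends). The sketch below evaluates it with illustrative aluminium/epoxy values, not the paper's geometry, to show the shear-stress concentration at the overlap ends that the peak stresses above refer to.

```python
import math

def volkersen_shear(x, P, b, l, E, t, Ga, ta):
    """Adhesive shear stress at position x (x = 0 is mid-overlap) for a
    balanced single-lap joint: tau(x) = (P*lam/(2b)) cosh(lam x)/sinh(lam l/2),
    with lam^2 = 2*Ga/(E*t*ta)."""
    lam = math.sqrt(2 * Ga / (E * t * ta))
    return (P * lam / (2 * b)) * math.cosh(lam * x) / math.sinh(lam * l / 2)

# illustrative values in SI units: aluminium adherends, epoxy adhesive
P, b, l = 5e3, 25e-3, 25e-3    # load [N], joint width [m], overlap length [m]
E, t = 70e9, 2e-3              # adherend Young's modulus [Pa], thickness [m]
Ga, ta = 1.1e9, 0.2e-3         # adhesive shear modulus [Pa], thickness [m]

tau_avg = P / (b * l)
tau_max = volkersen_shear(l / 2, P, b, l, E, t, Ga, ta)
print(f"average = {tau_avg/1e6:.1f} MPa, "
      f"peak at overlap end = {tau_max/1e6:.1f} MPa")
```

With these numbers the peak shear stress at the overlap ends is roughly 3.5 times the average, which is why peak (rather than average) stresses drive the joint strength comparisons above.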
Abstract:
Environmental nanoremediation of various contaminants has been reported in several recent studies. In this paper, the state of the art on the use of nanoparticles in soil and groundwater remediation processes is presented. There is a substantive body of evidence on the growing and successful application of nanoremediation in a diversity of soil and groundwater contamination contexts, particularly for heavy metals, other inorganic contaminants, organic contaminants and emerging contaminants such as pharmaceutical and personal care products. This review confirms the effectiveness of nanoparticles in the remediation of contaminated media and the prevalent use of iron-based nanoparticles.
Abstract:
The use of buffers to maintain the pH within a desired range is a very common practice in chemical, biochemical and biological studies. Among them, zwitterionic N-substituted aminosulfonic acids, usually known as Good's buffers, although widely used, can complex metals and interact with biological systems. The present work reviews, discusses and updates the metal complexation characteristics of thirty-one commercially available buffers. In addition, their impact on biological systems is also presented. The influence of these buffers on the results obtained in biological, biochemical and environmental studies, with special focus on their interaction with metal ions, is highlighted and critically reviewed. Using chemical speciation simulations based on the current knowledge of the metal–buffer stability constants, a proposal of the most suitable buffer to employ for a given metal ion is presented.
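As a minimal illustration of the speciation reasoning behind such a proposal (not the review's actual simulations), consider a 1:1 metal-buffer complex M + L = ML with stability constant K; with the buffer in large excess over the metal, the bound fraction follows directly. The log K values below are hypothetical, not taken from the review.

```python
def fraction_complexed(log_K, buffer_conc_M):
    """Fraction of total metal present as ML for M + L = ML, assuming the
    buffer is in large excess so [L]free ~ total buffer concentration."""
    K = 10 ** log_K
    return K * buffer_conc_M / (1 + K * buffer_conc_M)

for logk in (2.0, 3.5, 5.0):            # hypothetical stability constants
    f = fraction_complexed(logk, 0.05)  # 50 mM buffer, a typical working level
    print(f"log K = {logk}: {100 * f:.1f}% of the metal is buffer-bound")
```

Even a modest stability constant (log K = 2) leaves most of the metal complexed at typical buffer concentrations, which is why matching the buffer to the metal ion under study matters.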
Abstract:
In health-related research it is common to have multiple outcomes of interest in a single study. These outcomes are often analysed separately, ignoring the correlation between them. One would expect a multivariate approach to be a more efficient alternative to individual analyses of each outcome. Surprisingly, this is not always the case. In this article we discuss different settings of linear models and compare the multivariate and univariate approaches. We show that for linear regression models the estimates of the regression parameters associated with covariates that are shared across the outcomes are the same for the multivariate and univariate models, while for outcome-specific covariates the multivariate model performs better in terms of efficiency.
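A small simulation sketch (illustrative, not the paper's code) of the second claim: for an outcome-specific covariate, a joint GLS fit of the stacked two-outcome system, with the error covariance assumed known here, is more efficient than fitting each outcome separately by OLS.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, rho = 200, 500, 0.8
Sigma = np.array([[1.0, rho], [rho, 1.0]])         # outcome error covariance
Om_inv = np.kron(np.linalg.inv(Sigma), np.eye(n))  # (Sigma kron I)^-1

b_ols, b_gls = [], []
for _ in range(reps):
    x = rng.normal(size=n)        # covariate shared by both outcomes
    z = rng.normal(size=n)        # covariate specific to outcome 1
    e = rng.multivariate_normal([0, 0], Sigma, size=n)
    y1 = 1.0 * x + 0.5 * z + e[:, 0]
    y2 = 1.0 * x + e[:, 1]
    X1 = np.column_stack([x, z])
    X2 = x[:, None]
    # univariate: OLS on outcome 1 alone
    b_ols.append(np.linalg.lstsq(X1, y1, rcond=None)[0][1])
    # multivariate: GLS on the stacked two-equation system
    X = np.block([[X1, np.zeros((n, 1))], [np.zeros((n, 2)), X2]])
    y = np.concatenate([y1, y2])
    b = np.linalg.solve(X.T @ Om_inv @ X, X.T @ Om_inv @ y)
    b_gls.append(b[1])            # coefficient of the specific covariate z

print(f"sd of z-coefficient: OLS {np.std(b_ols):.4f}, GLS {np.std(b_gls):.4f}")
```

With strongly correlated errors (rho = 0.8), the sampling standard deviation of the outcome-specific coefficient is visibly smaller under the joint fit, while (as the abstract states) coefficients of shared covariates coincide between the two approaches.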