860 results for Data management and analyses
Abstract:
Consumer awareness, pesticide and fertilizer contamination, and environmental concerns have resulted in significant demand for organically grown farm produce. Consumption of berries has become popular among health-conscious consumers due to their high levels of valuable antioxidants, such as anthocyanins and other phenolic compounds. The present study evaluated the influence that organic farming (OF) and integrated pest management (IPM) practices exert on the total phenolic content of 22 strawberry samples from four varieties. The postharvest performance of OF and IPM strawberries grown in the same area in the centre of Portugal and harvested at the same maturity stage was compared. Chemical profiles (phenolic compounds) were determined by HPLC-DAD/MS. Total phenolic content was higher for the OF strawberry extracts. This study showed that the main differences in bioactive phytochemicals between organically and IPM-grown strawberries concerned their anthocyanin levels. Organically grown strawberries were significantly higher in antioxidant activity than the IPM strawberries, as measured by DPPH and FRAP assays.
Abstract:
In general, modern networks are analysed by taking several Key Performance Indicators (KPIs) into account, and a proper balance among them is required in order to guarantee a desired Quality of Service (QoS), particularly in cellular wireless heterogeneous networks. A model to integrate a set of KPIs into a single one is presented, based on a Cost Function that combines these KPIs, providing a single evaluation parameter as output for each network node, and reflecting network conditions and the performance of common radio resource management strategies. The proposed model enables the implementation of different network management policies, by weighting KPIs according to users' or operators' perspectives, allowing for a better QoS. Results show that different policies can in fact be established, with a different impact on the network, e.g., with median values varying by a factor of more than two.
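As an illustration of how a Cost Function of this kind can collapse several KPIs into a single per-node evaluation parameter, consider the following minimal Python sketch. The KPI names, weights, and values are hypothetical and only demonstrate the weighting idea, not the paper's actual model:

```python
# Hypothetical sketch: collapse normalized KPIs into one node score via a
# weighted cost function. The weight vector encodes a management policy
# (user- vs. operator-centric); all names and numbers are illustrative.

def node_cost(kpis: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of KPIs normalized to [0, 1]; lower cost is better."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[name] * kpis[name] for name in weights)

# Two policies evaluated over the same node measurements:
kpis = {"blocked_calls": 0.10, "delay": 0.35, "load": 0.70}
operator_policy = {"blocked_calls": 0.2, "delay": 0.2, "load": 0.6}
user_policy = {"blocked_calls": 0.4, "delay": 0.5, "load": 0.1}

print(node_cost(kpis, operator_policy))  # emphasizes network load
print(node_cost(kpis, user_policy))      # emphasizes user-perceived QoS
```

Swapping the weight vector switches the management policy without touching the measurement pipeline, which is the property the abstract exploits.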
Abstract:
Pesticides are among the most widely used chemicals in the world. Because of the widespread use of agricultural chemicals in food production, people are exposed to low levels of pesticide residues through their diets. Scientists do not yet have a full understanding of the health effects of these pesticide residues. This work aims to determine differences in pesticide residue content in Portuguese strawberries grown under different agricultural practices. The Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) sample preparation method was used and shown to perform well for multiclass pesticide extraction from strawberries. Screening for 25 pesticide residues was performed by gas chromatography–tandem mass spectrometry. In quantitative validation, acceptable performance was achieved, with recoveries of 70–120 % and <12 % relative standard deviation for the 25 pesticides. Good linearity was obtained for all target compounds, with highly satisfactory repeatability. The limits of detection were in the range of 0.1–28 μg/kg. The method was applied to strawberry samples from organic and integrated pest management (IPM) practices harvested in 2009–2010. The results showed the presence of fludioxonil, bifenthrin, mepanipyrim, tolylfluanid, cyprodinil, tetraconazole, and malathion in the IPM samples, at levels below the maximum residue levels.
Abstract:
The knowledge-based society we live in has stressed the importance of human capital and brought talent to the top of the most wanted skills, especially for companies that want to succeed in turbulent environments worldwide. In fact, streams and sequences of decisions and resource commitments characterize the day-to-day of multinational companies (MNCs). Such decision-making activities encompass major strategic moves like internationalization and new market entries or diversification and acquisitions. In most companies, these strategic decisions are extensively discussed and debated and are generally framed, formulated, and articulated in specialized language often developed by the best minds in the company. Yet the language used in such deliberations, and in detailing and enacting the implementation strategy, is usually taken for granted and receives little if any explicit attention (Brannen & Doz, 2012) and can still be a “forgotten factor” (Marschan et al., 1997). The literature on language management and international business refers to business managers' lack of awareness of the impact that language can have not only on communication effectiveness but especially on knowledge transfer and knowledge management in business environments. In the context of MNCs, management is, for many different reasons, more complex and demanding than that of a national company, mainly because of diversity factors inherent to internationalization, namely geographical and cultural spaces, i.e., varied mindsets. Moreover, the way an MNC functions and manages language depends on its vision, its values, and its internationalization model, i.e., on the way the MNC adapts to and controls new markets, which can vary essentially from a more ethnocentric to a more pluricentric focus. Regardless of the internationalization model followed by the MNC, communication between different business units is essential to achieve unity in diversity and business sustainability. For business flow and prosperity, inter-subsidiary, intra-company, and company-client (customers, suppliers, governments, municipalities, etc.) communication must work in various directions and at various levels of the organization. If not well managed, this diversity can be a barrier to global coordination and create turbulent environments, even if good technological support is available (Feely et al., 2002: 4). According to Marschan-Piekkari (1999), language can be (i) a barrier, (ii) a facilitator, and (iii) a source of power. Moreover, the lack of preparation for the barriers of linguistic diversity can lead to various costs, including failed negotiations and failed internationalization. On the other hand, communication and language fluency are not just a message-transfer procedure, but above all a knowledge-transfer process, which requires extra-linguistic skills (persuasion, assertiveness, etc.) in order to promote the credibility of both parties. For this reason, MNCs need a common code to communicate and trade information inside and outside the company, which will require one or more strategies in order to overcome possible barriers and organizational distortions.
Abstract:
This paper discusses the results of applied research in the eco-driving domain, based on a huge data set produced by a fleet of Lisbon's public transportation buses over a three-year period. This data set is based on events automatically extracted from the controller area network (CAN) bus and enriched with GPS coordinates, weather conditions, and road information. We apply online analytical processing (OLAP) and knowledge discovery (KD) techniques to deal with the high volume of this data set, to determine the major factors that influence average fuel consumption, and then to classify the drivers involved according to their driving efficiency. Consequently, we identify the most appropriate driving practices and styles. Our findings show that introducing simple practices, such as optimal clutch use, engine rotation, and engine idling, can reduce fuel consumption on average by 3 to 5 l/100 km, meaning a saving of 30 l per bus in one day. These findings have been strongly considered in the drivers' training sessions.
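The driver-classification step can be pictured with a toy sketch: group CAN-derived trip records by driver, compute the average fuel consumption, and bin drivers into efficiency classes. All field names, figures, and thresholds below are invented for illustration and do not reproduce the paper's OLAP/KD pipeline:

```python
# Toy sketch of the classification step: per-driver average fuel
# consumption (l/100 km) from trip records, then a simple efficiency label.
from collections import defaultdict

trips = [  # (driver_id, fuel_litres, distance_km) per trip -- invented data
    ("d1", 14.2, 38.0), ("d1", 15.1, 40.5),
    ("d2", 18.9, 39.2), ("d2", 19.4, 41.0),
]

totals = defaultdict(lambda: [0.0, 0.0])
for driver, fuel, dist in trips:
    totals[driver][0] += fuel
    totals[driver][1] += dist

for driver, (fuel, dist) in totals.items():
    l_per_100km = 100.0 * fuel / dist
    label = "efficient" if l_per_100km < 40.0 else "training candidate"
    print(driver, round(l_per_100km, 1), label)
```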
Abstract:
Work presented within the scope of the Master's programme in Computer Engineering, as a partial requirement for obtaining the degree of Master in Computer Engineering.
Abstract:
The development of high-spatial-resolution airborne and spaceborne sensors has improved the capability of ground-based data collection in the fields of agriculture, geography, geology, mineral identification, detection [2, 3], and classification [4–8]. The signal read by the sensor from a given spatial element of resolution and at a given spectral band is a mixture of components originated by the constituent substances, termed endmembers, located at that element of resolution. This chapter addresses hyperspectral unmixing, which is the decomposition of the pixel spectra into a collection of constituent spectra, or spectral signatures, and their corresponding fractional abundances indicating the proportion of each endmember present in the pixel [9, 10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds when the mixing scale is macroscopic [13]. The nonlinear model holds when the mixing scale is microscopic (i.e., intimate mixtures) [14, 15]. The linear model assumes negligible interaction among distinct endmembers [16, 17]. The nonlinear model assumes that incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [18]. Under the linear mixing model, and assuming that the number of endmembers and their spectral signatures are known, hyperspectral unmixing is a linear problem, which can be addressed, for example, under the maximum likelihood setup [19], the constrained least-squares approach [20], spectral signature matching [21], the spectral angle mapper [22], and subspace projection methods [20, 23, 24]. Orthogonal subspace projection [23] reduces the data dimensionality, suppresses undesired spectral signatures, and detects the presence of a spectral signature of interest. The basic concept is to project each pixel onto a subspace that is orthogonal to the undesired signatures. As shown in Settle [19], the orthogonal subspace projection technique is equivalent to the maximum likelihood estimator. This projection technique was extended by three unconstrained least-squares approaches [24] (signature space orthogonal projection, oblique subspace projection, and target signature space orthogonal projection). Other works using the maximum a posteriori probability (MAP) framework [25] and projection pursuit [26, 27] have also been applied to hyperspectral data. In most cases, the number of endmembers and their signatures are not known. Independent component analysis (ICA) is an unsupervised source separation process that has been applied with success to blind source separation, feature extraction, and unsupervised recognition [28, 29]. ICA consists of finding a linear decomposition of observed data that yields statistically independent components. Given that hyperspectral data are, in given circumstances, linear mixtures, ICA comes to mind as a possible tool to unmix this class of data. In fact, the application of ICA to hyperspectral data has been proposed in reference 30, where endmember signatures are treated as sources and the mixing matrix is composed of the abundance fractions, and in references 9, 25, and 31–38, where sources are the abundance fractions of each endmember. In the first approach, we face two problems: (1) the number of samples is limited to the number of channels, and (2) the process of pixel selection, playing the role of mixed sources, is not straightforward.
In the second approach, ICA is based on the assumption of mutually independent sources, which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying dependence among the abundances. This dependence compromises the applicability of ICA to hyperspectral images. In addition, hyperspectral data are immersed in noise, which degrades ICA performance. Independent factor analysis (IFA) [39] was introduced as a method for recovering independent hidden sources from their observed noisy mixtures. IFA implements two steps. First, source densities and noise covariance are estimated from the observed data by maximum likelihood. Second, sources are reconstructed by an optimal nonlinear estimator. Although IFA is a well-suited technique to unmix independent sources under noisy observations, the dependence among abundance fractions in hyperspectral imagery compromises IFA performance, as in the ICA case. Considering the linear mixing model, hyperspectral observations lie in a simplex whose vertices correspond to the endmembers. Several approaches [40–43] have exploited this geometric feature of hyperspectral mixtures [42]. The minimum volume transform (MVT) algorithm [43] determines the simplex of minimum volume containing the data. MVT-type approaches are complex from a computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum-volume simplex to it. Aiming at a lower computational complexity, some algorithms such as vertex component analysis (VCA) [44], the pixel purity index (PPI) [42], and N-FINDR [45] still find the minimum-volume simplex containing the data cloud, but they assume the presence in the data of at least one pure pixel of each endmember. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. Hyperspectral sensors collect spatial images over many narrow contiguous bands, yielding large amounts of data. For this reason, very often the processing of hyperspectral data, including unmixing, is preceded by a dimensionality reduction step to reduce computational complexity and to improve the signal-to-noise ratio (SNR). Principal component analysis (PCA) [46], maximum noise fraction (MNF) [47], and singular value decomposition (SVD) [48] are three well-known projection techniques widely used in remote sensing in general and in unmixing in particular. A newly introduced method [49] exploits the structure of hyperspectral mixtures, namely the fact that spectral vectors are nonnegative. The computational complexity associated with these techniques is an obstacle to real-time implementations. To overcome this problem, band selection [50] and non-statistical [51] algorithms have been introduced. This chapter addresses hyperspectral data source dependence and its impact on ICA and IFA performance. The study considers simulated and real data and is based on mutual information minimization. Hyperspectral observations are described by a generative model that takes into account the degradation mechanisms normally found in hyperspectral applications, namely signature variability [52–54], abundance constraints, topography modulation, and system noise; a toy numerical sketch of the resulting linear observation model is given below. The computation of mutual information is based on fitting mixtures of Gaussians (MOG) to the data. The MOG parameters (number of components, means, covariances, and weights) are inferred using a minimum description length (MDL) based algorithm [55].
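To make the linear observation model concrete, the toy sketch below (dimensions, signatures, and noise level all invented) generates one pixel as a nonnegative, sum-to-one combination of endmember signatures plus additive noise. The sum-to-one constraint on the abundances is exactly the dependence that compromises ICA and IFA:

```python
# Minimal sketch of the linear mixing model y = M a + n: a pixel spectrum
# as a convex combination of endmember signatures plus system noise.
import numpy as np

rng = np.random.default_rng(0)
L, p = 50, 3                       # spectral bands, endmembers (invented)
M = rng.uniform(0.0, 1.0, (L, p))  # endmember signatures (one per column)

a = rng.dirichlet(np.ones(p))      # abundances: a >= 0 and sum(a) == 1
n = 0.01 * rng.standard_normal(L)  # additive system noise
y = M @ a + n                      # observed pixel spectrum

print(a, y[:5])                    # abundances are dependent by construction
```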
We study the behavior of the mutual information as a function of the unmixing matrix. The conclusion is that the unmixing matrix minimizing the mutual information might be very far from the true one. Nevertheless, some abundance fractions might be well separated, mainly in the presence of strong signature variability, a large number of endmembers, and high SNR. We end this chapter by sketching a new methodology to blindly unmix hyperspectral data, where abundance fractions are modeled as a mixture of Dirichlet sources. This model enforces the positivity and constant-sum (full additivity) constraints on the sources (a minimal numerical sketch of this prior is given below). The mixing matrix is inferred by an expectation-maximization (EM)-type algorithm. This approach is in the vein of references 39 and 56, replacing independent sources represented by MOG with a mixture of Dirichlet sources. Compared with the geometric-based approaches, the advantage of this model is that there is no need for pure pixels in the observations. The chapter is organized as follows. Section 6.2 presents a spectral radiance model and formulates spectral unmixing as a linear problem accounting for abundance constraints, signature variability, topography modulation, and system noise. Section 6.3 presents a brief summary of the ICA and IFA algorithms. Section 6.4 illustrates the performance of IFA and of some well-known ICA algorithms on experimental data. Section 6.5 studies the limitations of ICA and IFA in unmixing hyperspectral data. Section 6.6 presents results of ICA based on real data. Section 6.7 describes the new blind unmixing scheme and some illustrative examples. Section 6.8 concludes with some remarks.
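A minimal numerical sketch of that abundance prior, assuming a two-component mixture of Dirichlet densities with invented parameters; by construction every draw is positive and sums to one (full additivity):

```python
# Sketch of a mixture-of-Dirichlet abundance prior: pick a component, then
# draw a Dirichlet abundance vector from it. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
weights = np.array([0.6, 0.4])            # mixture weights (invented)
alphas = [np.array([9.0, 2.0, 1.0]),      # Dirichlet parameters per mode
          np.array([1.0, 4.0, 6.0])]

def sample_abundance():
    k = rng.choice(len(weights), p=weights)  # select a mixture component
    return rng.dirichlet(alphas[k])          # positive, sums to one

a = sample_abundance()
print(a, a.sum())                            # sum is 1 up to rounding
```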
Abstract:
Environmental Training in Engineering Education (ENTREE 2001) – Integrated Green Policies: Progress for Progress, pp. 329–339 (Florence, 14–17 November 2001; proceedings published as a book)
Abstract:
The rising usage of distributed energy resources has been creating several problems in power systems operation. Virtual Power Players arise as a solution for the management of such resources. Additionally, approaching the main network as a series of subsystems gives rise to the concepts of smart grid and microgrid. Simulation, particularly based on multi-agent technology, is suitable for modelling all these new and evolving concepts. MASGriP (Multi-Agent Smart Grid simulation Platform) is a system developed to allow in-depth studies of the mentioned concepts. This paper focuses on a laboratory test bed that represents a house managed by a MASGriP player. This player is able to control a real installation, responding to requests sent by the system operators and reacting to observed events depending on the context.
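The reactive behaviour described above can be hinted at with a short sketch; this is not the MASGriP API, and all class, load, and message names are hypothetical:

```python
# Hedged sketch of a house agent that answers an operator's demand-reduction
# request by shedding controllable loads, largest first. Names are invented.

class HouseAgent:
    def __init__(self, loads: dict[str, float]):
        self.loads = loads  # controllable loads and their demand (kW)

    def on_operator_request(self, reduction_kw: float) -> float:
        """Shed loads until the requested reduction is met; return kW shed."""
        shed = 0.0
        for name, kw in sorted(self.loads.items(), key=lambda x: -x[1]):
            if shed >= reduction_kw:
                break
            shed += kw
            self.loads[name] = 0.0  # switch the load off
        return shed

agent = HouseAgent({"hvac": 2.0, "water_heater": 1.5, "lights": 0.3})
print(agent.on_operator_request(1.8))  # sheds "hvac" first -> 2.0 kW
```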
Abstract:
The recent changes concerning consumers' active participation in the efficient management of load devices, in their own interest and in the interest of the network operator, namely in the context of demand response, lead to the need for improved algorithms and tools. A continuous consumption optimization algorithm has been improved in order to better manage shifted demand. It has been implemented in a simulation and user-interaction tool capable of being integrated into a multi-agent smart grid simulator already developed, and also capable of integrating several optimization algorithms to manage real and simulated loads. The case study in this paper shows the advantages of the proposed algorithm and the benefits of using the developed simulation and user-interaction tool.
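As a hint of what managing shifted demand means in practice, the greedy sketch below moves a flexible load to the cheapest period; it illustrates the idea only and is not the paper's continuous optimization algorithm. Prices and loads are invented:

```python
# Greedy illustration of load shifting: place the flexible demand in the
# cheapest period and compute the resulting energy cost. Data are invented.

prices = [0.20, 0.18, 0.09, 0.11]   # EUR/kWh per period
base = [1.2, 1.0, 0.6, 0.7]         # inflexible demand per period (kWh)
flexible = 2.0                      # shiftable demand (kWh)

target = min(range(len(prices)), key=lambda t: prices[t])
schedule = base[:]
schedule[target] += flexible        # shift demand into the cheapest slot

cost = sum(p * e for p, e in zip(prices, schedule))
print(target, schedule, round(cost, 3))
```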
Abstract:
To assure enduring success, firms need to generate economic value with respect for the environment, as well as social value. They also need to be aware of the needs and expectations of relevant stakeholders and incorporate them into their business strategies and programs. These challenges imply that engineers should take societal, health and safety, environmental, and commercial issues into consideration in their professional activity. This investigation assesses the influence of firms' environmental management programs and community involvement programs on their own employees and on the community, with a focus on small and medium companies. Based on quantitative research, the findings suggest that firms that invest both in environmental management programs and in community involvement programs have a higher involvement of their own employees with the community, while at the same time receiving more feedback (positive, but also negative) from the community, stressing the need to pay special attention to their communication policies.
Abstract:
Fasciolosis is a disease of importance for both veterinary and public health. For the first time, georeferenced prevalence data of Fasciola hepatica in bovines were collected and mapped for the Brazilian territory, and data availability was discussed. Bovine fasciolosis in Brazil is monitored at the Federal, State, and Municipal levels, and to improve monitoring it is essential to combine the data collected at these three levels into one dataset. Data were collected for 1032 municipalities where livers were condemned by the Federal Inspection Service (MAPA/SIF) because of the presence of F. hepatica. The information was distributed over 11 states: Espírito Santo, Goiás, Minas Gerais, Mato Grosso do Sul, Mato Grosso, Pará, Paraná, Rio de Janeiro, Rio Grande do Sul, Santa Catarina, and São Paulo. The highest prevalence of fasciolosis was observed in the southern states, with disease clusters along the coast of Paraná and Santa Catarina and in Rio Grande do Sul. Temporal variation of the prevalence was also observed. The observed prevalence and the kriged prevalence maps presented in this paper can assist both animal and human health workers in estimating the risk of infection in their state or municipality.
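For readers unfamiliar with how such maps are produced, the following is a bare-bones sketch of ordinary kriging, assuming an exponential variogram; the coordinates, values, and variogram parameters are invented and unrelated to the paper's data:

```python
# Minimal ordinary kriging: solve the kriging system for the weights of the
# observed points and estimate the value at an unsampled location.
import numpy as np

def gamma(h, sill=1.0, rang=1.5):     # exponential variogram model (assumed)
    return sill * (1.0 - np.exp(-h / rang))

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # sample locations
z = np.array([0.30, 0.10, 0.20])                      # observed prevalence
x0 = np.array([0.4, 0.4])                             # prediction point

n = len(pts)
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
A = np.ones((n + 1, n + 1)); A[:n, :n] = gamma(D); A[n, n] = 0.0
b = np.ones(n + 1); b[:n] = gamma(np.linalg.norm(pts - x0, axis=1))

w = np.linalg.solve(A, b)[:n]   # kriging weights (sum to one)
print(float(w @ z))             # kriged estimate at x0
```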
Abstract:
Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
Abstract:
Dissertation submitted to obtain the degree of Master in Computer Engineering.
Abstract:
Requirements Engineering has been acknowledged as an essential discipline for software quality. Poorly defined processes for eliciting, analyzing, specifying, and validating requirements can lead to unclear issues or misunderstandings about business needs and a project's scope. These typically result in customers' dissatisfaction with the product's quality or in increases in the project's budget and duration. Maturity models allow an organization to measure the quality of its processes and improve them according to an evolutionary path based on levels. The Capability Maturity Model Integration (CMMI) addresses the aforementioned Requirements Engineering issues. CMMI defines a set of best practices for process improvement that are divided into several process areas. Requirements Management and Requirements Development are the process areas concerned with Requirements Engineering maturity. Altran Portugal is a consulting company concerned with the quality of its software. In 2012, the Solution Center department developed and successfully applied a set of processes aligned with CMMI-DEV v1.3, which granted them a Level 2 maturity certification. For 2015, they defined an organizational goal of reaching CMMI-DEV maturity level 3. This MSc dissertation is part of this organizational effort. In particular, it is concerned with the process areas that address the activities of Requirements Engineering. Our main goal is to contribute to the development of Altran's internal engineering processes so that they conform to the guidelines of the Requirements Development process area. Throughout this dissertation, we started with an evaluation method based on CMMI and conducted a compliance assessment of Altran's current processes. This allowed us to demonstrate their alignment with the CMMI Requirements Management process area and to highlight the improvements needed to conform to the Requirements Development process area. Based on the study of alternative solutions for the gaps found, we proposed a new Requirements Management and Development process that was later validated using three different approaches. The main contribution of this dissertation is the new process developed for Altran Portugal. However, given that studies on these topics are not abundant in the literature, we also expect to contribute useful evidence to the existing body of knowledge with a survey on CMMI and requirements engineering trends. Most importantly, we hope that the implementation of the proposed process improvements will minimize the risks of mishandled requirements, increasing Altran's performance and taking them one step further toward the desired maturity level.