27 results for Process-dissociation Framework
in the Repositório Científico do Instituto Politécnico de Lisboa - Portugal
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Educational Sciences - Specialisation in Supervision in Education
Abstract:
Processes are a central entity in enterprise collaboration. Collaborative processes need to be executed and coordinated on a distributed computational platform where computers are connected through heterogeneous networks and systems. Life cycle management of such collaborative processes requires a framework able to handle their diversity, based on different computational and communication requirements. This paper proposes a rationale for such a framework, points out key requirements and proposes a strategy for a supporting technological infrastructure. Beyond the portability of collaborative process definitions among different technological bindings, a framework to handle the different life cycle phases of those definitions is presented and discussed. (c) 2007 Elsevier Ltd. All rights reserved.
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications. This has motivated many initiatives to develop scientific workflow tools. However, the existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the workflow task specification, decentralizing the control of workflow activities, and allowing their tasks to run autonomously on distributed infrastructures, for instance on Clouds. Furthermore, many workflow tools only support the execution of Directed Acyclic Graphs (DAGs), without the concept of iterations, in which activities are executed for millions of iterations over long periods of time, and without supporting dynamic workflow reconfiguration after a given iteration. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on the Process Networks model, where the workflow activities (AWA) are autonomic processes with independent control that can run in parallel on distributed infrastructures, e.g. on Clouds. Each AWA executes a Task developed as a Java class that implements a generic interface, allowing end-users to code their applications without concern for low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space that also enables support for dynamic workflow reconfiguration and monitoring of workflow executions. We describe how AWARD supports dynamic reconfiguration and discuss typical workflow reconfiguration scenarios. For evaluation we describe experimental results of AWARD workflow executions in several application scenarios, mapped to a small dedicated cluster and to the Amazon Elastic Compute Cloud (EC2).
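Since the abstract states that each AWA executes a Task written as a Java class implementing a generic interface, the following minimal sketch shows what such an interface and a user-defined task could look like. The names AwardTask, execute and WordCountTask are illustrative assumptions, not the actual AWARD API.

    import java.util.Arrays;
    import java.util.List;

    // Hypothetical generic task interface: the engine supplies input tokens and
    // collects the output tokens, hiding coordination and tuple-space details.
    interface AwardTask {
        List<Object> execute(List<Object> inputs) throws Exception;
    }

    // Example user task: counts the words in each input string (illustrative only).
    class WordCountTask implements AwardTask {
        @Override
        public List<Object> execute(List<Object> inputs) {
            int total = 0;
            for (Object in : inputs) {
                total += ((String) in).trim().split("\\s+").length;
            }
            return Arrays.asList((Object) total);
        }
    }

In this reading, the enactment engine instantiates the class, feeds it tokens taken from the tuple space and writes its results back, so the end-user only writes the body of execute.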
Abstract:
Workflows have been successfully applied to express the decomposition of complex scientific applications. However, the existing tools still lack adequate support for important aspects, namely decoupling the enactment engine from the task specification, decentralizing the control of workflow activities so that their tasks can run on distributed infrastructures, and supporting dynamic workflow reconfigurations. We present the AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic) model of computation, based on Process Networks, where the workflow activities (AWA) are autonomic processes with independent control that can run in parallel on distributed infrastructures. Each AWA executes a task developed as a Java class with a generic interface, allowing end-users to code their applications without low-level details. The data-driven coordination of AWA interactions is based on a shared tuple space that also enables dynamic workflow reconfiguration. For evaluation we describe experimental results of AWARD workflow executions in several application scenarios, mapped to the Amazon Elastic Compute Cloud (EC2).
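To make the tuple-space coordination idea more concrete, here is a minimal in-memory sketch of data-driven interaction between two activities: one writes a tuple and the other blocks until a matching tuple is available. It is a generic illustration of the concept, not the shared tuple space actually used by AWARD.

    import java.util.Map;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.LinkedBlockingQueue;

    // Toy tuple space: tuples are (key, value) pairs; take() blocks until some
    // activity has written a tuple under the requested key.
    class ToyTupleSpace {
        private final Map<String, BlockingQueue<Object>> space = new ConcurrentHashMap<>();

        public void write(String key, Object value) {
            space.computeIfAbsent(key, k -> new LinkedBlockingQueue<>()).add(value);
        }

        public Object take(String key) throws InterruptedException {
            return space.computeIfAbsent(key, k -> new LinkedBlockingQueue<>()).take();
        }
    }

    // Usage: a producer activity publishes a result and a consumer activity waits for it.
    class TupleSpaceDemo {
        public static void main(String[] args) throws InterruptedException {
            ToyTupleSpace ts = new ToyTupleSpace();
            new Thread(() -> ts.write("activityA.out", 42)).start();
            System.out.println("Consumer received: " + ts.take("activityA.out"));
        }
    }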
Abstract:
An education promoting scientific literacy (SL) that prepares citizens for responsible citizenship has persisted as an argument across discussions on curriculum design. The ubiquity of science and technology in contemporary societies and the ideological requirement of informed democratic participation have led to the identification of relevant categories that drive curriculum reforms towards a humanistic approach to school science. The category ‘Science as culture’ acquires major importance in the current work: it enlightens the meaning of scientific literacy. A closer look at the French term, culture scientifique et technologique, turns science simultaneously into a cultural object and product that can be received and worked on at different levels and through several approaches by individuals and communities. On the other hand, nonformal and informal education spaces gain greater importance. Together with the formal school environment, these spaces allow for an enrichment and diversification of learning experiences. Examples of nonformal spaces where animators can develop their work include science museums and botanical gardens; television and the internet can be regarded as informal education spaces. Since, as mentioned above, individual or community-based experiences cannot be set apart from Science and Technology (S&T), work in nonformal and informal spaces poses an additional challenge to the preparation of socio-cultural animators. Socio-scientific issues at times become highly relevant within communities. Pollution, high-tension power lines, the spread of diseases, food contamination and the conservation of natural resources are among the socio-scientific issues that often call upon arguments and emotions. In the context of qualifying programmes on socio-cultural animation (social education and community development) within the European Higher Education Area (EHEA), the present study describes the Portuguese framework. The comparison of programmes within Portugal aims to contribute to the discussion on curriculum design for a socio-cultural animator degree (1st cycle of the Bologna process). In particular, this study intends to assess how far the training provided enables animators to work, in multiple scenarios, with communities in situations of socio-scientific relevance. A set of themes, issues and both current and potential fields of action, not described or insufficiently described in the literature, is identified and analysed from the perspective of a qualified intervention by animators. One of these examples is thoroughly discussed. Finally, suggestions are made about curriculum reforms in order, if possible, to strongly link the desired qualified intervention with a qualifying education.
Abstract:
In the early 1990s, companies began to feel the need to improve access to information about their activities in order to support decision making. As a result, the Business Intelligence (BI) sector emerged in the computing world, initially composed of data warehousing and reporting tools. Over the years the concept of BI has evolved in line with business needs, making the analysis of organisations' activities and performance a critical aspect of their management. The BI area covers several sectors, with reporting and data analysis being the ones that best fulfil the intended requirements for controlling access to business information and the respective processes. Nowadays time and information are competitive advantages, and for that very reason companies are increasingly concerned that the growing volume of information is becoming unsustainable, as the time needed to process it keeps increasing. For this reason many software companies, such as Microsoft, IBM and Oracle, are fighting for a place in this expanding BI market. For companies to be competitive, the main requirement is the ability to predict and respond to market needs in real time, rather than merely reacting to a need when it is already too late. BI products have a reputation for working only with stored historical data, which means companies cannot rely on those solutions when some businesses require near real time. The latency introduced by a data warehouse is too high for the performance to be acceptable. This is where Business Activity Monitoring (BAM) technology comes in, providing data analysis and alerts in near real time about business processes, using data sources such as Web Services, message queues, etc. The BAM concept was introduced in July 2001 by Gartner as an event-oriented extension of the BI area. BAM is defined as real-time access to business performance indicators with the aim of increasing the speed and effectiveness of business processes. BAM solutions are becoming increasingly common and sophisticated.
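As a rough illustration of the BAM idea described above (near real-time indicators computed over incoming business events, with alerts), the sketch below keeps a running KPI and raises an alert when a threshold is crossed. The event source, class names and threshold are assumptions made for illustration and do not correspond to any specific BAM product.

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    // Minimal BAM-style monitor: consumes order events from a queue (standing in
    // for a message queue or Web Service feed) and alerts when the running total
    // of pending orders exceeds a threshold.
    class BamMonitorDemo {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Integer> orderEvents = new LinkedBlockingQueue<>();
            int pendingThreshold = 100;   // illustrative KPI threshold
            int pending = 0;

            // Simulated event producer (in practice a queue consumer or service callback).
            for (int amount : new int[] {40, 35, 50}) {
                orderEvents.put(amount);
            }

            while (!orderEvents.isEmpty()) {
                pending += orderEvents.take();
                if (pending > pendingThreshold) {
                    System.out.println("ALERT: pending-orders KPI exceeded: " + pending);
                }
            }
        }
    }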
Abstract:
Nowadays, the Portuguese insurance industry operates in a market with a much more aggressive structure than a few decades ago. Markets and the economy have become globalised since the last decade of the 20th century. Market forces have gradually shifted: power is now mainly on the demand side. In order to meet the new requirements, the insurance industry must develop a strong strategic ability to respond to constant changes in the new international economic order. One of the basic aspects of this strategic development will focus on the ability to predict the future. We introduce the subject by briefly describing the sector, its organisational structure in the Portuguese market, and the challenges arising from the development of the European Union. We then analyse the economic and financial structure of the sector. From this point of view, we aim at the possibility of designing models that could explain the demand for insurance and the evolution of claims and technical reserves. Such models (even if based on the past) would resolve, at least partly, one of the greatest difficulties experienced by insurance companies when estimating the budget. Thus, we examine the existence of variables that explain the previous points and are capable of forming a basis for designing models that are simple but efficient, and can be used for strategic planning.
Abstract:
This paper describes a multi-agent based simulation (MABS) framework to construct an artificial electric power market populated with learning agents. The artificial market, named TEMMAS (The Electricity Market Multi-Agent Simulator), explores the integration of two design constructs: (i) the specification of the environmental physical market properties and (ii) the specification of the decision-making (deliberative) and reactive agents. TEMMAS is materialized in an experimental setup involving distinct power generator companies that operate in the market and search for the trading strategies that best exploit their generating units' resources. The experimental results show a coherent market behavior that emerges from the overall simulated environment.
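The abstract mentions generator-company agents that search for the trading strategies best exploiting their generating units. One simple way such a search could be organised is an epsilon-greedy choice among a fixed set of mark-up bidding strategies, updated with the observed profit; the sketch below illustrates that idea under this assumption and does not reproduce the actual TEMMAS agent design.

    import java.util.Random;

    // Illustrative generator-company agent: picks one of a few mark-up bidding
    // strategies by epsilon-greedy selection and keeps an average-profit estimate per strategy.
    class GenCoAgent {
        private final double[] markups = {0.0, 0.1, 0.2};   // candidate strategies (illustrative)
        private final double[] avgProfit = new double[markups.length];
        private final int[] plays = new int[markups.length];
        private final Random rng = new Random();
        private final double epsilon = 0.1;

        public int chooseStrategy() {
            if (rng.nextDouble() < epsilon) {
                return rng.nextInt(markups.length);           // explore
            }
            int best = 0;
            for (int i = 1; i < markups.length; i++) {
                if (avgProfit[i] > avgProfit[best]) best = i; // exploit current best
            }
            return best;
        }

        public double bidPrice(double marginalCost, int strategy) {
            return marginalCost * (1.0 + markups[strategy]);
        }

        public void observeProfit(int strategy, double profit) {
            plays[strategy]++;
            avgProfit[strategy] += (profit - avgProfit[strategy]) / plays[strategy];
        }
    }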
Abstract:
Fluorescent protein microscopy imaging is nowadays one of the most important tools in biomedical research. However, the resulting images present a low signal-to-noise ratio and a time intensity decay due to the photobleaching effect. This phenomenon is a consequence of the decrease in the radiation emission efficiency of the tagging protein, which occurs because the fluorophore permanently loses its ability to fluoresce due to photochemical reactions induced by the incident light. The Poisson multiplicative noise that corrupts these images, together with the quality degradation caused by photobleaching, makes long-term biological observation very difficult. In this paper a denoising algorithm for Poisson data is described in which the photobleaching effect is explicitly taken into account. The algorithm is designed in a Bayesian framework where the data fidelity term models the Poisson noise generation process as well as the exponential intensity decay caused by photobleaching. The prior term is conceived with Gibbs priors and log-Euclidean potential functions, suitable for coping with the positivity-constrained nature of the parameters to be estimated. Monte Carlo tests with synthetic data are presented to characterize the performance of the algorithm. One example with real data is included to illustrate its application.
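To make the structure of the Bayesian formulation more concrete, a plausible form of the data fidelity term is sketched below, assuming x_i is the underlying intensity at pixel i, y_{i,t} is the count observed at time t, and lambda is the photobleaching decay rate; this notation is chosen here for illustration and is not taken from the paper.

    % Negative log-likelihood of Poisson observations with exponential
    % photobleaching decay, up to constants (illustrative notation).
    E_{\mathrm{data}}(x,\lambda)
      = \sum_{i,t} \Big[ x_i e^{-\lambda t} - y_{i,t}\,\log\!\big(x_i e^{-\lambda t}\big) \Big],
    \qquad
    (\hat{x},\hat{\lambda}) = \arg\min_{x>0,\ \lambda>0}
      \; E_{\mathrm{data}}(x,\lambda) + E_{\mathrm{prior}}(x)

where E_prior stands for the Gibbs prior with log-Euclidean potential functions mentioned in the abstract.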
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be generated more robustly at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: the first mode corresponds to a motion compensated interpolation (MCI) technique, while the second mode corresponds to a motion compensated quality enhancement (MCQE) technique in which a low quality Intra block sent by the encoder is used to generate the SI by performing motion estimation with the help of the reference frames. For blocks where MCI produces SI with lower correlation, the novel MCQE mode can be advantageous overall from the rate-distortion point of view, even if some rate has to be invested in the low quality Intra coded blocks. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
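For reference, the MCI mode mentioned above typically forms each side information block by bidirectional motion compensated interpolation between the past and future reference frames. A common formulation, given here only as an illustration and not necessarily the exact one used in this paper, is:

    % Side information for a block at position p in the interpolated frame, built
    % from the past (X_{t-1}) and future (X_{t+1}) reference frames using the
    % backward and forward motion vectors v_b and v_f (illustrative notation).
    \hat{Y}_{SI}(p) = \tfrac{1}{2}\,\big[\, X_{t-1}(p + v_b) + X_{t+1}(p + v_f) \,\big]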
Abstract:
Currently, there are no open source Business Intelligence (BI) tools that support business management and financial analysis in companies in accordance with the Portuguese accounting standardisation system (SNC). The different characteristics of each business, together with the requirements imposed by the SNC, make it complex to create a generic financial framework that efficiently satisfies the financial analyses required for business management. The goal of this project is to propose an OLAP-based framework capable of supporting accounting management and financial analysis, using exclusively open source software in its implementation, specifically the Pentaho platform. All accounting information, obtained from general accounting, cost accounting, budget management and financial analysis, is stored in a data mart. This data mart will support the whole financial analysis, including the analysis of budget deviations and capital flows, giving companies a BI tool, compatible with the SNC, that helps them in decision making.
Abstract:
This paper is an elaboration of the DECA algorithm [1] to blindly unmix hyperspectral data. The underlying mixing model is linear, meaning that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. The proposed method, like DECA, is tailored to highly mixed data in which geometric-based approaches fail to identify the simplex of minimum volume enclosing the observed spectral vectors. We then resort to a statistical framework, where the abundance fractions are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. With respect to DECA, we introduce two improvements: 1) the number of Dirichlet modes is inferred based on the minimum description length (MDL) principle; 2) the generalized expectation maximization (GEM) algorithm we adopt to infer the model parameters is improved by using alternating minimization and augmented Lagrangian methods to compute the mixing matrix. The effectiveness of the proposed algorithm is illustrated with simulated and real data.
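As a reminder of the model referred to above, the linear mixing assumption and the abundance constraints can be written as follows; the notation is chosen here for illustration.

    % Linear mixing model: each observed spectral vector y is the endmember
    % signature matrix M weighted by the abundance vector a, plus noise n,
    % under non-negativity and sum-to-one constraints (illustrative notation).
    y = M a + n, \qquad a_k \ge 0, \qquad \sum_{k} a_k = 1,
    \qquad
    p(a) = \sum_{q} \epsilon_q \, \mathrm{Dir}(a \mid \theta_q)

where, as in DECA, the abundance vectors follow a finite mixture of Dirichlet densities whose number of modes is selected with the MDL principle.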
Abstract:
HENRE II (Higher Education Network for Radiography in Europe)
Abstract:
This article presents the design and test of a receiver front end aimed at LMDS applications at 28.5 GHz. A system-level design is presented, after which the receiver was designed. The receiver comprises an LNA, a quadrature mixer and a quadrature local oscillator. Experimental results at a 24 GHz center frequency show a conversion voltage gain of 15 dB and a conversion noise figure of 14.5 dB. The receiver operates from a 2.5 V power supply with a total current consumption of 31 mA.
Abstract:
Nowadays, cooperative intelligent transport systems are part of a larger system. Transport consists of modal operations integrated in logistics, and logistics is the main process of supply chain management. Supply chain strategic management, as a simultaneously local and global value chain, is a collaborative/cooperative organization of stakeholders, often in co-opetition, to perform a service to the customers respecting the required time, place, price and quality levels. Transportation, like other logistics operations, must add value, which is achieved in this case through compressing lead times and order fulfilment. The complex supplier network and the distribution channels must be efficient, and the integral visibility (monitoring and tracing) of the supply chain is a significant source of competitive advantage. Nowadays, competition is no longer between companies but among supply chains. This paper aims to highlight the current and emerging manufacturing and logistics system challenges as a new field of opportunities for the automation and control systems research community. Furthermore, the paper forecasts the use of radio frequency identification (RFID) technologies integrated into an information and communication technologies (ICT) framework based on distributed artificial intelligence (DAI) supported by a multi-agent system (MAS), as the greatest value advantage of supply chain management (SCM) in cooperative intelligent logistics systems. Logistical platforms (production or distribution), as nodes of added value in supplying and distribution networks, are proposed as critical points for the visibility of the inventory, where these technological needs are most evident.