924 results for Distributed process model
Abstract:
The spectral response and the photocurrent delivered by entirely microcrystalline p-i-n Si:H detectors are analysed under different applied bias and illumination conditions. The spectral response and the internal collection depend not only on the energy range but also on the illumination side. Under p- and n-side irradiation, the internal collection characteristics have an atypical shape: the collection is high for applied bias voltages below the open-circuit voltage, shows a steep decrease near the open-circuit voltage (more pronounced under n-side illumination), and levels off at higher voltages. Numerical modeling of the VIS/NIR detector, based on the band discontinuities near the grain boundaries and interfaces, complements the study and gives insight into the internal physical processes.
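For reference, the quantities discussed above are conventionally defined as below; this is a minimal sketch of the standard definitions, and the paper's exact normalization may differ.

```latex
% Standard photodetector quantities (a sketch; symbols are assumptions):
% J_ph is the photocurrent density, \Phi the incident photon flux, and
% J_{ph,sat} the saturated photocurrent under strong reverse bias.
\[
  QE(\lambda) = \frac{J_{ph}(\lambda)}{q\,\Phi(\lambda)}, \qquad
  \eta_c(V,\lambda) = \frac{J_{ph}(V,\lambda)}{J_{ph,\mathrm{sat}}(\lambda)}
\]
% QE: external quantum efficiency; \eta_c: internal collection, which
% the abstract describes as high below V_{oc} and dropping steeply near it.
```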
Abstract:
The growth in recent years in both the variety and volume of structured products means that banks and other financial institutions have become increasingly exposed to model risk. In this article we focus on the model risk associated with the local volatility (LV) model and with the Variance Gamma (VG) model. The results show that the LV model performs better than the VG model in terms of its ability to match the market prices of European options. Nevertheless, both models are subject to significant pricing errors when compared with the stochastic volatility framework.
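As an illustration of how such pricing errors are typically quantified, a minimal Python sketch follows; the helper name and all price figures are hypothetical, not taken from the article.

```python
import numpy as np

def pricing_error(market_prices, model_prices):
    """Root-mean-square and mean relative pricing error of a model
    against quoted European option prices (hypothetical helper)."""
    market = np.asarray(market_prices, dtype=float)
    model = np.asarray(model_prices, dtype=float)
    rmse = np.sqrt(np.mean((model - market) ** 2))
    mre = np.mean(np.abs(model - market) / market)
    return rmse, mre

# Example: compare two candidate models against the same quotes.
quotes = [10.2, 7.8, 5.9, 4.3]   # market prices (illustrative)
lv = [10.1, 7.9, 5.8, 4.4]       # local volatility model prices
vg = [10.6, 7.4, 6.3, 4.0]       # Variance Gamma model prices
print(pricing_error(quotes, lv), pricing_error(quotes, vg))
```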
Abstract:
This article describes the main research results of a new methodology in which the stages and strategies of the technology integration process are identified and described, and a set of principles and recommendations is presented. The MIPO model described in this paper results from an effort to understand the main success features of good practices in the web environment, integrated in the information systems/information technology context. The initial model was created based on experience and a literature review. It was then tested in information systems/technology course units at a higher education school and adapted as a result of four cycles of action-research work combined with case study research. The information, concepts and procedures presented here support teachers and instructors, instructional designers and planning teams: anyone who wants to develop effective b-learning instruction.
Abstract:
Value has been defined in different theoretical contexts as need, desire, interest, standards/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business: any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, either inside the enterprise or collaborative network or outside. “Perhaps surprising then is that firms often do not know how to define value, or how to measure it” (Anderson and Narus, 1998, cited by [1]). Woodruff argued that we need a “richer customer value theory” to provide an “important tool for locking onto the critical things that managers need to know”, emphasizing that “we need customer value theory that delves deeply into customer’s world of product use in their situations” [2]. In this sense, we proposed and validated a novel “Conceptual Model for Decomposing the Value for the Customer”. In doing so, we were aware that time has a direct impact on customer perceived value and that the suppliers’ and customers’ perceptions change from the pre-purchase to the post-purchase phase, causing some uncertainty and doubt. We wanted to break down value into all its components, as well as all built and used assets (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model that relates used and built assets in the tangible and intangible deliverables exchanged among the involved parties, with their actual value perceptions.
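To illustrate the Fuzzy AHP step mentioned above, here is a minimal Python sketch using triangular fuzzy numbers and the fuzzy geometric mean; the components, judgments and defuzzification choice are illustrative assumptions, not the article's exact formulation.

```python
import numpy as np

# Triangular fuzzy numbers (l, m, u) express uncertain pairwise
# judgments between value components; weights come from the fuzzy
# geometric mean, defuzzified by the centroid.

def fuzzy_geometric_mean(row):
    ls, ms, us = zip(*row)
    n = len(row)
    return (np.prod(ls) ** (1/n), np.prod(ms) ** (1/n), np.prod(us) ** (1/n))

def fuzzy_ahp_weights(matrix):
    gms = [fuzzy_geometric_mean(row) for row in matrix]
    total = tuple(sum(g[i] for g in gms) for i in range(3))
    # Fuzzy division: divide each mean by the reversed total (l/u, m/m, u/l).
    fuzzy_w = [(g[0]/total[2], g[1]/total[1], g[2]/total[0]) for g in gms]
    crisp = np.array([sum(w) / 3 for w in fuzzy_w])  # centroid defuzzification
    return crisp / crisp.sum()

# Pairwise comparison of three illustrative value components.
one = (1.0, 1.0, 1.0)
matrix = [
    [one,             (2, 3, 4),       (4, 5, 6)],
    [(1/4, 1/3, 1/2), one,             (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1.0), one],
]
print(fuzzy_ahp_weights(matrix))
```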
Abstract:
The large increase of distributed energy resources, including distributed generation, storage systems and demand response, especially in distribution networks, makes the management of the available resources a more complex and crucial process. With wind-based generation gaining relevance in the generation mix, and with wind forecasting accuracy dropping rapidly as the forecast anticipation time grows, short-term and very short-term re-scheduling is required so that the final implemented solution achieves the lowest possible operation costs. This paper proposes a methodology for energy resource scheduling in smart grids, considering day-ahead, hour-ahead and five-minutes-ahead scheduling. The short-term scheduling, undertaken five minutes ahead, takes advantage of the high accuracy of very short-term wind forecasting, providing the user with more efficient scheduling solutions. The proposed method uses a Genetic Algorithm based optimization approach that is able to cope with the hard execution time constraint of short-term scheduling. Realistic power system simulation, based on PSCAD, is used to validate the obtained solutions. The paper includes a case study with a 33-bus distribution network with high penetration of distributed energy resources, implemented in PSCAD.
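A minimal sketch of the genetic-algorithm idea, dispatching a handful of resources to meet a five-minutes-ahead demand forecast at minimum cost; all limits, costs and GA parameters are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(42)

n_res = 6
p_max = rng.uniform(1.0, 5.0, n_res)     # MW limits per resource
cost = rng.uniform(20.0, 90.0, n_res)    # EUR/MWh per resource
demand = 0.6 * p_max.sum()               # forecast demand (MW)

def fitness(pop):
    # Total cost plus a heavy penalty for demand imbalance.
    imbalance = np.abs(pop.sum(axis=1) - demand)
    return pop @ cost + 1e4 * imbalance

pop = rng.uniform(0, p_max, (40, n_res))     # initial random schedules
for _ in range(200):
    f = fitness(pop)
    parents = pop[np.argsort(f)[:20]]        # elitist selection
    idx = rng.integers(0, 20, (40, 2))
    alpha = rng.random((40, n_res))
    children = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]
    children += rng.normal(0, 0.05, children.shape)   # mutation
    pop = np.clip(children, 0, p_max)                 # respect resource limits

best = pop[np.argmin(fitness(pop))]
print(best.round(2), float(best @ cost))
```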
Abstract:
In the energy management of a small power system, the scheduling of the generation units is a crucial problem for which adequate methodologies can maximize the performance of the energy supply. This paper proposes an innovative methodology for distributed energy resources management. The optimal operation of distributed generation, demand response and storage resources is formulated as a mixed-integer linear programming (MILP) model and solved by a deterministic, CPLEX-based optimization technique implemented in the General Algebraic Modeling System (GAMS). The paper presents a vision for the grids of the future, focusing on conceptual and operational aspects of electrical grids characterized by intensive penetration of DG, in the scope of competitive environments, and uses artificial intelligence methodologies to attain the envisaged goals. These concepts are implemented in a computational framework which includes both grid and market simulation.
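The MILP formulation can be illustrated with a toy model; the sketch below uses the open-source pulp solver interface rather than GAMS/CPLEX, and every unit, bound and cost is an illustrative assumption.

```python
import pulp

# Toy DER scheduling MILP: commit two DG units, buy demand response,
# and meet a fixed load at minimum cost.
load = 8.0  # MW, illustrative
prob = pulp.LpProblem("der_scheduling", pulp.LpMinimize)

p1 = pulp.LpVariable("dg1_mw", 0, 5)
p2 = pulp.LpVariable("dg2_mw", 0, 6)
dr = pulp.LpVariable("demand_response_mw", 0, 2)
u1 = pulp.LpVariable("dg1_on", cat="Binary")
u2 = pulp.LpVariable("dg2_on", cat="Binary")

# Linear generation costs plus fixed commitment costs (EUR, illustrative).
prob += 30 * p1 + 45 * p2 + 60 * dr + 10 * u1 + 8 * u2
prob += p1 + p2 + dr == load          # power balance
prob += p1 <= 5 * u1                  # unit 1 produces only if committed
prob += p2 <= 6 * u2                  # unit 2 produces only if committed

prob.solve()
print({v.name: v.value() for v in prob.variables()})
```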
Abstract:
The Bologna Process aimed to build a European Higher Education Area promoting students' mobility. The adoption of the Bologna Declaration directives requires a self-managed, distributed approach to deal with students' mobility, allowing frequent updates to institutions' rules and legislation. This paper suggests a computational system architecture that follows a social network design. A set of structured annotations is proposed in order to organize the users' information; for instance, when the user is a student, the annotations are organized into an academic record. The academic record data is used to discover interests, namely mobility interests, among students that belong to the academic network. These ideas have been applied in a demonstrator that includes a mobility simulator to compare and show students' academic evolution.
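A minimal sketch of the interest-discovery idea over structured annotations; all names, fields and records are hypothetical.

```python
from itertools import combinations

# Hypothetical annotation store: per-student structured annotations,
# including a set of institutions of interest for mobility.
annotations = {
    "ana":   {"mobility": {"FEUP", "KTH"}, "area": "informatics"},
    "joao":  {"mobility": {"KTH", "TUD"},  "area": "informatics"},
    "marta": {"mobility": {"ULisboa"},     "area": "economics"},
}

def shared_mobility(records):
    """Yield student pairs whose annotated mobility interests overlap."""
    for (a, ra), (b, rb) in combinations(records.items(), 2):
        common = ra["mobility"] & rb["mobility"]
        if common:
            yield a, b, common

print(list(shared_mobility(annotations)))  # [('ana', 'joao', {'KTH'})]
```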
Abstract:
In distributed video coding, motion estimation is typically performed at the decoder to generate the side information, increasing the decoder complexity while providing low-complexity encoding in comparison with predictive video coding. Motion estimation can be performed once to create the side information, or several times to refine the side information quality along the decoding process. In this paper, motion estimation is performed at the decoder side to generate multiple side information hypotheses, which are adaptively and dynamically combined whenever additional decoded information is available. The proposed iterative side information creation algorithm is inspired by video denoising filters and requires some statistics of the virtual channel between each side information hypothesis and the original data. With the proposed denoising algorithm for side information creation, an RD performance gain of up to 1.2 dB is obtained for the same bitrate.
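To illustrate the hypothesis-combination idea, a minimal sketch follows; the inverse-variance weighting is an assumption standing in for the paper's adaptive fusion rule, and the arrays are toy data.

```python
import numpy as np

def combine_side_information(hypotheses, noise_var):
    """Blend side-information hypotheses per pixel, weighting each
    inversely to its estimated virtual-channel noise variance.

    hypotheses: list of HxW frames; noise_var: per-hypothesis variance.
    """
    w = 1.0 / np.asarray(noise_var, dtype=float)
    w /= w.sum()
    return sum(wi * h for wi, h in zip(w, np.asarray(hypotheses, dtype=float)))

h1 = np.full((2, 2), 100.0)   # e.g., motion-compensated interpolation
h2 = np.full((2, 2), 110.0)   # e.g., motion-compensated extrapolation
# The less noisy hypothesis (variance 4 vs 16) dominates the blend.
print(combine_side_information([h1, h2], noise_var=[4.0, 16.0]))
```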
Abstract:
The Bologna Process aimed to build a European Higher Education Area with the objective of promoting students' mobility. The adoption of the Bologna Declaration directives requires a decentralized approach that accelerates students' mobility, based on frequently updated legislation. This paper proposes a student personal system to manage the student's academic information. This system is supported by a flexible model that integrates, for instance, knowledge about the courses the student has attended or about a course to which the student wishes to apply. Essentially, this model holds (i) a Student's Academic Record, with skills acquired in academic course units, professional experience or training, and (ii) an Individual Studies Plan, which places the student in a particular (iii) Course Plan, setting the curricular structure to which the student wishes to apply.
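A minimal sketch of the three-part model, (i) to (iii) above, as data structures; all field names and the missing_units helper are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class AcademicRecord:                # (i) skills acquired so far
    skills: list[str] = field(default_factory=list)
    experience: list[str] = field(default_factory=list)

@dataclass
class CoursePlan:                    # (iii) target curricular structure
    course: str
    curricular_units: list[str] = field(default_factory=list)

@dataclass
class IndividualStudiesPlan:         # (ii) places the student in a plan
    record: AcademicRecord
    target: CoursePlan

    def missing_units(self) -> list[str]:
        """Units of the target plan not yet covered by acquired skills."""
        return [u for u in self.target.curricular_units
                if u not in self.record.skills]

plan = IndividualStudiesPlan(
    AcademicRecord(skills=["algorithms", "databases"]),
    CoursePlan("MSc Informatics", ["algorithms", "networks"]),
)
print(plan.missing_units())  # ['networks']
```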
Abstract:
In Portugal, higher education institutions have e-learning platforms that represent an asset for the teaching-learning process. However, these platforms are private in scope, exposing the institutions' timid openness towards sharing their knowledge as well as their resources. The Cloud Computing paradigm emerges as a solution, for example for the creation of a federation of clouds capable of accommodating heterogeneous solutions, guaranteeing interoperability between the platforms of the various education institutions, and promoting the objectives proposed by the Bologna Process, namely regarding the sharing of information, platforms and services and the promotion of joint projects. In this context, it is necessary to develop tools that allow decision-makers to weigh the benefits of this new paradigm. It is thus convenient to quantify the expected return on the investment, in human and technological resources, required by the Cloud Computing model. This work contributes to the study of the evaluation of the return on investment (ROI) in ICT (Information and Communication Technologies) infrastructures and services, based on the analysis of different scenarios for the introduction of the Cloud Computing paradigm. To this end, an analysis methodology was proposed, based on a questionnaire distributed to several Portuguese higher education institutions and containing a set of questions that made it possible to identify indicators, and respective metrics, to be used in building ROI estimation models.
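As a reminder of the quantity being estimated, a minimal ROI sketch; the scenario figures are illustrative assumptions, with the questionnaire-derived indicators standing behind such inputs in the proposed methodology.

```python
def roi(benefit, cost):
    """Classic ROI: net benefit over cost."""
    return (benefit - cost) / cost

years = 3
on_prem = 120_000 + years * 30_000   # capex + yearly operation (illustrative)
cloud = years * 55_000               # subscription + support staff (illustrative)
# Benefit of the cloud scenario: costs avoided relative to on-premises.
print(f"ROI over {years} years: {roi(on_prem, cloud):.0%}")
```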
Abstract:
Chapter in book proceedings with peer review: First Iberian Conference, IbPRIA 2003, Puerto de Andratx, Mallorca, Spain, June 4-6, 2003. Proceedings.
Abstract:
We live in a changing world. At an impressive speed, new technological resources appear every day. We increasingly use the Internet to obtain and share information, and new online communication tools keep emerging, each encompassing new potential and creating new audiences. In recent years we have witnessed the emergence of Facebook, Twitter, YouTube and other media platforms. They have provided an even greater interactivity between sender and receiver, as well as generated a new sense of community. At the same time, content is available like never before: we are increasingly sharing texts, videos, photos, etc. This poster explores the potential of using these new online communication tools in the cultural sphere to create new audiences, to develop a new kind of community, and to provide information as well as different ways of building organizations' memory. The transience of the performing arts is accompanied by the need to counter that transience through documentation. This desire to 'save' events finds its expression in the archiving of information from the different production moments, as well as in the opportunity to record the event and present it through, for instance, digital platforms. In this poster we intend to answer the following questions: which online communication tools are being used to engage audiences in the cultural sphere (specifically among theater companies in Lisbon)? Is there a new relationship with the public? Are online communication tools creating a new kind of community? What changes are these tools introducing in the creative process? In what way do the availability of content and its archiving contribute to the organization's memory? Among several references, we draw on the two-way communication model presented by James E. Grunig & Todd T. Hunt (1984) and the concept of mass self-communication of Manuel Castells (2010). Castells also tells us that we have moved from traditional media to a system of communication networks. For Scott Kirsner (2010), we have entered an era of digital creativity, where artists have the tools to do what they imagined and the public no longer wants just to consume cultural goods, but instead to have a voice and participate. The creative process now depends on the public's choices as they wander through the screen; it is the receiver who owns an object that can be exchanged. Virtual reality has encouraged the receiver to abandon the position of passive observer and become a participating agent, which poses a challenge to organizations: inventing new forms of interfaces. We therefore intend to identify new and effective online tools that cultural organizations can use, and the best way to manage them; to show how organizations can create a community with the public; and to show how the availability of online content and its archiving can contribute to organizations' memory.
Abstract:
In this paper we consider a differentiated Stackelberg model in which the leader firm engages in an R&D process that yields an endogenous cost-reducing innovation. The aim is to study the licensing of the cost reduction through a two-part tariff. Using comparative static analysis, we conclude that the degree of differentiation of the goods plays an important role in the results. We also directly compare our model with the Cournot duopoly model.
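A standard specification consistent with this setup, as a sketch (the paper's exact functional forms may differ): differentiated linear inverse demands with substitution degree gamma, leader R&D effort x reducing the common marginal cost c, quadratic R&D cost, and a two-part tariff (F, r) licensing the reduction to the follower.

```latex
% All symbols below are assumptions for illustration:
% q_L, q_F quantities; p_L, p_F prices; \delta an R&D cost parameter;
% F the fixed licensing fee and r the per-unit royalty.
\[
  p_i = a - q_i - \gamma q_j, \qquad 0 \le \gamma \le 1,
\]
\[
  \pi_L = (p_L - c + x)\,q_L - \tfrac{1}{2}\delta x^2 + F + r\,q_F, \qquad
  \pi_F = (p_F - c + x)\,q_F - F - r\,q_F .
\]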
Abstract:
Temporal lobe epilepsy (TLE) in infancy has been the subject of varied research. Topographical and structural evidence coincides with the neuronal systems responsible for the most specialized and complex auditory processing. Recent studies have shown the need for hemispheric asymmetry for the optimization of central auditory processing (CAP) and for the acquisition and learning of a language system. A new functional research paradigm is required to study mental processes through methods that analyse cognitive-sensory information processed in very short periods of time (milliseconds), such as event-related potentials (ERPs). Thus, in this article we hypothesize that TLE in infancy could be a good model for the topographic and functional study of CAP and its development process, contributing to a better understanding of the learning difficulties experienced by children with this neurological disorder.