921 results for Market of resources


Relevance:

90.00%

Publisher:

Abstract:

Power systems are planned and operated according to the optimization of the available resources. Traditionally, these tasks were mostly undertaken in a centralized way, which is no longer adequate in a competitive environment. Demand response can play a very relevant role in this context, but adequate tools to negotiate this kind of resource are required. This paper presents an approach to deal with these issues, using a multi-agent simulator able to model demand-side players and simulate their strategic behavior. The paper includes an illustrative case study that considers an incident situation, in which the distribution company is able to reduce load curtailment thanks to load flexibility contracts previously established with demand-side players.
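The contract-based curtailment reduction described above can be sketched as a simple dispatch rule. The player names, contract sizes and the `dispatch_reduction` helper below are illustrative, not from the paper:

```python
# Minimal sketch (hypothetical data): during an incident the distribution
# company first activates load-flexibility contracts, curtailing firm load
# only for whatever reduction remains.
def dispatch_reduction(required_kw, contracts):
    """contracts: list of (player, flexible_kw); returns (activations, firm curtailment)."""
    used, remaining = [], required_kw
    for player, flexible_kw in contracts:
        if remaining <= 0:
            break
        take = min(flexible_kw, remaining)  # use the contract up to what is needed
        used.append((player, take))
        remaining -= take
    return used, max(remaining, 0.0)  # leftover must be met by firm-load curtailment

contracts = [("A", 40.0), ("B", 25.0), ("C", 15.0)]
used, curtailed = dispatch_reduction(70.0, contracts)
# 70 kW covered entirely by contracts: A 40, B 25, C 5; no firm curtailment
```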

Relevance:

90.00%

Publisher:

Abstract:

Energy resources management can play a very relevant role in future power systems in the SmartGrid context, with high penetration of distributed generation and storage systems. This paper deals with the importance of resources management in an incident situation. The system considered includes a high penetration of distributed generation, demand response, storage units and network reconfiguration. A case study evidences the advantages of using a flexible SCADA to control the energy resources in an incident situation.

Relevance:

90.00%

Publisher:

Abstract:

In recent years there has been a considerable increase in the number of people in need of intensive care, especially among the elderly, a phenomenon that is related to population ageing (Brown 2003). However, this is not exclusive to the elderly, as diseases such as obesity, diabetes and hypertension have been increasing among young adults (Ford and Capewell 2007). As a new fact, it has to be dealt with by the healthcare sector, and particularly by the public one. Thus, finding new and cost-effective ways for healthcare delivery is of particular importance, especially when the patients are not to be detached from their environments (WHO 2004). Following this line of thinking, a VirtualECare Multiagent System is presented in section 2, with our efforts centered on its Group Decision modules (Costa, Neves et al. 2007) (Camarinha-Matos and Afsarmanesh 2001).

On the other hand, there has been a growing interest in combining the technological advances of the information society (computing, telecommunications and knowledge) in order to create new methodologies for problem solving, namely those that converge on Group Decision Support Systems (GDSS) based on agent perception. Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life cycle. However, effective planning depends on the generation and analysis of ideas (innovative or not), and, as a result, the idea generation and management processes are crucial. Our objective is to apply the GDSS referred to above to a new area. We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of one's Electronic Clinical Profile (ECP). This attainment is vital, considering the arrival to the market of new drugs and medical practices, which compete for the use of limited resources.

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE: To identify the effects of decentralization on health financing and governance policies in Mexico from the perspective of users and providers. METHODS: A cross-sectional study was carried out in four states that were selected according to geopolitical and administrative criteria. Four indicators were assessed: changes and effects on governance, financing sources and funds, the final destination of resources, and fund allocation mechanisms. Data collection was performed using in-depth interviews with health system key personnel and community leaders, consensus techniques and document analyses. The interviews were transcribed and analyzed by thematic segmentation. RESULTS: The results show different effectiveness levels for the four states regarding changes in financing policies and community participation. Effects on health financing after decentralization were identified in each state, including: greater participation of municipal and state governments in health expenditure, increased financial participation of households, greater community participation in low-income states, duality and confusion in the new mechanisms for coordination among the three government levels, absence of an accountability system, lack of human resources and technical skills to implement, monitor and evaluate changes in financing. CONCLUSIONS: In general, positive and negative effects of decentralization on health financing and governance were identified. The effects mentioned by health service providers and users were related to a diversification of financing sources, a greater margin for decisions around the use and final destination of financial resources and normative development for the use of resources. At the community level, direct financial contributions were mentioned, as well as in-kind contributions, particularly in the form of community work.

Relevance:

90.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering

Relevance:

90.00%

Publisher:

Abstract:

Nanotechnology is an important emerging industry, with a projected annual market of around one trillion dollars by 2015. It involves the control of atoms and molecules to create new materials with a variety of useful functions. Although there are advantages to the utilization of these nano-scale materials, questions related to their impact on the environment and human health must be addressed too, so that potential risks can be limited at early stages of development. At this time, occupational health risks associated with the manufacturing and use of nanoparticles are not yet clearly understood. However, workers may be exposed to nanoparticles through inhalation at levels that can greatly exceed ambient concentrations. Current workplace exposure limits are based on particle mass, but this criterion may not be adequate in this case, as nanoparticles are characterized by a very large surface area, which has been pointed out as the distinctive characteristic that could even turn an inert substance into one exhibiting very different interactions with biological fluids and cells. Therefore, assessing human exposure based on the mass concentration of particles, which is widely adopted for particles over 1 μm, may not work in this particular case. In fact, nanoparticles have far more surface area than the equivalent mass of larger particles, which increases the chance that they may react with body tissues. Thus, it has been claimed that surface area should be used for nanoparticle exposure and dosing. As a result, assessing exposure based on the measurement of particle surface area is of increasing interest. It is well known that lung deposition is the most efficient way for airborne particles to enter the body and cause adverse health effects. If nanoparticles can deposit in the lung and remain there, have an active surface chemistry and interact with the body, then there is potential for exposure.
It has been shown that surface area plays an important role in the toxicity of nanoparticles and that this is the metric that best correlates with particle-induced adverse health effects; the potential for adverse health effects seems to be directly proportional to particle surface area. The objective of this study is to identify and validate methods and tools for measuring nanoparticles during the production, manipulation and use of nanomaterials.
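The mass-versus-surface-area argument above can be made concrete with the geometry of spheres: for density ρ and diameter d, the surface area per unit mass is 6/(ρd), so the same mass in smaller particles carries proportionally more surface. A minimal sketch (the silica density is an illustrative value):

```python
# For a sphere, area/volume = (pi d^2) / (pi d^3 / 6) = 6/d, so the
# specific surface area (area per unit mass) is 6 / (rho * d).
def specific_surface_area(d_m, rho_kg_m3):
    return 6.0 / (rho_kg_m3 * d_m)  # m^2 per kg

rho = 2200.0  # illustrative density, e.g. amorphous silica, kg/m^3
ssa_1um = specific_surface_area(1e-6, rho)    # 1 um particles
ssa_20nm = specific_surface_area(20e-9, rho)  # 20 nm particles
ratio = ssa_20nm / ssa_1um  # 50x more surface area for the same mass
```

This is why a mass-based exposure limit can stay constant while the biologically relevant surface dose grows by orders of magnitude as particle size shrinks.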

Relevance:

90.00%

Publisher:

Abstract:

Dissertation for obtaining the Master's degree in Accounting and Finance. Supervisor: Mestre Adalmiro Álvaro Malheiro de Castro Andrade Pereira

Relevance:

90.00%

Publisher:

Abstract:

Today, all kinds of innovation and research work are done by partnerships of competent entities, each having some specialized skills. With the development of the global economy, global innovation partnerships have grown considerably and form the basis of most sophisticated innovations today. To further streamline and simplify such cooperation, several innovation networks have been formed, at both local and global levels. This paper discusses the different types of innovation and how cooperation can benefit innovation in terms of pooling of resources and sharing of risks. One example of an open global co-innovation network promoted by Tata Consultancy Services, the TCS COIN, is taken as a case. It enables venture capitalists, consultants, research agencies, companies and universities to form nodes of the network, so that each entity can play a meaningful role in the innovation network. Further, two innovation projects implemented using the COIN are discussed. Innovation networks like these could form the basis of a unique global innovation network, not owned by any single company and used by innovation partners globally to collaborate and conduct research and development.

Relevance:

90.00%

Publisher:

Abstract:

Debugging electronic circuits is traditionally done with bench equipment directly connected to the circuit under debug. In the digital domain, the difficulties associated with direct physical access to circuit nodes led to the inclusion of resources supporting that activity, first at the printed circuit board level and then at the integrated circuit level. The experience acquired with those solutions led to the emergence of dedicated infrastructures for debugging cores at the system-on-chip level. However, all these developments had little impact in the analog and mixed-signal domain, where debugging still depends, to a large extent, on direct physical access to circuit nodes. As a consequence, when analog and mixed-signal circuits are integrated as cores inside a system-on-chip, the difficulties associated with debugging increase, which causes the time-to-market and the prototype verification costs to increase as well. The present work considers the IEEE 1149.4 infrastructure as a means to support the debugging of mixed-signal circuits, namely to access the circuit nodes and an embedded debug mechanism, named the mixed-signal condition detector, necessary for watchpoint/breakpoint and real-time analysis operations. One of the main advantages of the proposed solution is the seamless migration to the system-on-chip level: as access is done through electronic means, debugging operations at different hierarchical levels are eased.

Relevance:

90.00%

Publisher:

Abstract:

In this paper, a novel hybrid approach is proposed for electricity price forecasting in a competitive market, considering a time horizon of one week. The proposed approach is based on the combination of particle swarm optimization and an adaptive-network-based fuzzy inference system. Results from a case study based on the electricity market of mainland Spain are presented. A thorough comparison is carried out, taking into account the results of previous publications, to demonstrate its effectiveness regarding forecasting accuracy and computation time. Finally, conclusions are duly drawn.
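As a rough illustration of the optimization half of the hybrid, a bare-bones particle swarm loop of the kind that could tune ANFIS membership-function parameters is sketched below. The objective here is a stand-in quadratic, not the paper's forecasting error, and all parameter values are assumptions:

```python
import random

# Plain particle swarm optimization: each particle keeps a personal best,
# the swarm keeps a global best, and velocities blend inertia with pulls
# toward both bests.
def pso(objective, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    gbest = pbest[pbest_val.index(min(pbest_val))][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest

random.seed(1)
best = pso(lambda x: sum(v * v for v in x), dim=2)  # converges near the origin
```

In the paper's setting, the vector being optimized would encode the fuzzy inference system's tunable parameters and the objective would be the one-week forecasting error.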

Relevance:

90.00%

Publisher:

Abstract:

Dissertation presented to the Instituto Politécnico do Porto for obtaining the Master's degree in Management of Organizations, Business Management branch. Supervised by Professor Doutor José Freitas Santos

Relevance:

90.00%

Publisher:

Abstract:

The teaching-learning process is increasingly focused on the combination of the paradigms "learning by viewing" and "learning by doing." In this context, educational resources, either expository or evaluative, play a pivotal role. Both types of resources are interdependent, and their sequencing creates a richer educational experience for the end user. However, there is a lack of tools that support sequencing, essentially because the existing specifications are complex. Seqins is a sequencing tool for digital resources with a fairly simple sequencing model. The tool communicates through the IMS LTI specification with a plethora of e-learning systems, such as learning management systems, repositories, and authoring and evaluation systems. In order to validate Seqins, we integrated it into an instance of the Ensemble e-learning framework for computer programming learning.

Relevance:

90.00%

Publisher:

Abstract:

Motivations/barriers to participate in ITF

Relevance:

90.00%

Publisher:

Abstract:

To meet the increasing demands of complex inter-organizational processes and the demand for continuous innovation and internationalization, it is evident that new forms of organisation are being adopted, fostering more intensive collaboration processes and sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management is a fundamental process to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations, and have been researched in the last decade in the areas of computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area has not given much attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is then clear that access to and representation of knowledge will happen more and more in multilingual settings, which implies overcoming the difficulties inherent to the presence of multiple languages through the use of processes like the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice, and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support for the development of knowledge representations (in particular ontologies) expressed in more than one language. Multilingual knowledge representation is thus an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences.
This workshop brought together researchers interested in multilingual knowledge representation in a multidisciplinary environment to debate the possibilities of cross-fertilization between knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences, applied to contexts where multilingualism continuously creates new and demanding challenges to current knowledge representation methods and techniques. In this workshop, six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in ongoing projects. In the first paper, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining: Termontospider, a wiki crawler that aims at optimally traversing Wikipedia in search of domain-specific texts for extracting terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically underpinned multilingual terminological databases. In this paper, the authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently being tested. In the second paper, Fumiko Kano presents work comparing four feature-based similarity measures derived from cognitive science. The purpose of the comparative analysis presented by the author is to identify the potentially most effective model for mapping independent ontologies in a culturally influenced domain. For that purpose, datasets based on standardized, pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), were used for the comparative analysis of the similarity measures.
The purpose of the comparison is to verify the similarity measures against objectively developed datasets. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts for a targeted socio-cultural community. In another presentation, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann present ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, providing candidate terms for building ontology elements or instances of ontology concepts. The authors present a complementary approach to the direct localization/translation of ontology labels: acquiring terminologies through the access and harvesting of the multilingual Web presences of structured information providers in the field of finance. This leads to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves. In the next paper, Manuel Silva, António Lucas Soares and Rute Costa claim that, despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting.
To tackle these issues, the authors present a collaborative platform, conceptME, where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support a multilingual ontology specification. In another presentation, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, i.e. the translation office of the Ministry of Justice. The project aims at developing an advanced tool that embeds expert knowledge in the algorithms that extract specialized language from textual data (legal documents), and whose outcome is a knowledge database of Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion. Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at the UCSC Central Library, where they propose to adapt, for subject librarians employed in large multilingual academic institutions, the model used by translators working within European Union institutions. The authors use User Experience (UX) analysis to provide subject librarians with visual support, by means of "ontology tables" depicting conceptual links and connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience and will be a starting point for further discussion and cooperation.
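As an aside on the feature-based similarity measures compared in Kano's paper, one classic measure from this family is Tversky's ratio model. The sketch below, with invented feature sets, is illustrative only and is not taken from the paper:

```python
# Tversky's ratio model: similarity grows with shared features and shrinks
# with features distinctive to either concept, weighted by alpha and beta.
def tversky(a, b, alpha=0.5, beta=0.5):
    a, b = set(a), set(b)
    common = len(a & b)
    return common / (common + alpha * len(a - b) + beta * len(b - a))

# Hypothetical feature sets for two cultural-institution concepts.
museum = {"building", "exhibits", "public", "curator"}
library = {"building", "public", "books", "librarian"}
sim = tversky(museum, library)  # 2 / (2 + 0.5*2 + 0.5*2) = 0.5
```

With alpha = beta = 0.5 the measure is symmetric; skewing the weights makes it directional, which is one reason such models are attractive for mapping concepts across culturally different ontologies.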