30 results for Data portal performance
at Instituto Politécnico do Porto, Portugal
Abstract:
In a context of growing complexity and availability of information, the management of intellectual capital is increasingly important as a competitive advantage for companies seeking to maximize the value they generate. This research uses the VAIC (Value Added Intellectual Coefficient) as its main methodology in order to study whether there is a relationship between intellectual capital and the stock market and financial performance of PSI20 companies. The VAIC is decomposed into its three efficiency indicators: human capital, structural capital and physical capital. The data cover fifteen companies and nine years of analysis (2003 - 2011). The approach resorts to econometric techniques in order to reduce potential shortcomings in the treatment of panel data. The results of the analysis show a positive relationship between investment in intellectual capital and stock market and financial performance; that is, the efficient use and management of intellectual capital contribute significantly to the stock market and financial valuation of PSI20 companies.
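To make the decomposition concrete, the sketch below computes the three VAIC efficiency indicators from illustrative accounting figures, following the usual Pulic-style definitions; the numbers are placeholders rather than PSI20 data, and the panel-data econometrics of the study are not reproduced here.

```python
def vaic(value_added, human_capital, capital_employed):
    """VAIC components as commonly defined in the literature (Pulic):
    HCE = VA / HC, SCE = (VA - HC) / VA, CEE = VA / CE, VAIC = HCE + SCE + CEE."""
    hce = value_added / human_capital                   # human capital efficiency
    sce = (value_added - human_capital) / value_added   # structural capital efficiency
    cee = value_added / capital_employed                # capital employed (physical) efficiency
    return hce + sce + cee, (hce, sce, cee)

# Illustrative figures (not from the PSI20 sample)
total, parts = vaic(value_added=120.0, human_capital=70.0, capital_employed=300.0)
print(total, parts)
```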
Abstract:
Electric power networks, namely distribution networks, have been undergoing several changes in recent years due to changes in power systems operation, towards the implementation of smart grids. Several approaches to the operation of resources have been introduced, as is the case of demand response, making use of the new capabilities of the smart grids. In the initial stages of smart grid implementation, only reduced amounts of data, namely consumption data, are generated. The methodology proposed in the present paper makes use of demand response consumers' performance evaluation methods to determine the expected consumption for a given consumer. Then, potential commercial losses are identified using monthly historic consumption data. Real consumption data are used in the case study to demonstrate the application of the proposed method.
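As a rough illustration of the kind of check described, the sketch below compares each monthly reading of a consumer with an expected value derived from the remaining historic months and flags large negative deviations as potential commercial losses; the mean-based expectation and the threshold are illustrative assumptions, not the performance-evaluation method proposed in the paper.

```python
import statistics

def flag_potential_losses(monthly_kwh, threshold=0.3):
    """Flag months whose consumption falls more than `threshold` (fraction)
    below the expectation estimated from the remaining months."""
    flags = []
    for i, actual in enumerate(monthly_kwh):
        others = monthly_kwh[:i] + monthly_kwh[i + 1:]
        expected = statistics.mean(others)          # naive expectation from history
        if actual < (1.0 - threshold) * expected:
            flags.append((i, actual, expected))
    return flags

history = [310, 295, 320, 305, 120, 300, 315]       # illustrative monthly readings (kWh)
print(flag_potential_losses(history))
```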
Abstract:
There has been a growing interest in research on performance measurement and management practices, which seems to reflect researchers' response to calls for the need to increase the relevance of management accounting research. However, despite the development of the new public management literature, studies involving public sector organizations are relatively few compared to those involving business organizations, and extremely limited when it comes to public primary health care organizations. Yet, the economic significance of public health care organizations in the economy of developed countries and the criticisms these organizations regularly face from the public suggest there is a need for research. This is particularly true in the case of research that may lead to improvement in performance measurement and management practices and, ultimately, to improvements in the way health care organizations use their limited resources in the provision of services to the communities. This study reports on a field study involving three public primary health care organisations. The evidence obtained from interviews and archival data suggests that performance management practices in these institutions lacked consistency and coherence, potentially leading to decreased performance. Hierarchical controls seemed to be very weak and accountability limited, leading to a lack of direction, low motivation and, in some circumstances, to insufficient managerial abilities and skills. Also, the performance management systems revealed a number of weaknesses, which suggests that there are various opportunities for improvement in performance in the studied organisations.
Abstract:
Dissertation for the degree of Master in Accounting and Finance. Supervisor: Mestre Adalmiro Álvaro Malheiro de Castro Andrade Pereira
Abstract:
This paper describes a methodology that was developed for the classification of Medium Voltage (MV) electricity customers. Starting from a sample of databases resulting from a monitoring campaign, Data Mining (DM) techniques are used in order to discover a set of typical MV consumer load profiles and, therefore, to extract knowledge regarding electric energy consumption patterns. In the first stage, several hierarchical clustering algorithms were applied and their clustering performance was compared using adequacy measures. In the second stage, a classification model was developed in order to allow classifying new consumers into one of the clusters obtained in the previous stage. Finally, the interpretation of the discovered knowledge is presented and discussed.
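A minimal sketch of the two-stage workflow (clustering of load profiles, followed by a classification model for new consumers) is given below using scikit-learn; the random data, the agglomerative algorithm, the decision tree and the number of clusters are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Illustrative daily load diagrams: 200 consumers x 24 hourly values, normalised
load_profiles = rng.random((200, 24))

# Stage 1: hierarchical clustering to obtain typical load profile classes
clusterer = AgglomerativeClustering(n_clusters=4)
labels = clusterer.fit_predict(load_profiles)

# Stage 2: classification model that assigns new consumers to one of the clusters
classifier = DecisionTreeClassifier().fit(load_profiles, labels)
new_consumer = rng.random((1, 24))
print(classifier.predict(new_consumer))
```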
Abstract:
A methodology based on data mining techniques to support the analysis of zonal prices in real transmission networks is proposed in this paper. The methodology uses clustering algorithms to group the buses into typical classes, each including a set of buses with similar Locational Marginal Price (LMP) values. Two different clustering algorithms have been used to determine the LMP clusters: the two-step and K-means algorithms. Adequacy measurement indices are used to evaluate the quality of the partition as well as to identify the best-performing algorithm. The paper includes a case study using an LMP database from the California ISO (CAISO) in order to identify zonal prices.
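Purely as an illustration of the K-means step combined with an adequacy index, the sketch below partitions synthetic bus price vectors and scores each candidate partition with the silhouette coefficient; the CAISO data, the two-step algorithm and the specific indices used in the paper are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Illustrative LMP matrix: 300 buses x 24 hourly prices ($/MWh)
lmp = rng.normal(loc=40.0, scale=8.0, size=(300, 24))

# Try several partitions and keep the one with the best adequacy (silhouette) index
best = max(
    (KMeans(n_clusters=k, n_init=10, random_state=0).fit(lmp) for k in range(2, 7)),
    key=lambda model: silhouette_score(lmp, model.labels_),
)
print(best.n_clusters, silhouette_score(lmp, best.labels_))
```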
Residential property loans and performance during property price booms: evidence from European banks
Abstract:
Understanding the performance of banks is of the utmost relevance, because of the impact of this sector on economic growth and financial stability. Of all the different assets that make up a bank portfolio, residential mortgage loans constitute one of the main ones. Using the dynamic panel data method, we analyse the influence of residential mortgage loans on bank profitability and risk, using a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that banks with larger weights of residential mortgage loans show lower credit risk in good times. This result helps explain why banks rush to lend on property during booms, given the positive effect this has on credit risk. The results show further that credit risk and profitability are lower during the upturn in the residential property price cycle. The results also reveal the existence of a non-linear relationship (U-shaped marginal effect), as a function of the bank's risk, between profitability and residential mortgage loan exposure. For those banks that have high credit risk, a large exposure to residential mortgage loans is associated with higher risk-adjusted profitability, through lower risk. For banks with a moderate/low credit risk, the effects of higher residential mortgage loan exposure on their risk-adjusted profitability are also positive or marginally positive.
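The U-shaped marginal effect corresponds to a specification with both a linear and a squared exposure term. The sketch below fits such a quadratic specification with plain OLS on synthetic data, only to show the shape of the model; it is not the dynamic panel estimator applied to the EU-15 sample.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 500
mortgage_share = rng.uniform(0.0, 0.8, n)   # residential mortgage loans / total assets (synthetic)
risk = rng.normal(0.0, 1.0, n)              # illustrative credit-risk proxy
# Synthetic profitability with a quadratic (U-shaped) term in mortgage exposure
roa = 0.5 - 1.2 * mortgage_share + 1.5 * mortgage_share**2 + 0.3 * risk + rng.normal(0, 0.1, n)

X = sm.add_constant(np.column_stack([mortgage_share, mortgage_share**2, risk]))
model = sm.OLS(roa, X).fit()
# A negative linear coefficient with a positive squared coefficient indicates a U shape
print(model.params)
```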
Abstract:
Over time, the XML markup language has acquired considerable importance in application development, standards definition and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a large range of applications, which imposes choosing the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language for XML processing, such as Java, it becomes necessary to use effective mechanisms, e.g. APIs, which allow reading and processing of large documents in an appropriate manner. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
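The study itself compares Java APIs; purely as an analogue of the kind of measurement involved, the sketch below times a tree-based (DOM-style) parse against a streaming parse of one file in Python. The file path is a placeholder and this is not the paper's benchmark setup.

```python
import time
import xml.etree.ElementTree as ET
from xml.dom import minidom

def time_parsers(path):
    """Rough wall-clock comparison of a DOM-style and a streaming parse of one file."""
    t0 = time.perf_counter()
    minidom.parse(path)                        # DOM-style: whole tree kept in memory
    dom_time = time.perf_counter() - t0

    t0 = time.perf_counter()
    for _event, elem in ET.iterparse(path):    # streaming: elements visited incrementally
        elem.clear()                           # release memory as the document is traversed
    stream_time = time.perf_counter() - t0
    return dom_time, stream_time

print(time_parsers("large_document.xml"))      # placeholder path for a large test file
```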
Abstract:
Master's degree in Electrical Engineering – Electrical Power Systems
Abstract:
The evaluation of organizations and the determination of the performance achieved by management have been a constant concern of managers and shareholders, although with different objectives. Nowadays, the question arises with greater acuity, both because of increased competitiveness and because of the current size and complexity of companies. With this work we intend to describe the DEA - Data Envelopment Analysis - methodology in its simplest initial formulations. The DEA methodology aims to obtain a single, simple measure of efficiency by combining a set of outputs and inputs relating to the different homogeneous units being evaluated. DEA is a non-parametric method which, by its characteristics, is particularly suited to the evaluation of homogeneous units that are not necessarily profit-oriented. We conclude, in general, that the information obtained through DEA is useful and constitutes an important advance, but that other methods, namely ratios and regression analyses, can make an important contribution to complement that analysis.
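One of the simplest DEA formulations, the input-oriented CCR envelopment model, can be solved as a small linear programme per unit. The sketch below does this with scipy for a handful of illustrative units with two inputs and one output; the data are placeholders and this is only the method in its basic form.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, unit):
    """Input-oriented CCR efficiency of decision-making unit `unit`.
    inputs: (m, n) array, outputs: (s, n) array, for n homogeneous units."""
    m, n = inputs.shape
    s = outputs.shape[0]
    c = np.r_[1.0, np.zeros(n)]                        # minimise theta; variables = [theta, lambdas]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i,unit <= 0
    a_in = np.hstack([-inputs[:, [unit]], inputs])
    # Outputs: -sum_j lambda_j * y_rj <= -y_r,unit  (i.e. produce at least the unit's output)
    a_out = np.hstack([np.zeros((s, 1)), -outputs])
    res = linprog(c, A_ub=np.vstack([a_in, a_out]),
                  b_ub=np.r_[np.zeros(m), -outputs[:, unit]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Illustrative data: 2 inputs and 1 output for 4 homogeneous units
x = np.array([[4.0, 7.0, 8.0, 4.0], [3.0, 3.0, 1.0, 2.0]])
y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(x, y, j), 3) for j in range(4)])
```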
Abstract:
Cluster scheduling and collision avoidance are crucial issues in large-scale cluster-tree Wireless Sensor Networks (WSNs). The paper presents a methodology that provides a Time Division Cluster Scheduling (TDCS) mechanism based on the cyclic extension of the RCPS/TC (Resource Constrained Project Scheduling with Temporal Constraints) problem for a cluster-tree WSN, assuming bounded communication errors. The objective is to meet all end-to-end deadlines of a predefined set of time-bounded data flows while minimizing the energy consumption of the nodes by setting the TDCS period as long as possible. Since each cluster is active only once during the period, the end-to-end delay of a given flow may span over several periods when there are flows in the opposite direction. The scheduling tool enables system designers to efficiently configure all required parameters of IEEE 802.15.4/ZigBee beacon-enabled cluster-tree WSNs at network design time. The performance evaluation of the scheduling tool shows that problems with dozens of nodes can be solved using optimal solvers.
Abstract:
The IEEE 802.15.4 protocol has been adopted as a communication standard for Low-Rate Wireless Personal Area Networks (LR-WPANs). While it appears as a promising candidate solution for Wireless Sensor Networks (WSNs), its adequacy must be carefully evaluated. In this paper, we analyze the performance limits of the slotted CSMA/CA medium access control (MAC) mechanism in the beacon-enabled mode for broadcast transmissions in WSNs. The motivation for evaluating the beacon-enabled mode is due to its flexibility and potential for WSN applications as compared to the non-beacon-enabled mode. Our analysis is based on an accurate simulation model of the slotted CSMA/CA mechanism on top of a realistic physical layer, with respect to the IEEE 802.15.4 standard specification. The performance of slotted CSMA/CA is evaluated and analyzed for different network settings to understand the impact of the protocol attributes (superframe order, beacon order and backoff exponent), the number of nodes and the data frame size on the network performance, namely in terms of throughput (S), average delay (D) and probability of success (Ps). We also analytically evaluate the impact of the slotted CSMA/CA overheads on the saturation throughput. We introduce the concept of utility (U) as a combination of two or more metrics, to determine the best offered-load range for an optimal behavior of the network. We show that the optimal network performance using slotted CSMA/CA occurs in the offered-load range of 35% to 60%, with respect to a utility function proportional to the network throughput (S) divided by the average delay (D).
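A minimal sketch of the utility idea, a metric proportional to throughput divided by average delay used to locate the best offered-load range, is shown below on illustrative numbers; these values are placeholders, not the paper's simulation results.

```python
# Illustrative (offered load G, throughput S, average delay D) samples,
# not results from the paper's simulation model
samples = [
    (0.2, 0.18, 0.010),
    (0.4, 0.34, 0.014),
    (0.6, 0.42, 0.025),
    (0.8, 0.40, 0.060),
    (1.0, 0.35, 0.120),
]

# Utility proportional to S / D, as suggested in the abstract
utilities = [(g, s / d) for g, s, d in samples]
best_load, best_u = max(utilities, key=lambda item: item[1])
print(best_load, round(best_u, 2))
```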
Abstract:
This work describes the impact of different teachers' approaches to using Moodle to support their courses at the Polytechnic of Porto - School of Engineering. The study covers five different courses, from different degrees and different years, and includes a number of Moodle resources especially supporting laboratory classes. These and other active resources are particularly analyzed in order to evaluate students' adherence to them. One particular course includes a number of remote experiments, made available through VISIR (Virtual Instrument Systems in Reality) and directly accessible through links included in the Moodle course page. The collected data have been correlated with students' classifications in the lab component and in the exam, each one weighting 50% of their final marks. This analysis benefited from the existence of different teachers' approaches, which resulted in a diversity of Moodle-supported environments. Conclusions point to the existence of a positive correlation factor between the number of Moodle accesses and the final exam grade, although the quality of the resources made available by the teachers seems to be preponderant over their quantity. In addition, different student perspectives were found regarding active resources: while some seem to encourage students to participate (for instance, online quizzes or online reports), others, more demanding, are unable to stimulate the majority of them.
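The correlation mentioned in the conclusions can be computed as a simple Pearson coefficient between per-student Moodle access counts and final exam grades; the sketch below uses scipy with placeholder values rather than the study's records.

```python
from scipy.stats import pearsonr

# Illustrative placeholders: per-student Moodle access counts and exam grades (0-20 scale)
accesses = [12, 45, 30, 80, 22, 60, 15, 95, 40, 55]
exam_grades = [8.0, 13.5, 11.0, 16.0, 9.5, 14.0, 10.0, 17.5, 12.0, 13.0]

r, p_value = pearsonr(accesses, exam_grades)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```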
Abstract:
This work extends a recent comparative study covering four different courses lectured at the Polytechnic of Porto - School of Engineering, with respect to the usage of a particular Learning Management System, i.e. Moodle, and its impact on students' results. A fifth course, which includes a number of resources especially supporting laboratory classes, is now added to the analysis. This particular course includes a number of remote experiments, made available through VISIR (Virtual Instrument Systems in Reality) and directly accessible through links included in the Moodle course page. We have analyzed the students' behavior in following these links and in effectively running experiments in VISIR (and also using other lab-related resources in Moodle). These data have been correlated with students' classifications in the lab component and in the exam, each one weighting 50% of their final marks. We aimed to compare students' performance in a richly Moodle-supported environment (with a lab component) and in a poorly Moodle-supported environment (with only a theoretical component). This question followed from conclusions drawn in the above-referred comparative study, where it was shown that even though a positive correlation factor existed between the number of Moodle accesses and the final exam grade obtained by each student, the explanation behind it was not straightforward, as the quality of the resources was preponderant over their quantity.