699 results for data warehouse tuning aggregato business intelligence performance
Constructing a raster-based spatio-temporal hierarchical data model for marine fisheries application
Abstract:
Data warehouse / Data Mart. Architectures. The ETL process (extraction, transformation, and loading of data).
Abstract:
Political drivers such as the Kyoto Protocol, the EU Energy Performance of Buildings Directive and the Energy End-use and Services Directive have been implemented in response to an identified need for a reduction in human-related CO2 emissions. Buildings account for a significant portion of global CO2 emissions, approximately 25-30%, and it is widely acknowledged by industry and research organisations that they operate inefficiently. In parallel, unsatisfactory indoor environmental conditions have proven to negatively impact occupant productivity. Legislative drivers and client education are seen as the key motivating factors for an improvement in the holistic environmental and energy performance of a building. A symbiotic relationship exists between building indoor environmental conditions and building energy consumption. However, traditional Building Management Systems and Energy Management Systems treat these separately. Conventional performance analysis compares building energy consumption with a previously recorded value or with the consumption of a similar building, and does not recognise the fact that all buildings are unique. Therefore, what is required is a new framework which incorporates performance comparison against a theoretical, building-specific ideal benchmark. Traditionally, Energy Managers, who work at the operational level of organisations with respect to building performance, do not have access to ideal performance benchmark information and as a result cannot optimally operate buildings. This thesis systematically defines Holistic Environmental and Energy Management and specifies the Scenario Modelling Technique, which in turn uses an ideal performance benchmark. The holistic technique uses quantified expressions of building performance and by doing so enables the profiled Energy Manager to visualise their actions and the downstream consequences of those actions in the context of overall building operation.
The Ideal Building Framework facilitates the use of this technique by acting as a Building Life Cycle (BLC) data repository through which ideal building performance benchmarks are systematically structured and stored in parallel with actual performance data. The Ideal Building Framework utilises transformed data in the form of the Ideal Set of Performance Objectives and Metrics, which are capable of defining the performance of any building at any stage of the BLC. It is proposed that the union of Scenario Models for an individual building would result in a building-specific Combination of Performance Metrics, which would in turn be stored in the BLC data repository. The Ideal Data Set underpins the Ideal Set of Performance Objectives and Metrics and is the set of measurements required to monitor the performance of the Ideal Building. A Model View describes the unique building-specific data relevant to a particular project stakeholder. The energy management data and information exchange requirements that underlie a Model View implementation are detailed and incorporate both traditional and proposed energy management. This thesis also specifies the Model View Methodology, which complements the Ideal Building Framework. The developed Model View and Rule Set Methodology uses stakeholder-specific rule sets to define the environmental and energy performance data pertinent to each stakeholder. This generic process further enables each stakeholder to define the resolution of data desired: for example, basic, intermediate, or detailed. The Model View Methodology is applicable to all project stakeholders, each requiring its own customised rule set. Two rule sets are defined in detail: the Energy Manager rule set and the LEED Accreditor rule set. This measurement-generation process, accompanied by the defined Model View, would filter and expedite data access for all stakeholders involved in building performance.
Information presentation is critical for effective use of the data provided by the Ideal Building Framework and the Energy Management View definition. The specifications for a customised Information Delivery Tool account for the established profile of Energy Managers and best-practice user interface design. Components of the developed tool could also be used by Facility Managers working at the tactical and strategic levels of organisations. Informed decision making is made possible through specified decision assistance processes which incorporate the Scenario Modelling and Benchmarking techniques, the Ideal Building Framework, the Energy Manager Model View, the Information Delivery Tool and the established profile of Energy Managers. The Model View and Rule Set Methodology is demonstrated on an appropriate existing mixed-use ‘green’ building, the Environmental Research Institute at University College Cork, using the Energy Management and LEED rule sets. Informed decision making is also demonstrated using a prototype scenario for the demonstration building.
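The stakeholder rule-set filtering described in this abstract lends itself to a small illustration. The sketch below is a minimal, hypothetical rendering of the idea, not the thesis's actual methodology: the metric names, categories, and resolution levels are all invented for the example.

```python
# Hypothetical sketch of rule-set filtering for Model Views: a stakeholder's
# rule set names the metric categories it cares about, and a resolution level
# (basic / intermediate / detailed) limits how fine-grained the returned
# metrics are. All names below are illustrative.

RESOLUTIONS = {"basic": 1, "intermediate": 2, "detailed": 3}

# Full BLC repository: metric name -> (category, minimum resolution level)
REPOSITORY = {
    "total_energy_kwh":   ("energy", 1),
    "zone_temperature_c": ("environment", 2),
    "chiller_cop":        ("energy", 3),
    "co2_ppm":            ("environment", 3),
    "water_usage_m3":     ("water", 2),
}

def model_view(rule_set_categories, resolution):
    """Return the metrics a stakeholder's rule set exposes at a resolution."""
    level = RESOLUTIONS[resolution]
    return sorted(
        name
        for name, (category, min_level) in REPOSITORY.items()
        if category in rule_set_categories and min_level <= level
    )

# An "Energy Manager"-style rule set at intermediate resolution:
print(model_view({"energy", "environment"}, "intermediate"))
# ['total_energy_kwh', 'zone_temperature_c']
```

A LEED Accreditor-style rule set would simply name different categories; the filtering mechanism stays the same.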
Abstract:
PURPOSE: To review existing studies and provide new results on the development, regulatory, and market aspects of new oncology drug development. METHODS: We utilized data from the US Food and Drug Administration (FDA), company surveys, and publicly available commercial business intelligence databases on new oncology drugs approved in the United States and on investigational oncology drugs to estimate average development and regulatory approval times, clinical approval success rates, first-in-class status, and global market diffusion. RESULTS: We found that approved new oncology drugs have a disproportionately high share of FDA priority review ratings, of orphan drug designations at approval, and of drugs granted inclusion in at least one of the FDA's expedited access programs. US regulatory approval times were shorter, on average, for oncology drugs (by 0.5 years), but US clinical development times were longer on average (by 1.5 years). Clinical approval success rates were similar for oncology and other drugs, but proportionately more of the oncology failures reached expensive late-stage clinical testing before being abandoned. In relation to other drugs, new oncology drug approvals were more often first-in-class and diffused more widely across important international markets. CONCLUSION: The market success of oncology drugs has induced a substantial amount of investment in oncology drug development in the last decade or so. However, given the great need for further progress, the extent to which efforts to develop new oncology drugs will grow depends on future public-sector investment in basic research, developments in translational medicine, and regulatory reforms that advance drug-development science.
Abstract:
This short position paper considers issues in developing a Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets the background for IoT and the development of the distinction between things and computers. The paper makes a strong argument against reinventing the wheel: existing approaches to distributed heterogeneous data architectures, and the lessons learned from that work, should be reused and applied to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture, based on ontology and analytic agents, with a self-maintaining ontology model. To develop this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.
Abstract:
Cloud data centres are critical business infrastructures and among the fastest growing service providers. Detecting anomalies in Cloud data centre operation is vital. Given the vast complexity of the data centre system software stack, applications and workloads, anomaly detection is a challenging endeavour. Current tools for detecting anomalies often use machine learning techniques, application instance behaviours or system metrics distribution, which are complex to implement in Cloud computing environments as they require training, access to application-level data and complex processing. This paper presents LADT, a lightweight anomaly detection tool for Cloud data centres that uses rigorous correlation of system metrics, implemented by an efficient correlation algorithm without the need for training or complex infrastructure setup. LADT is based on the hypothesis that, in an anomaly-free system, metrics from data centre host nodes and virtual machines (VMs) are strongly correlated. An anomaly is detected whenever the correlation drops below a threshold value. We demonstrate and evaluate LADT in a Cloud environment, showing that the host node I/O operations per second (IOPS) are strongly correlated with the aggregated VM IOPS, but that this correlation vanishes when an application stresses the disk, indicating a node-level anomaly.
Abstract:
Approximately half of the houses in Northern Ireland were built before any form of minimum thermal specification (U-value) or energy efficiency standard was available. At present, 44% of households are categorised as being in fuel poverty: spending more than 10% of the household income to heat the house to an acceptable level. This paper presents the results from long-term performance monitoring of 4 case study houses in Northern Ireland that have undergone retrofits to improve energy efficiency. There is some uncertainty associated with some of the marketed retrofit measures in terms of their effectiveness in reducing energy usage and their potential to cause detrimental impacts on the internal environment of a house. Using wireless sensor technology, internal conditions such as temperature and humidity were measured alongside gas and electricity usage for a year. External weather conditions were also monitored. The paper considers the effectiveness of the different retrofit measures implemented, based on the long-term data monitoring and the short-term building performance evaluation tests that were completed.
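The 10%-of-income fuel poverty definition quoted in this abstract is mechanical enough to express directly. The sketch below uses invented income and cost figures purely for illustration.

```python
# Fuel poverty check under the definition above: a household is in fuel
# poverty if it must spend more than 10% of its income to heat the home
# to an acceptable level. The figures in the examples are made up.

def in_fuel_poverty(annual_income, annual_heating_cost, threshold=0.10):
    """True if heating cost exceeds the threshold share of income."""
    return annual_heating_cost / annual_income > threshold

print(in_fuel_poverty(20_000, 2_500))  # True  (heating is 12.5% of income)
print(in_fuel_poverty(20_000, 1_500))  # False (heating is 7.5% of income)
```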
Abstract:
In the context of globalisation and the competition resulting from it, it is of central importance for a company to obtain knowledge about its competitive situation. Competitive analysis is indispensable not only for opening up new markets but also for securing the company's existence. Competitor and competition research is predominantly referred to as "Competitive Intelligence". In this sense, this bachelor's thesis deals with one area of Competitive Intelligence. After a theoretical introduction to the topic, the results of nine expert interviews and a written expert survey conducted within the company are discussed. The expert interviews and surveys on Competitive Intelligence served to develop a new competitive analysis concept. The interviews showed that no unified competitive analysis system exists within the company and that analyses are carried out only ad hoc. In addition, a country ranking is presented that was developed to analyse European countries for the company. The results showed that Denmark and Italy are significant for an expansion of the export business. The newly developed competitor assessment form was tested for Denmark and Italy on the basis of these results.
Abstract:
The sustained shift of growth markets towards the emerging markets (in particular the BRIC countries) in the wake of the 2008/2009 economic crisis has confronted the already largely consolidated commercial vehicle industry of the triad markets in North America, Europe, and Japan with a multitude of challenges. Strategic goals such as consolidating and increasing sales volumes, as well as better balancing cyclical market developments in order to secure earnings and a largely continuous utilisation of existing capacities, will hardly be achievable in future without cultivating the growth markets outside the triad. This requires the affected companies to engage with the changed business environment. New, largely unfamiliar markets must be conquered, and new, in some cases equally little-known, competitors with their sometimes quite unconventional strategies must be confronted. The triad companies face information deficits and a growing overall complexity, which can lead to information asymmetries that are disadvantageous and unfavourable for them. The consequence of facing this situation unprepared would be market development strategies burdened with considerably more uncertainty and risk or, in the extreme case, the absence of internationalisation activities in the affected companies. Competitive Intelligence, as an instrument for analysing the business environment, can help to eliminate these negative information asymmetries, but can also generate information asymmetries favourable to the company in the form of information leads from which competitive advantages can be derived.
This context of Competitive Intelligence, for eliminating information deficits and for creating deliberate, opportunistic information asymmetries for successful expansion through internationalisation strategies in the emerging markets, is examined in this working paper by linking theoretical and practical implications. The findings from the described practical application of Competitive Intelligence to African market development, regarding its successful use as a decision aid for internationalisation strategies, are as follows:
- broadening the status quo, frequently home-market-centric, views of markets and competitors with regard to actual market activity or potential markets
- bias-free clustering of markets and competitors, or abandoning the attempt at simplification through clustering
- differentiated data collection methods, such as local vs. central and primary vs. secondary data collection, for inhomogeneous, underdeveloped, or developing markets
- identifying and involving experts with the decisive knowledge edge for the information need at hand
- verifying information through data triangulation
Abstract:
The RHPP policy provided subsidies for private householders, registered social landlords and communities to install renewable heat measures in residential properties. Eligible measures included air- and ground-source heat pumps, biomass boilers and solar thermal. Around 18,000 heat pumps were installed via this scheme. DECC funded a detailed monitoring campaign covering 700 heat pumps (around 4% of the total). The aim of this monitoring campaign was to assess the efficiencies of the heat pumps and to estimate the carbon and bill savings and the amount of renewable heat generated. Data was collected from 31/10/2013 to 31/03/2015. This report presents the analysis of this data, which represents the most complete and reliable in-situ data on residential heat pump performance in the UK to date.
Abstract:
Dissertation for obtaining the degree of Master in Accounting and Finance. Supervisor: Mestre Adalmiro Álvaro Malheiro de Castro Andrade Pereira
Abstract:
Final Master's project for obtaining the degree of Master in Informatics and Computer Engineering
Abstract:
Aim of the paper: The purpose of this paper is to examine human resources management practices (HRM practices) in small firms and to improve the understanding of the relationship between these practices and business growth. This exploratory study is based on the resource-based view of the firm and on empirical work carried out in two small firms, relating HRM practices to the firms' results. Contribution to the literature: This is an in-depth study of HRM practices and their impact on performance growth in micro firms, isolating and controlling for most of the contextual and internal variables considered in the literature that relate HRM to growth. Firm growth analysis was broadened by the use of several dependent variables: employment growth and operational and financial performance growth. Some hypotheses for further research on identifying HRM practices in small business and their relation to firm growth are suggested. Methodology: Case study methodology was used to study two firms. The techniques used to collect data were semi-structured interviews with the owner and all the employees, unstructured observation at the firms' facilities (over two days), entrepreneur profile definition (survey answer) and document data collection (on demographic characterization and performance results). Data was analyzed through content analysis, with categories derived from the interview protocols and the literature. Results and implications: Results revealed that despite similarities in the firms' organizational characteristics, they differ significantly in the owners' motivation to grow, HRM practices, and organizational performance and growth. Future studies should pay special attention to owner willingness to grow, to firms' years of experience in business, and to staff's years of experience in their field of work and turnover. HRM practices in micro/small firms should be better defined and characterized.
The external image of management posture, in relation to longitudinal financial results and growth, should also be explored.
Abstract:
The thesis developed here focuses on providing the means to extract knowledge contained in the institution's academic records, transforming the information into something simple and easy to read for any user. As society progresses, schools receive thousands of students every year who must be guided and monitored by the leaders of academic institutions in order to guarantee efficient and adequate programmes for the educational progress of all students. Assigning a teacher the responsibility of acting on the academic history of their students is not feasible, since a single student can produce thousands of records for analysis. The paradigm of educational data mining arises from the need to optimise available resources, exposing conclusions that are not visible without thorough and careful analysis. This paradigm presents, clearly and succinctly, the statistical data analysed by computer, offering the possibility of closing the gaps in the institutions' quality of teaching. This dissertation details the development of a business intelligence tool capable of analysing, through data mining, the academic records and presenting relevant conclusions in a form legible to the user.
Abstract:
Dissertation presented as a partial requirement for obtaining the degree of Master in Statistics and Information Management