37 results for Data-driven knowledge acquisition


Relevance: 100.00%

Abstract:

In an increasingly demanding education system, experimentation plays a fundamental role in the acquisition and validation of knowledge. In Physics teaching, the need to understand the influence of the environment on a given theoretical concept makes experimentation mandatory. In this context, three scenarios arise that can support the learning of the theoretical concepts acquired: simulation, which uses the speed and computational power of the computer to obtain the result of an experiment; traditional laboratory experimentation, in which the student carries out the experiment in person; and, finally, remote experimentation, which allows a real experiment to be executed without the student's physical presence. This dissertation presents the design of an apparatus for remote experimentation of projectile launching. In order to provide a more flexible means of teaching Physics, the developed apparatus allows students to determine the acceleration of gravity and to study how the motion of a projectile depends on a set of parameters. This remotely operated apparatus is accessed via the web, where a time slot is first reserved. The machine's set of parameters ("Ball", "Launch height" and "Launch angle") supports several Physics teaching scenarios of different complexity.
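
The physics such an apparatus exposes is the standard drag-free projectile-motion model. As an illustration only (a minimal sketch; the function and parameter names below are invented and are not the apparatus' interface), this is the kinematics a student would fit launch-height and launch-angle measurements against:

    import math

    def projectile(v0, angle_deg, h0, g=9.81):
        """Time of flight and horizontal range for a drag-free launch at speed
        v0 (m/s), elevation angle angle_deg (degrees), from height h0 (m)."""
        theta = math.radians(angle_deg)
        vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
        # Positive root of h0 + vy*t - 0.5*g*t**2 = 0.
        t_flight = (vy + math.sqrt(vy ** 2 + 2.0 * g * h0)) / g
        return t_flight, vx * t_flight

    t, x = projectile(v0=3.0, angle_deg=45.0, h0=0.5)
    print(f"time of flight = {t:.3f} s, range = {x:.3f} m")

Inverting the same relations from measured flight times or ranges is one way a student could estimate the acceleration of gravity with such a setup.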

Relevance: 100.00%

Abstract:

MSc in Informatics Engineering - Area of Specialization in Graphic Systems and Multimedia

Relevance: 100.00%

Abstract:

In this paper, a rule-based automatic syllabifier for Danish based on the Maximal Onset Principle is described. The work builds on the success rates previously obtained with rule-based syllabification modules for Portuguese and Catalan. The system was implemented and tested using a very small set of rules. Contrary to our initial expectations, given that Danish has a complex syllabic structure and was therefore considered difficult to syllabify by rule, the results reached word accuracy rates of 96.9% and 98.7%. A comparison with a data-driven syllabification system based on artificial neural networks showed that the rule-based system achieved the higher accuracy rate.
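
As a rough illustration of the Maximal Onset Principle only (the onset inventory below is an invented toy, not the paper's Danish rule set, and it operates on letters rather than phonemes), each intervocalic consonant cluster is split so that the following syllable receives the longest legal onset:

    VOWELS = set("aeiouyæøå")
    # Toy onset inventory; a real module would use a phonologically motivated list.
    LEGAL_ONSETS = {"", "b", "d", "f", "g", "h", "k", "l", "m", "n", "p", "r",
                    "s", "t", "v", "bl", "br", "dr", "fl", "fr", "gl", "gr",
                    "kl", "kr", "pl", "pr", "sk", "sl", "sn", "sp", "st", "tr",
                    "skr", "spr", "str"}

    def syllabify(word):
        """Split a lower-case word into syllables using the Maximal Onset Principle."""
        nuclei = [i for i, ch in enumerate(word) if ch in VOWELS]
        if len(nuclei) < 2:
            return [word]
        syllables, start = [], 0
        for v1, v2 in zip(nuclei, nuclei[1:]):
            cluster = word[v1 + 1:v2]                 # consonants between two nuclei
            # Smallest split point whose right part is a legal onset = maximal onset.
            split = next(k for k in range(len(cluster) + 1)
                         if cluster[k:] in LEGAL_ONSETS)
            syllables.append(word[start:v1 + 1 + split])
            start = v1 + 1 + split
        syllables.append(word[start:])
        return syllables

    print(syllabify("hamster"))   # ['ham', 'ster']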

Relevance: 40.00%

Abstract:

Consider the problem of designing an algorithm for acquiring sensor readings, specifically the problem of obtaining an approximate representation of sensor readings where (i) sensor readings originate from different sensor nodes, (ii) the number of sensor nodes is very large, (iii) all sensor nodes are deployed in a small area (a dense network) and (iv) all sensor nodes communicate over a communication medium where at most one node can transmit at a time (a single broadcast domain). We present an efficient algorithm for this problem. Our algorithm has two desirable properties: (i) it obtains an interpolation based on all sensor readings and (ii) it is scalable, that is, its time-complexity is independent of the number of sensor nodes. Achieving these two properties is possible thanks to the close interlinking of the information processing algorithm, the communication system and a model of the physical world.
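
The abstract does not spell out the mechanism, so the following is only an assumption-laden illustration of the scalability aspect: if the shared broadcast medium delivers a bounded number k of readings per round (for example, the winners of a prioritized arbitration), then an interpolation such as inverse-distance weighting costs O(k) per query regardless of how many nodes are deployed. The paper's contribution is achieving scalability while the interpolation still reflects all readings, which this sketch does not attempt; it is not the paper's algorithm.

    def idw_interpolate(samples, query, power=2.0):
        """Inverse-distance-weighted estimate at `query` from a small, fixed-size
        list of (x, y, value) samples; cost depends on k = len(samples), not on
        the total number of deployed sensor nodes."""
        num = den = 0.0
        for x, y, v in samples:
            d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
            if d2 == 0.0:
                return v                      # query coincides with a sample
            w = 1.0 / d2 ** (power / 2.0)
            num += w * v
            den += w
        return num / den

    # k readings obtained over the single broadcast domain, whatever the network size.
    winners = [(0.0, 0.0, 21.5), (1.0, 0.0, 22.1), (0.0, 1.0, 20.9)]
    print(idw_interpolate(winners, query=(0.5, 0.5)))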

Relevance: 40.00%

Abstract:

Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing and interpretation of data concerning the characteristics of forest soils. This can be seen in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always considered or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all data acquisition, to collect the samples in regular grids and in sufficient quantity so that the variograms reflect the spatial distribution of soil properties in a representative manner. Most Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.), although they generally do not require the assumption of a normal distribution, nevertheless demand a proper and rigorous strategy for their use. In this work, some reflections on these methodologies are presented, in particular on the main constraints that often occur during the data-collection process and on the various possibilities of linking these different techniques. Finally, some illustrative cases of the application of these statistical methods are also presented.
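
For the geostatistical requirement mentioned above, the basic diagnostic built from geo-referenced samples on a regular grid is the empirical semivariogram. A minimal sketch (the grid and values below are invented, not real soil data):

    import numpy as np

    def empirical_semivariogram(coords, values, lags, tol=0.5):
        """gamma(h): mean of 0.5*(z_i - z_j)**2 over sample pairs whose
        separation distance falls within tol of each lag h."""
        coords = np.asarray(coords, float)
        values = np.asarray(values, float)
        d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq = 0.5 * (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)        # count each pair once
        d, sq = d[iu], sq[iu]
        return [sq[np.abs(d - h) <= tol].mean() for h in lags]

    # Toy 4x4 regular sampling grid of a single soil property (e.g. pH).
    xy = [(i, j) for i in range(4) for j in range(4)]
    z = np.random.default_rng(0).normal(5.5, 0.3, size=16)
    print(empirical_semivariogram(xy, z, lags=[1, 2, 3]))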

Relevance: 40.00%

Abstract:

Controlled fires in forest areas are frequently used in most Mediterranean countries as a preventive technique to avoid severe wildfires in the summer season. In Portugal, this method of managing available fuel mass is also used and has proved beneficial, as annual statistical reports confirm that the decrease in wildfire occurrence is directly related to the practice of controlled fire. However, prescribed fire can have serious side effects on some forest soil properties. This work shows the changes that occurred in some forest soil properties after a prescribed fire. The experiments were carried out on the soil cover of a natural site of andalusitic schist, in Gramelas, Caminha, Portugal, that had not been burnt for four years. Composite soil samples were collected from five plots at three different layers (0-3 cm, 3-6 cm and 6-18 cm) during a three-year monitoring period after the prescribed burning. Principal Component Analysis was used to reach the presented conclusions.
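
As a sketch of the kind of analysis mentioned above (Principal Component Analysis on per-layer soil measurements; the variables and numbers below are invented, not the Gramelas data), PCA on standardized variables summarizes which soil properties vary together across plots, layers and campaigns:

    import numpy as np

    def pca(X, n_components=2):
        """PCA via SVD of the standardized data matrix (correlation-matrix PCA,
        appropriate when variables are in different units). Rows are samples
        (plot x layer x campaign), columns are soil variables."""
        X = np.asarray(X, float)
        Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
        U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
        scores = Xs @ Vt[:n_components].T             # sample coordinates
        explained = s ** 2 / np.sum(s ** 2)           # variance ratio
        return scores, Vt[:n_components], explained[:n_components]

    # Invented rows: [pH, organic matter %, K (mg/kg), P (mg/kg)] per sample.
    X = [[4.8, 6.1, 110, 32], [5.0, 5.7, 118, 35],
         [5.6, 4.9, 140, 48], [5.9, 4.5, 152, 51]]
    scores, loadings, ratio = pca(X)
    print(ratio)    # share of variance captured by the first two components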

Relevance: 40.00%

Abstract:

Adhesive bonding is nowadays a serious candidate to replace methods such as fastening or riveting because of its attractive mechanical properties. As a result, adhesives are being increasingly used in industries such as automotive, aerospace and construction. Thus, it is highly important to predict the strength of bonded joints to assess the feasibility of joining during the fabrication process of components (e.g. due to complex geometries) or for repair purposes. This work studies the tensile behaviour of adhesive joints between aluminium adherends, considering different values of adherend thickness (h) and the double-cantilever beam (DCB) test. The experimental work consists of the determination of the tensile fracture toughness (GIC) for the different joint configurations. A conventional fracture characterization method was used, together with a J-integral approach that takes into account the plasticity effects occurring in the adhesive layer. An optical measurement method is used for the evaluation of the crack tip opening and the adherend rotation at the crack tip during the test, supported by a Matlab® sub-routine for the automated extraction of these quantities. As output of this work, a comparative evaluation between bonded systems with different values of adherend thickness is carried out, and complete tensile fracture data is provided for the subsequent strength prediction of joints under identical conditions.
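
One conventional data-reduction scheme for DCB specimens (given here as a generic illustration, not necessarily the scheme used in this work) is corrected beam theory, where GIC is obtained from the load P, the opening displacement delta, the crack length a, the specimen width b and a crack-length correction Delta. A minimal sketch with invented values:

    def g_ic_corrected_beam_theory(P, delta, a, b, delta_corr):
        """Mode I fracture toughness from one DCB data point using corrected
        beam theory: G_IC = 3*P*delta / (2*b*(a + |Delta|)).
        Units: P in N, delta/a/b/delta_corr in m, result in J/m**2."""
        return 3.0 * P * delta / (2.0 * b * (a + abs(delta_corr)))

    # Purely illustrative numbers, not measured data.
    print(g_ic_corrected_beam_theory(P=180.0, delta=2.5e-3, a=55e-3,
                                     b=25e-3, delta_corr=4e-3))

The J-integral approach mentioned in the abstract instead works directly from the measured crack tip opening and adherend rotation, which is what the optical measurement and the Matlab® sub-routine provide.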

Relevance: 40.00%

Abstract:

Post-MAPS is a web platform that collects gastroenterological exam data from several European hospital centers, to be used in future clinical studies, and was developed in partnership with experts from the gastroenterology area and information technology (IT) technicians. However, although functional, this platform has issues in aspects that are crucial to its functioning, which can make user interaction unpleasant and exhausting. Accordingly, we proposed the development of a new web platform, aiming for improvements in terms of usability, data unification and interoperability. Therefore, it was necessary to identify and study different ways of acquiring clinical data and to review some of the existing clinical databases in order to understand how they work and what type of data they store, as well as their impact and contribution to clinical knowledge. Closely linked to the data model is the ability to share data with other systems, so we also studied the concept of interoperability and analyzed some of the most widely used international standards, such as DICOM, HL7 and openEHR. As one of the primary objectives of this project was to achieve a better level of usability, practices related to Human-Computer Interaction, such as requirement analysis, creation of conceptual models, prototyping and evaluation, were also studied. Before development began, we conducted a functional analysis of the previous platform, which allowed us to gather not only a list of architectural and interface issues but also a list of improvement opportunities. A small preliminary study was also performed to evaluate the platform's usability, which showed that perceived usability differs between users and, in some aspects, varies according to their location, age and years of experience. Based on the information gathered during the platform's analysis and on the conclusions of the preliminary study, a new platform was developed, prepared for all potential users, from the inexperienced to those most comfortable with technology. It presents major improvements in terms of usability and also provides several new features that simplify the users' work, improving their interaction with the system and making their experience more enjoyable.

Relevance: 30.00%

Abstract:

Communities of Practice are places which provide a sound basis for organizational learning, enabling knowledge creation and acquisition, thus improving organizational performance, leveraging innovation and consequently increasing competitiveness. Virtual Communities of Practice (VCoPs) can perform a central role in promoting communication and collaboration between members who are dispersed in both time and space. The ongoing case study described here aims to identify both the motivations and the constraints that members of an organization experience when taking part in the knowledge-creating processes of the VCoPs to which they belong. Based on a literature review, we have identified several factors that influence such processes; they will be used to analyse the results of interviews carried out with the leaders of VCoPs in four multinationals. As future work, a questionnaire will be developed and administered to the other members of these VCoPs.

Relevance: 30.00%

Abstract:

With accelerated market volatility, faster response times and increased globalization, business environments are going through a major transformation, and firms have intensified their search for strategies which can give them competitive advantage. This requires that companies continuously innovate and think of new ideas that can be transformed or implemented as products, processes or services, generating value for the firm. Innovative solutions and processes are usually developed by a group of people working together. A grouping of people that share and create new knowledge can be considered a Community of Practice (CoP). CoPs are places which provide a sound basis for organizational learning and encourage knowledge creation and acquisition. Virtual Communities of Practice (VCoPs) can perform a central role in promoting communication and collaboration between members who are dispersed in both time and space. Nevertheless, it is known that not all CoPs and VCoPs share the same levels of performance or produce the same results. This means that there are factors that enable or constrain the process of knowledge creation. With this in mind, we developed a case study in order to identify both the motivations and the constraints that members of an organization experience when taking part in the knowledge-creating processes of VCoPs. Results show that organizational culture and professional and personal development play an important role in these processes. No interviewee referred to direct financial rewards as a motivation factor for participation in VCoPs. Most identified the difficulty of aligning the objectives established by management with the justification of the time spent in the VCoP. The interviewees also said that technology is not a constraint.

Relevance: 30.00%

Abstract:

Business Intelligence (BI) is an emergent area of the Decision Support Systems (DSS) discipline, and over the last years the evolution in this area has been considerable. Similarly, in recent years there has been huge growth and consolidation of the Data Mining (DM) field. DM is being used successfully in BI systems, but a true integration of DM with BI is still lacking; as a result, some BI systems do not make effective use of DM. An architecture intended to lead to an effective usage of DM in BI is presented.

Relevance: 30.00%

Abstract:

7th Mediterranean Conference on Information Systems, MCIS 2012, Guimaraes, Portugal, September 8-10, 2012, Proceedings. Lecture Notes in Business Information Processing, Vol. 129.

Relevance: 30.00%

Abstract:

With the electricity market liberalization, distribution and retail companies are looking for better market strategies based on adequate information about the consumption patterns of their electricity customers. In this environment all consumers are free to choose their electricity supplier. A fair insight into customer behaviour will permit the definition of specific contract aspects based on the different consumption patterns. In this paper Data Mining (DM) techniques are applied to electricity consumption data from a utility client database. To form the different customer classes, and find a set of representative consumption patterns, we used the Two-Step algorithm, a hierarchical clustering algorithm. Each consumer class is represented by the load profile resulting from the clustering operation. Next, to characterize each consumer class, a classification model is constructed with the C5.0 classification algorithm.
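
Two-Step and C5.0 are proprietary implementations (IBM SPSS Modeler and RuleQuest C5.0, respectively), so the sketch below only mirrors the two-stage idea with open-source stand-ins: agglomerative (hierarchical) clustering in place of Two-Step and a decision tree in place of C5.0, applied to synthetic daily load curves rather than real consumption data:

    import numpy as np
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    hours = np.arange(24)

    # Synthetic normalized daily load curves: an "evening peak" and an "office hours" shape.
    evening = np.exp(-0.5 * ((hours - 20) / 2.5) ** 2)
    office = np.exp(-0.5 * ((hours - 13) / 4.0) ** 2)
    X = np.vstack([p + rng.normal(0, 0.05, 24)
                   for p in [evening] * 40 + [office] * 40])

    # Stage 1: hierarchical clustering stands in for the Two-Step algorithm;
    # the per-cluster mean curve is the representative load profile.
    labels = AgglomerativeClustering(n_clusters=2).fit_predict(X)
    profiles = np.vstack([X[labels == c].mean(axis=0) for c in range(2)])

    # Stage 2: a decision tree stands in for C5.0 to classify new consumers.
    clf = DecisionTreeClassifier(max_depth=3).fit(X, labels)
    new_curve = evening + rng.normal(0, 0.05, 24)
    print("assigned class:", clf.predict(new_curve.reshape(1, -1))[0])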

Relevance: 30.00%

Abstract:

This paper describes a methodology developed for the classification of Medium Voltage (MV) electricity customers. Starting from a sample database resulting from a monitoring campaign, Data Mining (DM) techniques are used to discover a set of typical MV consumer load profiles and, thereby, to extract knowledge about electric energy consumption patterns. In the first stage, several hierarchical clustering algorithms were applied and their clustering performance was compared using adequacy measures. In the second stage, a classification model was developed to classify new consumers into one of the clusters obtained in the previous stage. Finally, the interpretation of the discovered knowledge is presented and discussed.
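
The adequacy measures are not named in the abstract; as one common illustration (an assumption, not necessarily the measures used here), the silhouette score can be used to compare hierarchical clusterings with different linkages and numbers of clusters, sketched below on synthetic load curves:

    import numpy as np
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.metrics import silhouette_score

    rng = np.random.default_rng(2)
    # Synthetic MV load curves scattered around three invented typical profiles.
    base = rng.random((3, 24))
    X = np.vstack([b + rng.normal(0, 0.05, (30, 24)) for b in base])

    # Compare candidate hierarchical clusterings by an adequacy measure.
    for linkage in ("ward", "average", "complete"):
        for k in (2, 3, 4):
            labels = AgglomerativeClustering(n_clusters=k, linkage=linkage).fit_predict(X)
            print(linkage, k, round(silhouette_score(X, labels), 3))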

Relevance: 30.00%

Abstract:

In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, electricity deregulation was conducted in stages, beginning with the clients of higher voltage levels and larger electricity consumption, and later extended to all electrical consumers. The sector liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, the electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive closer generation to reduce the exceeded power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power will be lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation site. Locational marginal pricing (LMP), resulting from bidding competition, represents the electrical and economic value at nodes or in areas and may provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator for 2009 to identify economical zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes that represent a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
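
A rough sketch of the bus-grouping step, using k-means from scikit-learn on invented hourly LMP profiles (the 2009 CAISO data itself is not reproduced here):

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)
    # Invented hourly LMPs (USD/MWh) for 60 buses lying in three price zones.
    zone_means = np.array([35.0, 42.0, 55.0])
    lmp = np.vstack([m + rng.normal(0.0, 2.0, (20, 24)) for m in zone_means])

    # Group buses with approximately the same LMP behaviour into zones.
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(lmp)
    for zone in range(3):
        members = np.where(km.labels_ == zone)[0]
        print(f"zone {zone}: {len(members)} buses, "
              f"mean LMP {lmp[members].mean():.1f} USD/MWh")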