916 results for Data Mining and its Application
Abstract:
Species composition and abundance of phytoplankton and chlorophyll concentration were measured at three depth horizons at 9 stations in Nha Trang Bay in the South China Sea in March 1998. Vertical distributions of fluorescence parameters, temperature and irradiance were measured in the 0-18 m layer of the water column at 21 stations. According to biomass (B) and chlorophyll concentration (Chl), the Bay is mesotrophic. B and Chl in the water column increased seaward. Mean values of Chl in the southern part of the Bay exceeded those in the northern part; mean values of B were similar. B and Chl in the bottom layer exceeded those in the upper layer. Diatoms dominated in species diversity and abundance, and the diatom Guinardia striata made the main contribution to phytoplankton biomass. Similarity of the phytoplankton assemblages was high. In the upper layer phytoplankton was photoinhibited during most of the light period, whereas at the bottom photosynthetic activity was high. Water-column B varied by an order of magnitude over the daily cycle, mainly because of B variations in the bottom layer caused by tidal flow.
Abstract:
In this paper the authors present and discuss data on the distribution and mineral composition of suspended particulate matter (SPM) in the Franz Victoria Trough, collected during Cruise 14 of the scientific icebreaker Akademik Fedorov in the northern Barents Sea in October 1998. Higher total SPM concentrations (0.4-1.8 mg/l) were measured in the near-bottom layer of the Franz Victoria Strait and in the central part of the trough. A potential source of the mineral particles in the SPM is the fine fractions of Barents Sea bottom sediments. These form a nepheloid layer that spreads over the continental slope along the trough together with Barents Sea waters at 350-400 m depth.
Abstract:
The formation of a subsurface anticyclonic eddy in the Peru-Chile Undercurrent (PCUC) in January and February 2013 is investigated using a multi-platform, four-dimensional observational approach. Research-vessel, multi-glider and mooring-based measurements were conducted in the Peruvian upwelling regime near 12°30'S. The dataset consists of more than 10,000 glider profiles and repeated vessel-based hydrography and velocity transects, and allows a detailed description of the eddy formation and its impact on the near-coastal salinity, oxygen and nutrient distributions. In early January, a strong PCUC with maximum poleward velocities of ca. 0.25 m/s at 100 to 200 m depth was observed. Starting on January 20, a subsurface anticyclonic eddy developed in the PCUC downstream of a topographic bend, suggesting flow separation as the eddy formation mechanism. The eddy core waters exhibited oxygen concentrations of less than 1 µmol/kg, an elevated nitrogen deficit of ca. 17 µmol/l and potential vorticity close to zero, and seemed to originate from the bottom boundary layer of the continental slope. The eddy-induced across-shelf velocities resulted in an elevated exchange of water masses between the upper continental slope and the open ocean. Small-scale salinity and oxygen structures were formed by along-isopycnal stirring, and indications of eddy-driven oxygen ventilation of the upper oxygen minimum zone were observed. It is concluded that mesoscale stirring of solutes and the offshore transport of eddy core properties could provide an important coastal-open-ocean exchange mechanism with potentially large implications for nutrient budgets and biogeochemical cycling in the oxygen minimum zone off Peru.
Abstract:
The Spanish National Library (Biblioteca Nacional de España, BNE) and the Ontology Engineering Group of Universidad Politécnica de Madrid are working on the joint project “Preliminary Study of Linked Data”, whose aim is to enrich the Web of Data with the BNE authority and bibliographic records. To this end, they are transforming the BNE information into RDF following the Linked Data principles proposed by Tim Berners-Lee.
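For illustration, a minimal Python sketch (using the rdflib library) of the kind of record-to-RDF transformation described above; the namespace, record fields and Dublin Core properties chosen here are illustrative assumptions, not the actual BNE vocabulary or URI scheme.

```python
# Minimal sketch of turning one bibliographic record into RDF triples.
# The EX namespace, record identifier and chosen properties are
# hypothetical, not the BNE data model.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

DCT = Namespace("http://purl.org/dc/terms/")
EX = Namespace("http://example.org/bne/resource/")   # hypothetical URI scheme

record = {"id": "bib0001",
          "title": "El ingenioso hidalgo don Quijote de la Mancha",
          "creator": "Cervantes Saavedra, Miguel de"}

g = Graph()
g.bind("dct", DCT)

subject = EX[record["id"]]
g.add((subject, RDF.type, DCT.BibliographicResource))
g.add((subject, DCT.title, Literal(record["title"], lang="es")))
g.add((subject, DCT.creator, Literal(record["creator"])))

print(g.serialize(format="turtle"))
```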
Abstract:
The aim of program specialization is to optimize programs by exploiting certain knowledge about the context in which the program will execute. There exist many program manipulation techniques which allow specializing the program in different ways. Among them, one of the best-known techniques is partial evaluation, often referred to simply as program specialization, which optimizes programs by specializing them for (partially) known input data. In this work we describe abstract specialization, a technique whose main features are: (1) specialization is performed with respect to "abstract" values rather than "concrete" ones, and (2) abstract interpretation rather than standard interpretation of the program is used in order to propagate information about execution states. The concept of abstract specialization is at the heart of the specialization system in CiaoPP, the Ciao system preprocessor. In this paper we present a unifying view of the different specialization techniques used in CiaoPP and discuss their potential applications by means of examples. The applications discussed include program parallelization, optimization of dynamic scheduling (concurrency), and integration of partial evaluation techniques.
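As a toy illustration of the partial evaluation idea mentioned above (not of CiaoPP's abstract specialization itself), the following Python sketch specializes a generic power function for a statically known exponent, producing a residual program with the loop unrolled.

```python
# Toy illustration of partial evaluation: specialize a generic function
# for a statically known input. Idealized example only.

def power(x, n):
    """Generic program: x**n by repeated multiplication."""
    result = 1
    for _ in range(n):
        result *= x
    return result

def specialize_power(n):
    """'Specializer': given the known exponent n, emit a residual
    program in which the loop has been unrolled away."""
    body = " * ".join(["x"] * n) if n > 0 else "1"
    source = f"def power_{n}(x):\n    return {body}\n"
    namespace = {}
    exec(source, namespace)          # compile the residual program
    return namespace[f"power_{n}"]

power_3 = specialize_power(3)        # residual program: return x * x * x
assert power_3(5) == power(5, 3) == 125
```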
Abstract:
Ciao Prolog incorporates a module system which allows separate compilation and sensible creation of standalone executables. We describe some of the main aspects of the Ciao modular compiler, ciaoc, which takes advantage of the characteristics of the Ciao Prolog module system to automatically perform separate and incremental compilation and to efficiently build small, standalone executables with competitive run-time performance. ciaoc can also statically detect a larger number of programming errors. We also present a generic code processing library for handling modular programs, which provides an important part of the functionality of ciaoc. This library allows the development of program analysis and transformation tools in a way that is to some extent orthogonal to the details of module system design, and it has been used in the implementation of ciaoc and other Ciao system tools. We also describe the different types of executables which can be generated by the Ciao compiler, which offer different tradeoffs between executable size, startup time, and portability, depending, among other factors, on the linking regime used (static, dynamic, lazy, etc.). Finally, we provide experimental data which illustrate these tradeoffs.
Abstract:
Due to recent scientific and technological advances in information systems, it is now possible to perform almost every application on a mobile device. The need to make such devices more intelligent opens an opportunity to design data mining algorithms that can execute autonomously on local devices to provide the device with knowledge. The problem behind autonomous mining is the proper configuration of the algorithm so that it produces the most appropriate results. Contextual information, together with resource information about the device, has a strong impact both on the feasibility of a particular execution and on the production of the proper patterns. On the other hand, the performance of the algorithm, expressed in terms of efficacy and efficiency, depends strongly on the features of the dataset to be analyzed together with the values of the parameters of a particular implementation of the algorithm. However, few existing approaches deal with autonomous configuration of data mining algorithms, and in any case they do not deal with contextual or resource information. Both issues are particularly significant for social network applications. In fact, the widespread use of social networks, and consequently the amount of information shared, have made modeling context in social applications a priority. Resource consumption also plays a crucial role on such platforms, as users access social networks mainly from their mobile devices. This PhD thesis addresses the aforementioned open issues, focusing on (i) analyzing the behavior of algorithms, (ii) mapping contextual and resource information to find the most appropriate configuration, and (iii) applying the model to the case of a social recommender. Four main contributions are presented: the EE-Model, which predicts the behavior of a data mining algorithm in terms of the resources consumed and the accuracy of the mining model it will obtain; the SC-Mapper, which maps a situation defined by the context and resource state to a data mining configuration; SOMAR, a social activity (events and informal goings-on) recommender for mobile devices; and D-SOMAR, an evolution of SOMAR which incorporates the configurator in order to provide updated recommendations. Finally, the experimental validation of the proposed contributions using synthetic and real datasets allows us to achieve the objectives and answer the research questions posed in this dissertation.
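As a purely hypothetical sketch of the kind of context-and-resource-driven configuration the SC-Mapper performs, the following Python fragment maps a device state to clustering parameters; the thresholds and parameter names are invented for illustration and are not taken from the thesis.

```python
# Hypothetical sketch of mapping a device/context state to a data mining
# configuration, in the spirit of the SC-Mapper described above. All
# thresholds and parameter names are invented for illustration only.

def choose_configuration(battery_pct, free_mem_mb, n_records):
    """Return clustering parameters adapted to the current resource state."""
    if battery_pct < 20 or free_mem_mb < 64:
        # Constrained situation: cheap configuration, coarse model.
        return {"algorithm": "k-means", "k": 3, "max_iter": 20,
                "sample_size": min(n_records, 1_000)}
    if n_records > 50_000:
        # Large dataset but resources available: subsample moderately.
        return {"algorithm": "k-means", "k": 8, "max_iter": 100,
                "sample_size": 10_000}
    # Comfortable situation: run on the full dataset.
    return {"algorithm": "k-means", "k": 8, "max_iter": 300,
            "sample_size": n_records}

print(choose_configuration(battery_pct=15, free_mem_mb=512, n_records=80_000))
```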
Abstract:
The deviation of the calibration coefficients of five cup anemometer models over time was analyzed. The analysis was based on a series of laboratory calibrations performed between January 2001 and August 2010. It covered two different groups of anemometers: (1) anemometers not used for any industrial purpose (that is, just stored), and (2) anemometers used in different industrial applications (mainly field applications such as wind farms). Results indicate a loss of performance of the studied anemometers over time. In the case of the unused anemometers, the degradation shows a clear pattern. In the case of the anemometers used in the field, the data also suggest a loss of performance, yet the degradation does not show a clear trend. A recalibration schedule is proposed based on the observed performance variations.
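A minimal Python sketch of the kind of trend analysis described above, fitting a linear drift to a calibration coefficient over time and deriving a recalibration interval from a tolerance; the numbers are synthetic, not data from the study.

```python
# Illustrative sketch: fit a linear drift to an anemometer calibration
# coefficient over time. The values below are synthetic, not measurements
# from the paper.
import numpy as np

years = np.array([0.0, 1.0, 2.5, 4.0, 6.0, 8.0, 9.5])   # years since first calibration
slope_coeff = np.array([0.620, 0.618, 0.615, 0.612, 0.608, 0.605, 0.602])  # made-up coefficients

drift_per_year, intercept = np.polyfit(years, slope_coeff, 1)
print(f"estimated drift: {drift_per_year:.4f} per year")

# A recalibration interval could then be chosen so the accumulated drift
# stays below a tolerance, e.g. 1% of the initial coefficient.
tolerance = 0.01 * slope_coeff[0]
recal_interval = tolerance / abs(drift_per_year)
print(f"suggested recalibration interval: {recal_interval:.1f} years")
```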
Abstract:
Fourier transform infrared (FTIR) spectroscopy was applied to determine the type of surface treatment and dose used on cork stoppers and to predict the friction between stopper and bottleneck. Agglomerated cork stoppers were finished with two different doses of each of two surface treatments: P (paraffin and silicone), 15 and 25 mg/stopper, and S (only silicone), 10 and 15 mg/stopper. FTIR spectra were recorded at five points on each stopper by attenuated total reflectance (ATR), and the absorbances at 1,010, 2,916, and 2,963 cm⁻¹ were obtained from each spectrum. Discriminant analysis techniques allowed the treatment and dose applied to each stopper to be identified from the absorbance values: success rates of 91.2% were obtained from individual values and 96.0% from the mean values of each stopper. The spectrometric data also allowed the treatment homogeneity on the stopper surface to be determined, and a multiple regression model was used to predict the friction index (If = Fe/Fc) (R² = 0.93).
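The following Python sketch reproduces the shape of the two analyses described above (discriminant analysis of the treatment and multiple regression for the friction index If = Fe/Fc) on synthetic absorbance values; the data and class structure are invented for illustration.

```python
# Sketch of the two analyses on synthetic absorbance data: discriminant
# analysis to identify the surface treatment, and a multiple regression
# to predict the friction index If = Fe/Fc. All values are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 60
# Absorbances at 1,010, 2,916 and 2,963 cm^-1 (synthetic)
X = rng.normal(loc=[0.30, 0.55, 0.40], scale=0.05, size=(n, 3))
treatment = rng.integers(0, 2, size=n)          # 0 = P, 1 = S (hypothetical labels)
X[treatment == 1, 1] += 0.10                    # make the classes separable
friction_index = 0.8 + 1.5 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(0, 0.02, n)

lda = LinearDiscriminantAnalysis().fit(X, treatment)
print("treatment classification accuracy:", lda.score(X, treatment))

reg = LinearRegression().fit(X, friction_index)
print("R^2 of friction-index model:", reg.score(X, friction_index))
```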
Abstract:
The optoelectronic properties of Cu2ZnSnS4, together with environmental considerations, have attracted significant interest in this material for photovoltaics. Using first-principles calculations, we analyze the possible improvement of this material as a photovoltaic absorber via the isoelectronic substitution of S with O atoms. The evolution of the acceptor level is analyzed with respect to the atomic positions of the nearest neighbors of the O atom. We estimate the maximum efficiency of this compound when used as a light absorber. The presence of the sub-band-gap level below the conduction band could increase the solar-energy conversion with respect to the host.
Abstract:
Acquired brain injury (ABI) is one of the leading causes of death and disability in the world and is associated with high health care costs as a result of the acute treatment and long-term rehabilitation involved. Different algorithms and methods have been proposed to predict the effectiveness of rehabilitation programs. In general, research has focused on predicting the overall improvement of patients with ABI. The purpose of this study is the novel application of data mining (DM) techniques to predict the outcomes of cognitive rehabilitation in patients with ABI. We generate three predictive models that allow us to obtain new knowledge with which to evaluate and improve the effectiveness of the cognitive rehabilitation process. Decision tree (DT), multilayer perceptron (MLP) and general regression neural network (GRNN) models were used to construct the prediction models. 10-fold cross-validation was carried out in order to test the algorithms, using the Institut Guttmann Neurorehabilitation Hospital (IG) patient database. Performance of the models was tested through specificity, sensitivity and accuracy analysis and confusion matrix analysis. The experimental results obtained by the DT are clearly superior, with an average prediction accuracy of 90.38%, while the MLP and GRNN obtained 78.7% and 75.96%, respectively. This study increases knowledge about the factors that contribute to the recovery of ABI patients and makes it possible to estimate treatment efficacy in individual patients.
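A hedged Python sketch of the evaluation protocol described above (decision tree, 10-fold cross-validation, and confusion-matrix-based sensitivity and specificity) on a synthetic stand-in dataset, since the Institut Guttmann patient database is not publicly available.

```python
# Sketch of the evaluation protocol only: decision tree, 10-fold CV,
# and confusion-matrix-derived metrics on synthetic data standing in
# for the (non-public) IG patient database.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = make_classification(n_samples=500, n_features=20, random_state=42)

clf = DecisionTreeClassifier(random_state=42)
y_pred = cross_val_predict(clf, X, y, cv=10)     # 10-fold cross-validated predictions

tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"accuracy={accuracy_score(y, y_pred):.3f} "
      f"sensitivity={sensitivity:.3f} specificity={specificity:.3f}")
```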
Abstract:
The “Innovatio Educativa Tertio Millennio” group has spent 10 years developing educational innovation techniques and has now reached the stage of training teachers in the techniques it has developed and of sharing them with other groups that can implement them in their own teaching activities. The UNESCO Chair of Mining and Industrial Heritage has worked on heritage for years, on the one hand teaching the conservation and maintenance of heritage, and on the other raising awareness of the meaning of heritage, its social value, and how it must be managed effectively. Recently these two groups have been working together, and the concepts of heritage, its meaning, its value, and how to manage and effectively protect it are thus being disseminated in a much more effective manner. The work combines dissemination based on the internet and on radio broadcasting with teaching based on educational innovation, through courses, conferences, and face-to-face seminars or distance-learning platforms.
Abstract:
Diabetes is one of the most common diseases nowadays, across all populations and age groups. Diabetes contributes to heart disease and increases the risk of developing kidney disease, blindness, nerve damage, and blood vessel damage. Diabetes diagnosis via proper interpretation of diabetes data is an important classification problem. Different artificial intelligence techniques have been applied to the diabetes problem. The purpose of this study is to apply artificial metaplasticity on a multilayer perceptron (AMMLP) as a data mining (DM) technique for diabetes diagnosis. The Pima Indians diabetes dataset was used to test the proposed AMMLP model. The results obtained by AMMLP were compared with a decision tree (DT), a Bayesian classifier (BC) and other algorithms, recently proposed by other researchers, that were applied to the same database. The robustness of the algorithms was examined using classification accuracy, sensitivity and specificity analysis, and the confusion matrix. The results obtained by AMMLP are superior to those obtained by DT and BC.
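For context, a baseline Python sketch of the evaluation setup only: a standard multilayer perceptron on the Pima Indians diabetes data (assumed to be available from OpenML under the name "diabetes"). The artificial metaplasticity weighting that defines AMMLP is not reproduced here.

```python
# Baseline sketch only: a standard MLP on the Pima Indians diabetes data
# (assumed to be fetchable from OpenML as "diabetes"). This does NOT
# implement the artificial metaplasticity (AMMLP) method of the paper;
# it only shows a comparable evaluation setup.
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = fetch_openml(name="diabetes", version=1, return_X_y=True, as_frame=False)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                    random_state=0))
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print("confusion matrix:\n", confusion_matrix(y_test, y_pred))
```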
Abstract:
A sustainable manufacturing process must rely on a likewise sustainable supply of raw materials and energy. This paper is intended to show the results of studies on sustainable business models for the minerals industry as a fundamental prerequisite of a sustainable manufacturing process. As has happened in other economic activities, the mining and minerals industry has come under tremendous pressure to improve its social, developmental, and environmental performance. Mining, refining, and the use and disposal of minerals have in some instances led to significant local environmental and social damage. Nowadays, as in other parts of the corporate world, companies are routinely expected to perform to ever higher standards of behavior, going well beyond achieving the best rate of return for shareholders. They are also increasingly being asked to be more transparent and subject to third-party audit or review, especially in environmental aspects. In terms of the environment, there are three inter-related areas where innovation and new business models can make the biggest difference: carbon, water and biodiversity. The focus is on these three areas for two reasons. First, the industrial and energy minerals industry has a significant footprint in each of them. Second, these three areas are where the potential environmental impacts go beyond local stakeholders and communities and can even have global impacts, as in the case of carbon. So prioritizing efforts in these areas will ultimately be a strategic differentiator as the industry's businesses continue to grow. Over the next forty years, the world's population is predicted to rise from 6,300 million to 9,500 million people. This will mean a huge demand for natural resources. Indeed, consumption rates are such that current demand for raw materials will probably soon exceed the planet's capacity. As awareness of the actual situation grows, the public is demanding goods and services that are ever more environmentally sustainable. This means that massive efforts are required to reduce the amount of materials we use, including freshwater, minerals and oil, biodiversity, and marine resources. It is clear that business as usual is no longer possible. Today, companies face not only the economic fallout of the financial crisis; they face the substantial challenge of transitioning to a low-carbon economy that is constrained by dwindling, easily accessible natural resources. Innovative business models offer pioneering companies an early start toward the future. They can signal to consumers how to make sustainable choices and provide rewards for both the consumer and the shareholder. Climate change and carbon remain major risk discontinuities that we need to better understand and deal with. In the absence of a global carbon solution, the principal objective of any individual country should be to reduce its global carbon emissions by encouraging conservation. The mineral industry's internal response is to continue to focus on reducing the energy intensity of existing operations through energy efficiency and the progressive introduction of new technology. Planning of new projects must ensure that their energy footprint is minimal from the start. These actions will increase the long-term resilience of the business to uncertain energy and carbon markets.
This focus, combined with a strong demand for skills in this strategic area for the future, requires an appropriate change in the initial and continuing training of engineers and technicians and in their awareness of the issue of eco-design. It will also require the development of measurement tools that allow consistent comparisons between companies, and the integration of carbon footprint assessments of mining equipment and services into comprehensive impact studies on the sustainable development of the economy.
Abstract:
This paper presents a new method to extract knowledge from existing data sets, that is, to extract symbolic rules using the weights of an artificial neural network. The method has been applied to a neural network with a special architecture named Enhanced Neural Network (ENN). This architecture improves on the results that have been obtained with a multilayer perceptron (MLP). The relationship among the knowledge stored in the weights, the performance of the network, and the newly implemented algorithm for acquiring rules from the weights is explained. The method itself provides a model to follow for knowledge acquisition with the ENN.
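A drastically simplified Python illustration of the general idea of reading symbolic rules off trained weights; it is not the ENN rule-extraction algorithm of the paper, only a single-layer example on invented Boolean data.

```python
# Drastically simplified illustration of reading a symbolic rule off
# trained weights. NOT the Enhanced Neural Network algorithm of the
# paper; only the underlying idea, on a single-layer model.
import numpy as np
from sklearn.linear_model import Perceptron

# Boolean target: y = x0 AND x2 (x1 is irrelevant noise)
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(200, 3))
y = X[:, 0] & X[:, 2]

clf = Perceptron(random_state=0).fit(X, y)
weights, bias = clf.coef_[0], clf.intercept_[0]

# Keep only inputs whose positive weight is a sizeable fraction of the largest.
threshold = 0.5 * np.abs(weights).max()
relevant = [f"x{i}" for i, w in enumerate(weights) if w >= threshold]

print("weights:", weights, "bias:", bias)
print("extracted rule: IF", " AND ".join(relevant), "THEN class = 1")
```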