925 results for Data clustering. Fuzzy C-Means. Cluster centers initialization. Validation indices
Abstract:
In this paper, we present ICICLE (Image ChainNet and Incremental Clustering Engine), a prototype system that we have developed to efficiently and effectively retrieve WWW images based on image semantics. ICICLE has two distinguishing features. First, it employs a novel image representation model called Weight ChainNet to capture the semantics of the image content. A new formula, called list space model, for computing semantic similarities is also introduced. Second, to speed up retrieval, ICICLE employs an incremental clustering mechanism, ICC (Incremental Clustering on ChainNet), to cluster images with similar semantics into the same partition. Each cluster has a summary representative and all clusters' representatives are further summarized into a balanced and full binary tree structure. We conducted an extensive performance study to evaluate ICICLE. Compared with some recently proposed methods, our results show that ICICLE provides better recall and precision. Our clustering technique ICC facilitates speedy retrieval of images without sacrificing recall and precision significantly.
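The abstract does not spell out the ICC algorithm itself. Purely as an illustration of the general idea of incremental clustering against cluster representatives (the Euclidean distance, the threshold and the function names below are hypothetical simplifications, not taken from the paper, which clusters ChainNet summaries), a minimal sketch:

```python
import numpy as np

def incremental_cluster(vectors, threshold=0.5):
    """Assign each incoming vector to the nearest existing cluster
    representative, or start a new cluster if none is close enough.
    Representatives are kept as running means of their members
    (a hypothetical stand-in for ICC's summary representatives)."""
    reps, members = [], []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        if reps:
            dists = [np.linalg.norm(v - r) for r in reps]
            best = int(np.argmin(dists))
            if dists[best] <= threshold:
                members[best].append(v)
                reps[best] = np.mean(members[best], axis=0)
                continue
        reps.append(v)
        members.append([v])
    return reps, members
```

In the paper the cluster representatives are additionally organised into a balanced, full binary tree to prune the search at retrieval time; the sketch above omits that indexing step.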
Abstract:
Abstract: Purpose – This paper aims to document women's reflections on their careers over a ten-year period to provide quantitative baseline data on which to frame follow-up in-depth interviews. The participants work in the public service in Queensland (Australia) and had been recommended for, and participated in, women in management (WIM) courses conducted in the early 1990s. Design/methodology/approach – Data were collected by means of a survey (containing closed and open items) which gathered demographic data and data related to employment history, perceptions of success and satisfaction, and the women's future career expectations. Findings – Findings revealed that the percentage of women in middle and senior management had increased over the ten-year period, although not to the extent one might have anticipated, given that the women had been targeted as high flyers by their supervisors. While not content with their classification levels (i.e. seniority), the majority of the cohort viewed their careers as being successful. Practical implications – Questions arise from this study as to why women are still “not getting to the top”. There are also policy implications for the public service concerning women's possible “reinventive contribution” and training implications associated with women only courses. Originality/value – The study is part of an Australian longitudinal study on the careers of women who attended a prestigious women-only management course in the early 1990s in Queensland. This is now becoming a study of older women.
Abstract:
C-terminal acylation of Lys(37) with myristic (MYR; tetradecanoic acid), palmitic (PAL; hexadecanoic acid) and stearic (octadecanoic acid) fatty acids, with or without N-terminal acetylation, was employed to develop long-acting analogues of the glucoregulatory hormone glucose-dependent insulinotropic polypeptide (GIP). All GIP analogues exhibited resistance to dipeptidylpeptidase-IV (DPP-IV) and significantly improved in vitro cAMP production and insulin secretion. Administration of GIP analogues to ob/ob mice significantly lowered plasma glucose; GIP(Lys(37)MYR), N-AcGIP(Lys(37)MYR) and GIP(Lys(37)PAL) increased plasma insulin concentrations. GIP(Lys(37)MYR) and N-AcGIP(Lys(37)MYR) elicited protracted glucose-lowering effects when administered 24 h prior to an intraperitoneal glucose load. Daily administration of GIP(Lys(37)MYR) and N-AcGIP(Lys(37)MYR) to ob/ob mice for 24 days decreased glucose and significantly improved plasma insulin, glucose tolerance and beta-cell glucose responsiveness. Insulin sensitivity, pancreatic insulin content and triglyceride levels were not changed. These data demonstrate that C-terminal acylation, particularly with myristic acid, provides a class of stable, longer-acting forms of GIP for further evaluation in diabetes therapy.
Abstract:
Innovation has long been an area of interest to social scientists, and particularly to psychologists working in organisational settings. The Team Climate Inventory (TCI) is a facet-specific measure of team climate for innovation that provides a picture of the level and quality of teamwork in a unit using a series of Likert scales. This paper describes its Italian validation in 585 working group members employed in health-related and other contexts. The data were evaluated by means of factor analysis (including an analysis of the internal consistency of the scales) and Pearson's product-moment correlations. The results show the internal consistency of the scales and the satisfactory factorial structure of the inventory, despite some variations in the factorial structure mainly due to cultural differences and the specific nature of Italian organisational systems.
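Internal consistency of Likert scales of this kind is typically quantified with Cronbach's alpha alongside Pearson correlations. A minimal sketch of how such statistics can be computed from a respondents-by-items score matrix; the data and names below are illustrative, not the study's:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Illustrative data: 585 respondents, 4 Likert items scored 1-5.
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(585, 4))
alpha = cronbach_alpha(scores)                       # scale reliability
r = np.corrcoef(scores[:, 0], scores[:, 1])[0, 1]    # Pearson's r between two items
```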
Abstract:
This paper uses a meta-Malmquist index for measuring productivity change of the water industry in England and Wales and compares this to the traditional Malmquist index. The meta-Malmquist index computes productivity change with reference to a meta-frontier, it is computationally simpler and it is circular. The analysis covers all 22 UK water companies in existence in 2007, using data over the period 1993–2007. We focus on operating expenditure in line with assessments in this field, which treat operating and capital expenditure as lacking substitutability. We find important improvements in productivity between 1993 and 2005, most of which were due to frontier shifts rather than catch up to the frontier by companies. After 2005, the productivity shows a declining trend. We further use the meta-Malmquist index to compare the productivities of companies at the same and at different points in time. This shows some interesting results relating to the productivity of each company relative to that of other companies over time, and also how the performance of each company relative to itself over 1993–2007 has evolved. The paper is grounded in the broad theory of methods for measuring productivity change, and more specifically on the use of circular Malmquist indices for that purpose. In this context, the contribution of the paper is methodological and applied. From the methodology perspective, the paper demonstrates the use of circular meta-Malmquist indices in a comparative context not only across companies but also within company across time. This type of within-company assessment using Malmquist indices has not been applied extensively and to the authors’ knowledge not to the UK water industry. From the application perspective, the paper throws light on the performance of UK water companies and assesses the potential impact of regulation on their performance. In this context, it updates the relevant literature using more recent data.
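For reference, the conventional adjacent-period Malmquist index and the single-reference form that the meta-Malmquist variant builds on can be written as below. The notation uses output distance functions, with D^t measured against the period-t frontier and D^M against the fixed meta-frontier; this is the standard DEA convention, not necessarily the paper's own symbols.

```latex
M^{t,t+1} = \left[
  \frac{D^{t}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t}\!\left(x^{t}, y^{t}\right)} \cdot
  \frac{D^{t+1}\!\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\!\left(x^{t}, y^{t}\right)}
\right]^{1/2},
\qquad
M_{meta}^{t,t+1} = \frac{D^{M}\!\left(x^{t+1}, y^{t+1}\right)}{D^{M}\!\left(x^{t}, y^{t}\right)}
```

Circularity of the meta-index follows directly because every distance is measured against the same fixed meta-frontier, so M_meta^{1,3} = M_meta^{1,2} · M_meta^{2,3}; the adjacent-period geometric-mean index does not have this property.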
Abstract:
The themes of this thesis are that international trade and foreign direct investment (FDI) are closely related and that they have varying impacts on economic growth in countries at different stages of development. The thesis consists of three empirical studies. The first one examines the causal relationship between FDI and trade in China. The empirical study is based on a panel of bilateral data for China and 19 home countries/regions over the period 1984-98. The specific feature of the study is that econometric techniques designed specially for panel data are applied to test for unit roots and causality. The results indicate a virtuous procedure of development for China. The growth of China's imports causes growth in inward FDI from a home country/region, which in turn causes the growth of exports from China to the home country/region. The growth of exports causes the growth of imports. This virtuous procedure is the result of China's policy of opening to the outside world. China has been encouraging export-oriented FDI and reducing trade barriers. Such policy instruments should be further encouraged in order to enhance economic growth. In the second study, an extended gravity model is constructed to identify the main causes of recent trade growth in OECD countries. The specific features include (a) the explicit introduction of R&D and FDI as two important explanatory variables into an augmented gravity equation; (b) the adoption of a panel data approach; and (c) the careful treatment of endogeneity. The main findings are that the levels and similarities of market size, domestic R&D stock and inward FDI stock are positively related to the volume of bilateral trade, while geographical distance, exchange rate and relative factor endowments have a negative impact. These findings lend support to new trade, FDI and economic growth theories. The third study evaluates the impact of openness on growth in different country groups. This research distinguishes itself from many existing studies in three aspects: first, both trade and FDI are included in the measurement of openness. Second, countries are divided into three groups according to their development stages to compare the roles of FDI and trade in different groups. Third, the possible problems of endogeneity and multicollinearity of FDI and trade are carefully dealt with in a panel data setting. The main findings are that FDI and trade are both beneficial to a country's development. However, trade has positive effects on growth in all country groups but FDI has positive effects on growth only in the country groups which have had moderate development. The findings suggest FDI and trade may affect growth under different conditions.
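The abstract does not give the estimated specification. A generic log-linear augmented gravity equation of the kind described, with R&D and FDI stocks added to the usual size, similarity, distance, exchange rate and factor endowment terms, might take a form such as:

```latex
\ln T_{ijt} = \beta_0
  + \beta_1 \ln\!\left(GDP_{it} \cdot GDP_{jt}\right)
  + \beta_2\, SIM_{ijt}
  + \beta_3 \ln RD_{ijt}
  + \beta_4 \ln FDI_{ijt}
  + \beta_5 \ln DIST_{ij}
  + \beta_6 \ln ER_{ijt}
  + \beta_7\, RFE_{ijt}
  + u_{ijt}
```

Here T_{ijt} is bilateral trade volume, SIM a market-size similarity index, RD and FDI the R&D and inward FDI stocks, DIST geographical distance, ER the exchange rate and RFE relative factor endowments; all symbols are illustrative placeholders rather than the thesis's own variable definitions.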
Abstract:
Bedrock geochemical analysis, coupled with detailed data analysis, was carried out on some 260 samples taken from two areas of the Harlech Dome, near Dolgellau, North Wales. This was done to determine if rocks from mineralised and non-mineralised areas could be distinguished, and to determine mineralisation types and wall rock alterations. The Northern Area, near Talsarnau, has no recorded mineralisation, while the Southern Area, near Bontddu, has been exploited for gold. The rocks sampled, in both areas, were from the Cambrian Gamlan Flags, Clogau Shales, Vigra Flags, later vein materials, and igneous intrusions. All samples were analysed, using a new rapid atomic absorption spectrophotometric technique, for Si, Al, Fe, Cu, Ni, Zn, Pb, Sr, Hg, and Ba. In addition 60 samples were analysed by X-ray fluorescence for Mn, Ti, Ca, K, Na, P, Cr, Ce, La, S, Y, Rb, and Th. Total CO2 was determined, on selected samples, using a combustion technique. Elemental distributions, for each rock type, in each area, were plotted, and means, standard deviations, and enrichment indices were calculated. Multivariate statistical analysis on the results distinguished a Cu-type mineralisation in the Northern Area, and both Cu and Pb/Zn types in the Southern Area. It also showed the Northern Area to be less strongly mineralised than the Southern one, in which both mineralisation types are associated with wall rock alteration. Elemental associations and trends due to sedimentary processes were distinguished from those related to mineralisation. Hg is related to mineralisation, and plots of factor scores, on the sampling grid, produced clusters of mineralisation-related factors in areas of known mineralisation. A double Fourier trend analysis program, with a wavelength search routine, was developed and used to recognise sedimentary trends for Sr, Y, Rb, and Th. These trends were interpreted to represent areas of low pH and reducing conditions. They also indicate that the supply of sediment remained constant over Gamlan, Clogau, and Vigra times. The trend surface of Hg showed no association with rock type. It is shown that analysis of a small number of samples, for a carefully selected number of elements, with detailed data analysis, can provide more useful information than analysis of a large number of samples for many elements. The mineralisation is suggested to have been the result of water solutions leaching ore metals from the sedimentary rocks and redepositing them in veins.
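As a rough illustration of fitting a Fourier trend surface by least squares in the spirit of the thesis's trend analysis (a simplified sketch without the cross-product sine/cosine terms a full double Fourier series would include; the wavelengths, coordinates and element values are placeholders, not the thesis's data or program):

```python
import numpy as np

def fourier_trend_design(x, y, wavelengths_x, wavelengths_y):
    """Build a least-squares design matrix of sine/cosine terms in the
    x and y grid coordinates (simplified: no cross terms)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cols = [np.ones_like(x)]
    for wx in wavelengths_x:
        cols += [np.sin(2 * np.pi * x / wx), np.cos(2 * np.pi * x / wx)]
    for wy in wavelengths_y:
        cols += [np.sin(2 * np.pi * y / wy), np.cos(2 * np.pi * y / wy)]
    return np.column_stack(cols)

# Hypothetical use: fit a trend surface for one element (e.g. Sr) on grid (x, y).
# A = fourier_trend_design(x, y, wavelengths_x=[1000.0], wavelengths_y=[1000.0])
# coeffs, *_ = np.linalg.lstsq(A, sr_values, rcond=None)
# trend = A @ coeffs   # fitted surface; residuals = sr_values - trend
```

A wavelength search, as described in the abstract, would amount to repeating the fit over a grid of candidate wavelengths and keeping the pair that minimises the residual sum of squares.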
Abstract:
Differential perception of innovation is a research area which has been advocated as a suitable topic for study in recent years. It developed from the problems encountered within earlier perception-of-innovation studies, which sought to establish what characteristics of an innovation affected the ease of its adoption. While some success was achieved in relating perception of innovation to adoption behaviour, variability encountered within groups expected to perceive innovation similarly suggested that the needs and experiences of the potential adopter were significantly affecting the research findings, an interpretation supported by both sociological and psychological perceptual research. The present study sought to identify the presence of differential perception of innovation and explore the nature of the process. It was decided to base the research in an organisational context and to concentrate upon manufacturing innovation. It has been recognised that such adoption of technological innovation is commonly the product of a collective decision-making process, involving individuals from a variety of occupational backgrounds, both in terms of occupational speciality and level within the hierarchy. Such roles appeared likely to significantly influence perception of technological innovation, as gathered through an appropriate measure, and were readily identifiable. Data were collected by means of a face-to-face card presentation technique, a questionnaire and case study material. Differential perception of innovation effects were apparent in the results, with many similarities and differences of perception being related to the needs and experiences of the individuals studied. Phenomenological analysis, which recognises the total nature of experience in influencing behaviour, offered the best means of explaining the findings. It was also clear that the bureaucratic model of role definition was not applicable to the area studied, it seeming likely that such definitions are weaker under conditions of uncertainty, such as those encountered in innovative decision-making.
Abstract:
What influence do marketing departments have in companies today? Which factors determine this influence? These are the issues discussed in the present article. Empirical evidence based on data from companies in the Netherlands demonstrates that accountability, innovativeness and customer connections are the three major drivers of influence. The need for a strong marketing department within companies is also discussed, supported by empirical data.
Abstract:
Volunteered Geographic Information (VGI) represents a growing source of potentially valuable data for many applications, including land cover map validation. It is still an emerging field and many different approaches can be used to extract value from VGI, but its use also involves many pros and cons. Therefore, since it is timely to get an overview of the subject, the aim of this article is to review the use of VGI as reference data for land cover map validation. The main platforms and types of VGI that are used and that are potentially useful are analysed. Since quality is a fundamental issue in map validation, the quality procedures used by the platforms that collect VGI to increase and control data quality are reviewed, and a framework for addressing VGI quality assessment is proposed. Cases where VGI was used as an additional data source to assist in map validation are reviewed, as well as cases where only VGI was used, indicating the procedures used to assess VGI quality and fitness for use. A discussion and some conclusions are drawn on best practices, future potential and the challenges of the use of VGI for land cover map validation.
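Whatever the reference source, map validation ultimately reduces to an accuracy assessment against reference labels. A minimal sketch, not taken from the article, of cross-tabulating VGI-derived reference labels against map labels and computing overall accuracy (class names and data are illustrative):

```python
import numpy as np

def confusion_matrix(reference, mapped, classes):
    """Cross-tabulate reference labels (rows) against map labels (columns)."""
    idx = {c: i for i, c in enumerate(classes)}
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for r, p in zip(reference, mapped):
        m[idx[r], idx[p]] += 1
    return m

def overall_accuracy(m):
    """Proportion of reference points whose map label agrees."""
    return np.trace(m) / m.sum()

# Example: VGI-derived reference labels vs. land cover map labels.
classes = ["forest", "crop", "urban"]
ref = ["forest", "crop", "urban", "forest", "crop"]
mp  = ["forest", "crop", "crop",  "forest", "urban"]
acc = overall_accuracy(confusion_matrix(ref, mp, classes))   # 0.6
```

Per-class user's and producer's accuracies follow from the same matrix by normalising its rows and columns; weighting VGI points by an assessed quality score is one way the article's fitness-for-use concerns could enter such a calculation.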
Abstract:
This paper presents an effective decision-making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system were obtained from an experimental, fully operational pipeline distribution system. The system is also equipped with data logging for three variables: inlet pressure, outlet pressure, and outlet flow. The experimental setup is designed so that multiple operational conditions of the distribution system, including different pressures and flows, can be obtained. We then statistically tested and showed that the pressure and flow variables can be used as a signature of a leak under the designed multi-operational conditions. It is then shown that leak detection based on training and testing the proposed multi-model decision system, with prior clustering of the data, under multi-operational conditions produces better recognition rates than training based on a single-model approach. The decision system is further equipped with the estimation of confidence limits, and a method is proposed for using these confidence limits to obtain more robust leakage recognition results.
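The paper's exact models are not given in the abstract. A minimal sketch of the general idea, clustering leak-free operating conditions, fitting one regression model per cluster to predict outlet flow from the pressures, and flagging a leak when a new residual exceeds the cluster's confidence limit, might look like the following; the 3-sigma limit, the cluster count and the plain linear models are assumptions, not the authors' generalized linear models:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def train_multi_model(X, y, n_clusters=3):
    """X: numpy array of operating-condition features (e.g. inlet and outlet
    pressure); y: outlet flow, both from leak-free training runs."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    models, limits = [], []
    for k in range(n_clusters):
        mask = km.labels_ == k
        m = LinearRegression().fit(X[mask], y[mask])
        resid = y[mask] - m.predict(X[mask])
        models.append(m)
        limits.append(3 * resid.std(ddof=1))   # hypothetical 3-sigma confidence limit
    return km, models, limits

def is_leak(km, models, limits, x_new, flow_new):
    """Route a new observation to its operating-condition cluster and flag a
    leak if the flow residual falls outside that cluster's confidence limit."""
    k = int(km.predict(x_new.reshape(1, -1))[0])
    resid = flow_new - models[k].predict(x_new.reshape(1, -1))[0]
    return abs(resid) > limits[k]
```

The per-cluster residual spread plays the role of the confidence limits described in the abstract: separate limits per operating condition keep a high-flow regime from masking a small leak in a low-flow regime.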
Abstract:
The purpose of this study was to investigate the effect of multimedia instruction on achievement of college students in AMR 2010 (From Exploration and Discovery to 1865). A non-equivalent control group design was used. The dependent variable was achievement. The independent variables were learning styles, method of instruction, and visual clarifiers (notes). The study was conducted using two history sections from Palm Beach Community College, in Boca Raton, Florida, between August and December 1998. Data were obtained by means of placement scores, posttests, the Productivity Environmental Preference Survey (PEPS), and a researcher-developed student survey. Statistical analysis of the data was done using SPSS statistical software. Demographic variables were compared using Chi square. T tests were run on the posttests to determine the equality of variances. The posttest scores of the groups were compared using analysis of covariance (ANCOVA) at the .05 level of significance. The first hypothesis, that there is a significant difference in students' learning of U.S. History when students receive multimedia instruction, was supported, F(1, 52) = 16.88, p < .0005, and F(1, 53) = 8.52, p < .005 for Tests 2 and 3, respectively. The second hypothesis, that there is a significant difference in the effectiveness of multimedia instruction based on students' various learning preferences, was not supported. The last hypotheses, that there is a significant difference in students' learning of U.S. History when students whose first language is other than English and students who need remediation receive visual clarifiers, were not supported. Analysis of covariance (ANCOVA) indicated no difference between the groups on Test 1, Test 2, or Test 3: F(1, 45) = .01, p < .940, F(1, 52) = .77, p < .385, and F(1, 53) = .17, p < .678, respectively, for language. Analysis of covariance (ANCOVA) indicated no significant difference on Test 1, Test 2, or Test 3, between the groups on the variable remediation: F(1, 45) = .31, p < .580, F(1, 52) = 1.44, p < .236, and F(1, 53) = .21, p < .645, respectively.
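A minimal sketch of an ANCOVA of this kind, comparing posttest scores across instruction groups with the placement score as covariate, using statsmodels; the column names and the small data frame are illustrative, not the study's data:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative data frame: one row per student (names and values are assumptions).
df = pd.DataFrame({
    "posttest":  [72, 80, 65, 90, 70, 85, 60, 78],
    "placement": [50, 60, 45, 70, 52, 66, 40, 58],
    "group":     ["multimedia", "multimedia", "control", "multimedia",
                  "control", "multimedia", "control", "control"],
})

# ANCOVA: posttest ~ group, adjusting for the placement score as covariate.
model = smf.ols("posttest ~ C(group) + placement", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # F and p for the group effect
print(anova_table)
```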
Abstract:
With the advent of peer to peer networks, and more importantly sensor networks, the desire to extract useful information from continuous and unbounded streams of data has become more prominent. For example, in tele-health applications, sensor based data streaming systems are used to continuously and accurately monitor Alzheimer's patients and their surrounding environment. Typically, the requirements of such applications necessitate the cleaning and filtering of continuous, corrupted and incomplete data streams gathered wirelessly in dynamically varying conditions. Yet, existing data stream cleaning and filtering schemes are incapable of capturing the dynamics of the environment while simultaneously suppressing the losses and corruption introduced by uncertain environmental, hardware, and network conditions. Consequently, existing data cleaning and filtering paradigms are being challenged. This dissertation develops novel schemes for cleaning data streams received from a wireless sensor network operating under non-linear and dynamically varying conditions. The study establishes a paradigm for validating spatio-temporal associations among data sources to enhance data cleaning. To simplify the complexity of the validation process, the developed solution maps the requirements of the application on a geometrical space and identifies the potential sensor nodes of interest. Additionally, this dissertation models a wireless sensor network data reduction system by ascertaining that segregating data adaptation and prediction processes will augment the data reduction rates. The schemes presented in this study are evaluated using simulation and information theory concepts. The results demonstrate that dynamic conditions of the environment are better managed when validation is used for data cleaning. They also show that when a fast convergent adaptation process is deployed, data reduction rates are significantly improved. Targeted applications of the developed methodology include machine health monitoring, tele-health, environment and habitat monitoring, intermodal transportation and homeland security.
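The dissertation's schemes are not specified in the abstract. As a rough illustration of prediction-based data reduction in the spirit described, where the sensor transmits a sample only when it deviates from what the receiving side would assume, a minimal dead-band sketch (the tolerance and names are assumptions):

```python
def reduce_stream(readings, tolerance=0.5):
    """Dead-band data reduction sketch: the sink assumes the last transmitted
    value; the sensor transmits only when the actual reading drifts from that
    value by more than `tolerance`, so both ends stay consistent."""
    transmitted = []
    last_sent = None
    for t, value in enumerate(readings):
        if last_sent is None or abs(value - last_sent) > tolerance:
            transmitted.append((t, value))   # send the actual sample
            last_sent = value
    return transmitted

# Example: reduce_stream([20.0, 20.1, 20.2, 21.0, 21.1], tolerance=0.5)
# -> [(0, 20.0), (3, 21.0)]   (two of five samples transmitted)
```

A faster-converging predictor on both ends (e.g. a shared adaptive filter updated only from transmitted samples) would reduce transmissions further, which is consistent with the abstract's observation that a fast convergent adaptation process improves data reduction rates.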
Abstract:
Recent studies suggest that coastal ecosystems can bury significantly more C than tropical forests, indicating that continued coastal development and exposure to sea level rise and storms will have global biogeochemical consequences. The Florida Coastal Everglades Long Term Ecological Research (FCE LTER) site provides an excellent subtropical system for examining carbon (C) balance because of its exposure to historical changes in freshwater distribution and sea level rise and its history of significant long-term carbon-cycling studies. FCE LTER scientists used net ecosystem C balance and net ecosystem exchange data to estimate C budgets for riverine mangrove, freshwater marsh, and seagrass meadows, providing insights into the magnitude of C accumulation and lateral aquatic C transport. Rates of net C production in the riverine mangrove forest exceeded those reported for many tropical systems, including terrestrial forests, but there are considerable uncertainties around those estimates due to the high potential for gain and loss of C through aquatic fluxes. C production was approximately balanced between gain and loss in Everglades marshes; however, the contribution of periphyton increases uncertainty in these estimates. Moreover, while the approaches used for these initial estimates were informative, a resolved approach for addressing areas of uncertainty is critically needed for coastal wetland ecosystems. Once resolved, these C balance estimates, in conjunction with an understanding of drivers and key ecosystem feedbacks, can inform cross-system studies of ecosystem response to long-term changes in climate, hydrologic management, and other land use along coastlines.
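For orientation only, one common bookkeeping identity linking net ecosystem C balance (NECB) to net ecosystem exchange (NEE) and lateral aquatic transport is shown below; sign conventions differ between studies and are not specified in the abstract:

```latex
NECB \;\approx\; -NEE \;+\; F_{lateral,\,in} \;-\; F_{lateral,\,out}
```

Here NEE is defined as positive for a net CO2 flux to the atmosphere, so a negative NEE (net uptake) contributes positively to C accumulation, and the lateral terms capture aquatic imports and exports of dissolved and particulate C.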