942 results for Citation classification schemes
Abstract:
Describes four waves of Ranganathan’s dynamic theory of classification. Outlines components that distinguish each wave, and proposes ways in which this understanding can inform systems design in the contemporary environment, particularly with regard to interoperability and scheme versioning. Ends with an appeal for a better understanding of the relationship between structure and semantics in faceted classification schemes and similar indexing languages.
Abstract:
Classification schemes undergo revision. However, in a networked environment revisions can be used to add dimensionality to classification. This dimensionality can be used to help explain conceptual warrant, to explain the shift from disciplinary to multidisciplinary knowledge production, and to serve as a component method of domain analysis. Further, subject ontogeny might be used in cooperative networked projects like digital preservation, online access tools, and interoperability frameworks.
Abstract:
As the universe of knowledge and subjects changes over time, indexing languages like classification schemes accommodate that change by restructuring. Restructuring indexing languages affects indexer and cataloguer work. Subjects may split or be lumped together. They may disappear only to reappear later. And new subjects may emerge that were assumed to be already present, but not clearly articulated (Miksa, 1998). In this context we have the complex relationship between the indexing language, the text being described, and the already described collection (Tennis, 2007). It is possible to imagine indexers placing a document into an outdated class because it is the one they have already used for their collection. However, doing this erases the semantics of the present indexing language. Given this range of choice in the context of indexing language change, the question arises: what does this look like in practice? How often does this occur? Further, what does this phenomenon tell us about subjects in indexing languages? Does the practice we observe in reaction to indexing language change provide us with evidence of conceptual models of subjects and subject creation? If it is incomplete but gets us close, what evidence do we still require?
Abstract:
Both basic and applied research on the construction, implementation, maintenance, and evaluation of classification schemes is called classification theory. If we employ Ritzer’s metatheoretical method of analysis on the over one-hundred-year-old body of literature, we can see categories of theory emerge. This paper looks at one particular part of knowledge organization work, namely classification theory, and asks 1) what are the contours of this intellectual space, and 2) what have we produced in the theoretical reflection on constructing, implementing, and evaluating classification schemes? The preliminary findings from this work are that classification theory can be separated into three kinds: foundational classification theory, first-order classification theory, and second-order classification theory, each with its own concerns and objects of study.
Abstract:
This paper deals with the potential and limitations of using voice and speech processing to detect Obstructive Sleep Apnea (OSA). An extensive body of voice features has been extracted from patients who present various degrees of OSA as well as healthy controls. We analyse the utility of a reduced set of features for detecting OSA. We apply various feature selection and reduction schemes (statistical ranking, Genetic Algorithms, PCA, LDA) and compare various classifiers (Bayesian classifiers, kNN, Support Vector Machines, neural networks, Adaboost). S-fold cross-validation performed on 248 subjects shows that in the extreme cases (that is, 127 controls and 121 patients with severe OSA) voice alone is able to discriminate quite well between the presence and absence of OSA. However, this is not the case with mild OSA and healthy snoring patients, where voice seems to play a secondary role. We found that the best classification schemes are achieved using a Genetic Algorithm for feature selection/reduction.
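The evaluation setup described above can be sketched in miniature. The code below is an illustrative stand-in, not the authors' pipeline: it runs 5-fold cross-validation of a plain 1-nearest-neighbour classifier (the simplest of the classifier families they compare) on synthetic two-dimensional "features", with the real voice features of the 248 subjects replaced by two labelled Gaussian clusters.

```python
# Minimal sketch (assumed, not the paper's code): S-fold cross-validation
# of a 1-NN classifier on synthetic data standing in for voice features.
import random

def one_nn_predict(train, query):
    """Return the label of the training point closest to `query`."""
    nearest = min(train, key=lambda p: sum((a - b) ** 2
                                           for a, b in zip(p[0], query)))
    return nearest[1]

def s_fold_accuracy(data, s=5, seed=0):
    """Average held-out accuracy of 1-NN over s folds."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    folds = [shuffled[i::s] for i in range(s)]
    accuracies = []
    for i, fold in enumerate(folds):
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        hits = sum(one_nn_predict(train, x) == y for x, y in fold)
        accuracies.append(hits / len(fold))
    return sum(accuracies) / s

rnd = random.Random(1)
# Two well-separated clusters standing in for controls (label 0)
# and severe-OSA patients (label 1).
data = ([((rnd.gauss(0, 1), rnd.gauss(0, 1)), 0) for _ in range(60)]
        + [((rnd.gauss(4, 1), rnd.gauss(4, 1)), 1) for _ in range(60)])
acc = s_fold_accuracy(data, s=5)
print(acc)
```

On well-separated clusters, as in the paper's extreme controls-versus-severe-OSA case, even this minimal classifier discriminates well; the harder mild-OSA case corresponds to overlapping clusters, where accuracy drops.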
Abstract:
Hydrodynamic characteristics of an estuary resulting from the interaction of tide and river runoff are important, since problems regarding floods, salinity intrusion, water quality, ecosystems and sedimentation are ubiquitous. The present study focuses on such hydrodynamic aspects in the Cochin estuary. Most of the estuaries that come under the influence of the Indian Summer Monsoon, and for which the salinity is never in a steady state at any time of the year, are generally shallow and convergent, i.e. the width decreases rapidly from mouth to head. In contrast, the Cochin estuary is wider towards the upstream and has no typical river mouth; the rivers join the estuary along the length of its channel. Adding to the complexity, it has dual inlets and a tidal range of 1 m, which is lower than that of other Indian estuaries along the west coast. These typical physical features lead to its unique hydrodynamic characteristics. The thesis objectives are therefore: i) to study the influence of river runoff on tidal propagation using observations and a numerical model; ii) to study stratification and property distributions in the Cochin estuary; iii) to understand salinity distributions and flushing characteristics; iv) to understand the influence of the saltwater barrage on tides and salinity; and v) to evaluate several classification schemes for the estuary.
Abstract:
The R-package “compositions” is a tool for advanced compositional analysis. Its basic functionality has seen some conceptual improvement, containing now some facilities to work with and represent ilr bases built from balances, and an elaborated subsystem for dealing with several kinds of irregular data: (rounded or structural) zeroes, incomplete observations and outliers. The general approach to these irregularities is based on subcompositions: for an irregular datum, one can distinguish a “regular” subcomposition (where all parts are actually observed and the datum behaves typically) and a “problematic” subcomposition (with those unobserved, zero or rounded parts, or else where the datum shows an erratic or atypical behaviour). Systematic classification schemes are proposed for both outliers and missing values (including zeros), focusing on the nature of irregularities in the datum subcomposition(s). To compute statistics with values missing at random and structural zeros, a projection approach is implemented: a given datum contributes to the estimation of the desired parameters only on the subcomposition where it was observed. For data sets with values below the detection limit, two different approaches are provided: the well-known imputation technique, and also the projection approach. To compute statistics in the presence of outliers, robust statistics are adapted to the characteristics of compositional data, based on the minimum covariance determinant approach. The outlier classification is based on four different models of outlier occurrence and Monte-Carlo-based tests for their characterization. Furthermore, the package provides special plots helping to understand the nature of outliers in the dataset. Keywords: coda-dendrogram, lost values, MAR, missing data, MCD estimator, robustness, rounded zeros
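The projection idea can be illustrated outside R: each datum contributes to an estimate only on the subcomposition where it was observed. The sketch below is an assumption-laden Python miniature, not part of the “compositions” package, and it deliberately ignores the log-ratio geometry that real compositional statistics would use; it shows only the available-case bookkeeping.

```python
# Illustrative sketch (not the R package's algorithm): per-part estimates
# computed only over the entries that were actually observed, so a datum
# with a missing part still contributes on its observed subcomposition.
def available_case_means(rows):
    """Column means using only non-None (observed) entries per part."""
    n_parts = len(rows[0])
    sums = [0.0] * n_parts
    counts = [0] * n_parts
    for row in rows:
        for j, value in enumerate(row):
            if value is not None:
                sums[j] += value
                counts[j] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

compositions = [
    [0.5, 0.3, 0.2],
    [0.6, None, 0.4],   # second part missing at random
    [0.4, 0.4, 0.2],
]
means = available_case_means(compositions)
print(means)  # per-part means over observed entries only
```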
Abstract:
Large-scale bottom-up estimates of terrestrial carbon fluxes, whether based on models or inventory, are highly dependent on the assumed land cover. Most current land cover and land cover change maps are based on satellite data and are likely to be so for the foreseeable future. However, these maps show large differences, both at the class level and when transformed into Plant Functional Types (PFTs), and these can lead to large differences in terrestrial CO2 fluxes estimated by Dynamic Vegetation Models. In this study the Sheffield Dynamic Global Vegetation Model is used. We compare PFT maps and the resulting fluxes arising from the use of widely available moderate (1 km) resolution satellite-derived land cover maps (the Global Land Cover 2000 and several MODIS classification schemes), with fluxes calculated using a reference high (25 m) resolution land cover map specific to Great Britain (the Land Cover Map 2000). We demonstrate that uncertainty is introduced into carbon flux calculations by (1) incorrect or uncertain assignment of land cover classes to PFTs; (2) information loss at coarser resolutions; (3) difficulty in discriminating some vegetation types from satellite data. When averaged over Great Britain, modeled CO2 fluxes derived using the different 1 km resolution maps differ from estimates made using the reference map. The ranges of these differences are 254 gC m−2 a−1 in Gross Primary Production (GPP); 133 gC m−2 a−1 in Net Primary Production (NPP); and 43 gC m−2 a−1 in Net Ecosystem Production (NEP). In GPP this accounts for differences of −15.8% to 8.8%. Results for living biomass exhibit a range of 1109 gC m−2. The types of uncertainties due to land cover confusion are likely to be representative of many parts of the world, especially heterogeneous landscapes such as those found in western Europe.
Abstract:
Purpose – The creation of a target market strategy is integral to developing an effective business strategy. The concept of market segmentation is often cited as pivotal to establishing a target market strategy, yet all too often business-to-business marketers utilise little more than trade sectors or product groups as the basis for their groupings of customers, rather than customers' characteristics and buying behaviour. The purpose of this paper is to offer a solution for managers, focusing on customer purchasing behaviour, which evolves from the organisation's existing criteria used for grouping its customers. Design/methodology/approach – One of the underlying reasons managers fail to embrace best practice market segmentation is their inability to manage the transition from how target markets in an organisation are currently described to how they might look when based on customer characteristics, needs, purchasing behaviour and decision-making. Any attempt to develop market segments should reflect the inability of organisations to ignore their existing customer group classification schemes and associated customer-facing operational practices, such as distribution channels and sales force allocations. Findings – A straightforward process has been derived and applied, enabling organisations to practice market segmentation in an evolutionary manner, facilitating the transition to customer-led target market segments. This process also ensures commitment from the managers responsible for implementing the eventual segmentation scheme. This paper outlines the six stages of this process and presents an illustrative example from the agrichemicals sector, supported by other cases. 
Research implications – The process presented in this paper for embarking on market segmentation focuses on customer purchasing behaviour rather than business sectors or product group classifications - which is true to the concept of market segmentation - but in a manner that participating managers find non-threatening. The resulting market segments have their basis in the organisation's existing customer classification schemes and are an iteration to which most managers readily buy in. Originality/value – Despite the size of the market segmentation literature, very few papers offer step-by-step guidance for developing customer-focused market segments in business-to-business marketing. The analytical tool for assessing customer purchasing deployed in this paper was originally created to assist in marketing planning programmes, but has since proved its worth as the foundation for creating segmentation schemes in business marketing, as described in this paper.
Abstract:
A numerous population of weak line galaxies (WLGs) is often left out of statistical studies on emission-line galaxies (ELGs) due to the absence of an adequate classification scheme, since classical diagnostic diagrams, such as [O iii]/H beta versus [N ii]/H alpha (the BPT diagram), require the measurement of at least four emission lines. This paper aims to remedy this situation by transposing the usual divisory lines between star-forming (SF) galaxies and active galactic nuclei (AGN) hosts and between Seyferts and LINERs to diagrams that are more economical in terms of line quality requirements. By doing this, we rescue from the classification limbo a substantial number of sources and modify the global census of ELGs. More specifically, (1) we use the Sloan Digital Sky Survey Data Release 7 to constitute a suitable sample of 280 000 ELGs, one-third of which are WLGs. (2) Galaxies with strong emission lines are classified using the widely applied criteria of Kewley et al., Kauffmann et al. and Stasinska et al. to distinguish SF galaxies and AGN hosts and Kewley et al. to distinguish Seyferts from LINERs. (3) We transpose these classification schemes to alternative diagrams keeping [N ii]/H alpha as a horizontal axis, but replacing H beta by a stronger line (H alpha or [O ii]), or substituting the ionization-level sensitive [O iii]/H beta ratio with the equivalent width of H alpha (W(H alpha)). Optimized equations for the transposed divisory lines are provided. (4) We show that nothing significant is lost in the translation, but that the new diagrams allow one to classify up to 50 per cent more ELGs. (5) Introducing WLGs in the census of galaxies in the local Universe increases the proportion of metal-rich SF galaxies and especially LINERs. In the course of this analysis, we were led to make the following points. (i) The Kewley et al. BPT line for galaxy classification is generally ill-used. 
(ii) Replacing [O iii]/H beta by W(H alpha) in the classification introduces a change in the philosophy of the distinction between LINERs and Seyferts, but not in its results. Because the W(H alpha) versus [N ii]/H alpha diagram can be applied to the largest sample of ELGs without loss of discriminating power between Seyferts and LINERs, we recommend its use in further studies. (iii) The dichotomy between Seyferts and LINERs is washed out by WLGs in the BPT plane, but it subsists in other diagnostic diagrams. This suggests that the right wing in the BPT diagram is indeed populated by at least two classes, tentatively identified with bona fide AGN and 'retired' galaxies that have stopped forming stars and are ionized by their old stellar populations.
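For reference, the strong-line classification that the paper transposes can be sketched directly. The coefficients below are the published Kauffmann et al. (2003) empirical line and Kewley et al. (2001) "maximum starburst" line in the [N ii]/H alpha versus [O iii]/H beta (BPT) plane; the transposed diagrams proposed in the abstract are not reproduced here, and the three-way labelling is the conventional reading of the two curves rather than this paper's own scheme.

```python
# BPT-plane classification sketch using the standard divisory curves.
# x = log10([NII]/Halpha), y = log10([OIII]/Hbeta).

def kauffmann03(x):
    """Kauffmann et al. (2003) SF/AGN line, valid for x < 0.05."""
    return 0.61 / (x - 0.05) + 1.3

def kewley01(x):
    """Kewley et al. (2001) maximum-starburst line, valid for x < 0.47."""
    return 0.61 / (x - 0.47) + 1.19

def bpt_class(x, y):
    """Conventional three-way reading: SF / composite / AGN."""
    if x < 0.05 and y < kauffmann03(x):
        return "SF"
    if x < 0.47 and y < kewley01(x):
        return "composite"
    return "AGN"

print(bpt_class(-0.8, -0.3))  # -> SF
print(bpt_class(0.1, 0.8))    # -> AGN
```

Points below the Kauffmann curve are star-forming, points above the Kewley curve are AGN-dominated, and the region between the two curves is the usual "composite" zone.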
Abstract:
Lymphoma is among the most frequent canine neoplasias and shares many similarities with human non-Hodgkin's lymphoma with respect to etiology, epidemiology, and clinical, morphological and immunophenotypical aspects. Human classification schemes have been used for canine lymphoma. The aim of this work was to apply the Kiel, Working Formulation and Fournel-Fleury et al. (1994) classifications to Fine Needle Aspiration (FNA) cytology material. According to the Kiel scheme, 61.02% (36 cases) were high-grade lymphomas and 38.98% (23 cases) low grade. The Working Formulation showed 11.86% (7 cases) of low grade, 61.02% (36 cases) intermediate grade and 27.12% (16 cases) high grade. Fournel-Fleury's protocol revealed a predominance of high-grade lymphoma, with 61.02% (36 cases) over 38.98% (23 cases) of low grade. In conclusion, FNA can be used as a diagnostic method and in the cytological classification of canine lymphoma. The Kiel system showed the best results, since it is built on cytological criteria.
Abstract:
An exploratory analysis of the concept of "cultural warrant" is carried out, with the aim of characterizing future areas of research around it. First, the generic notion of 'warrant' is analysed as it has been considered in Knowledge Organization. Various types of warrant proposed to legitimize the inclusion of terminology in knowledge organization systems are reviewed. A critical analysis is made of the concept of "culture" and the way in which different anthropological, sociological and political conceptions converge in its epistemological construction. The treatment of cultural warrant in the literature of the field is reviewed and problematized. Its contribution to the construction of cultural identities is assessed, through elements that differentiate the interpretation and lived experience of reality. In particular, the relationship between cultural warrant and local culture is developed. The introduction of an ethical factor through cultural warrant is weighed, both in the development of classification schemes and in classification and indexing processes. Among other conclusions, the need is established to explore in greater depth the methodological alternatives that can be grounded in this integrative and democratizing conception within the field of Knowledge Organization.
Abstract:
The grain-size distribution of 223 unconsolidated sediment samples from four DSDP sites at the mouth of the Gulf of California was determined using sieve and pipette techniques. Shepard's (1954) and Inman's (1952) classification schemes were used for all samples. Most of the sediments are hemipelagic with minor turbidites of terrigenous origin. Sediment texture ranges from silty sand to silty clay. On the basis of grain-size parameters, the sediments can be divided into the following groups: (1) poorly to very poorly sorted coarse and medium sand; and (2) poorly to very poorly sorted fine to very fine sand and clay.
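The Shepard (1954) sand-silt-clay scheme named above can be sketched as a simple ternary classifier. The thresholds below (a 75% end-member cut-off and a 20% floor for the central mixed class) are a simplified reading of the ternary diagram, not an exact reproduction of Shepard's boundaries, and the function names are illustrative.

```python
# Simplified sketch (assumed thresholds) of Shepard-style ternary
# classification from sand, silt and clay mass fractions.
ADJECTIVE = {"sand": "sandy", "silt": "silty", "clay": "clayey"}

def shepard_class(sand, silt, clay):
    """Name a sediment from its sand/silt/clay fractions (sum to 1)."""
    parts = {"sand": sand, "silt": silt, "clay": clay}
    assert abs(sum(parts.values()) - 1.0) < 1e-6, "fractions must sum to 1"
    ranked = sorted(parts, key=parts.get, reverse=True)
    if parts[ranked[0]] >= 0.75:
        return ranked[0]               # end-member: >=75% of one component
    if min(parts.values()) >= 0.20:
        return "sand-silt-clay"        # central mixed class
    # Otherwise the two largest fractions name the sediment.
    return f"{ADJECTIVE[ranked[1]]} {ranked[0]}"

print(shepard_class(0.55, 0.35, 0.10))  # -> silty sand
print(shepard_class(0.10, 0.30, 0.60))  # -> silty clay
```

The two example calls reproduce the end-points of the texture range reported in the abstract, silty sand and silty clay.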