864 results for "consistency in indexing"
Abstract:
Presents a citation analysis of indexing research in the two Proceedings. Understanding that there are different traditions of research into indexing, we look for evidence of this in the citing and cited authors. Three areas of cited and citing authors surface, after applying Price's elitism analysis, each roughly corresponding to geographic distributions.
Abstract:
Increasing public interest in science information in a digital, science 2.0 era is driving a dramatic, rapid and deep change in science itself. The emergence and expansion of new technologies and internet-based tools are leading to new ways to improve scientific methodology, communication, assessment, promotion and certification. They enable new methods of data acquisition, manipulation and storage, generating vast quantities of data that further facilitate the research process. They also improve access to scientific results through information sharing and discussion. Content previously restricted to specialists is now available to a wider audience. This context requires new management systems to make scientific knowledge more accessible and usable, including new measures to evaluate the reach of scientific information. These new measures of science and research quality are strongly tied to online technologies and services based on social media. Tools such as blogs, social bookmarks, online reference managers, Twitter and others offer alternative, transparent and more comprehensive information about the active interest, usage and reach of scientific publications. Another of these new filters is the Research Blogging (RB) platform, created in 2007, which now hosts over 1,230 active blogs with over 26,960 entries posted about peer-reviewed research on subjects ranging from Anthropology to Zoology. This study takes a closer look at RB in order to gain insight into its contribution to the rapidly changing landscape of scientific communication.
Abstract:
Hypertension is a powerful treatable risk factor for stroke. Reports of randomized controlled trials (RCTs) of antihypertensive drugs rightly concentrate on clinical outcomes, but control of blood pressure (BP) during follow-up is also important, particularly given that inconsistent control is associated with a high risk of stroke and that antihypertensive drug classes differ in this regard.
Abstract:
Conservation strategies for long-lived vertebrates require accurate estimates of parameters relative to the populations' size, numbers of non-breeding individuals (the "cryptic" fraction of the population) and the age structure. Frequently, visual survey techniques are used to make these estimates, but the accuracy of these approaches is questionable, mainly because of the existence of numerous potential biases. Here we compare data on population trends and age structure in a bearded vulture (Gypaetus barbatus) population from visual surveys performed at supplementary feeding stations with data derived from population matrix-modelling approximations. Our results suggest that visual surveys overestimate the number of immature (<2 years old) birds, whereas subadults (3–5 years old) and adults (>6 years old) were underestimated in comparison with the predictions of a population model using a stable-age distribution. In addition, we found that visual surveys did not provide conclusive information on true variations in the size of the focal population. Our results suggest that although long-term studies (i.e. population matrix modelling based on capture-recapture procedures) are a more time-consuming method, they provide more reliable and robust estimates of population parameters needed in designing and applying conservation strategies. The findings shown here are likely transferable to the management and conservation of other long-lived vertebrate populations that share similar life-history traits and ecological requirements.
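The stable-age-distribution prediction this abstract compares against can be sketched with a stage-structured projection (Leslie/Lefkovitch) matrix: the stable stage distribution is the dominant right eigenvector of the matrix. The vital rates below are purely illustrative, not values from the bearded vulture study.

```python
import numpy as np

# Hypothetical projection matrix for a long-lived raptor with three
# stage classes (immature, subadult, adult). Fecundity and survival
# rates are illustrative placeholders, not data from the study.
L = np.array([
    [0.00, 0.00, 0.35],   # fecundity: only adults breed
    [0.75, 0.00, 0.00],   # immature -> subadult survival
    [0.00, 0.85, 0.95],   # subadult -> adult transition and adult survival
])

# The stable stage distribution is the right eigenvector associated
# with the dominant eigenvalue (the asymptotic growth rate lambda).
eigvals, eigvecs = np.linalg.eig(L)
dominant = np.argmax(eigvals.real)
lam = eigvals[dominant].real
stable = np.abs(eigvecs[:, dominant].real)
stable /= stable.sum()          # normalize to proportions

print("growth rate lambda:", round(lam, 3))
print("stable stage distribution:", np.round(stable, 3))
```

With these placeholder rates the adult class dominates the stable distribution, which is the kind of model prediction the authors contrast with raw counts at feeding stations.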
Abstract:
Family preservation has been criticized for implementing programs that are not theoretically founded. One result of this circumstance is a lack of information regarding processes and outcomes of family preservation services. The knowledge base of family preservation is thus rather limited at present and will remain limited unless theory is consistently integrated within individual programs. A model for conceptualizing how theoretical consistency may be implemented within programs is presented and applied to family preservation. It is also necessary for programs to establish theoretical consistency before theoretical diversity, both within individual and across multiple programs, in order to advance the field in meaningful ways. A developmental cycle of knowledge generation is presented and applied to family preservation.
Abstract:
This article examines the determinants of positional incongruence between pre-election statements and post-election behaviour in the Swiss parliament between 2003 and 2009. The question is examined at the individual MP level, which is appropriate for dispersion-of-powers systems like Switzerland. While the overall rate of political congruence reaches about 85%, a multilevel logit analysis detects the underlying factors which push or curb a candidate's propensity to change his or her mind once elected. The results show that positional changes are more likely when (1) MPs are freshmen, (2) individual voting behaviour is invisible to the public, (3) the electoral district magnitude is not small, (4) the vote is not about a party's core issue, (5) the MP belongs to a party which is located in the political centre, and (6) if the pre-election statement dissents from the majority position of the legislative party group. Of these factors, the last one is paramount.
Abstract:
Studying individual differences in conscious awareness can potentially lend fundamental insights into the neural bases of binding mechanisms and consciousness (Cohen Kadosh and Henik, 2007). Partly for this reason, considerable attention has been devoted to the neural mechanisms underlying grapheme–color synesthesia, a healthy condition involving atypical brain activation and the concurrent experience of color photisms in response to letters, numbers, and words. For instance, the letter C printed in black on a white background may elicit a yellow color photism that is perceived to be spatially colocalized with the inducing stimulus or internally in the “mind's eye” as, for instance, a visual image. Synesthetic experiences are involuntary, idiosyncratic, and consistent over time (Rouw et al., 2011). To date, neuroimaging research on synesthesia has focused on brain areas activated during the experience of synesthesia and associated structural brain differences. However, activity patterns of the synesthetic brain at rest remain largely unexplored. Moreover, the neural correlates of synesthetic consistency, the hallmark characteristic of synesthesia, remain elusive.
Abstract:
The theoretical formulation of the smoothed particle hydrodynamics (SPH) method deserves great care because of some inconsistencies occurring when considering free-surface inviscid flows. Actually, in SPH formulations one usually assumes that (i) surface integral terms on the boundary of the interpolation kernel support are neglected, (ii) free-surface conditions are implicitly verified. These assumptions are studied in detail in the present work for free-surface Newtonian viscous flow. The consistency of classical viscous weakly compressible SPH formulations is investigated. In particular, the principle of virtual work is used to study the verification of the free-surface boundary conditions in a weak sense. The latter can be related to the global energy dissipation induced by the viscous term formulations and their consistency. Numerical verification of this theoretical analysis is provided on three free-surface test cases including a standing wave, with the three viscous term formulations investigated.
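The kernel-support issue this abstract refers to, assumption (i), can be illustrated with a minimal 1D SPH density estimate: in the bulk the kernel support is full and the estimate is accurate, while at a free surface part of the support is empty and the neglected surface terms show up as a density deficit. All values below are illustrative.

```python
import numpy as np

def cubic_spline_1d(q):
    """Standard 1D cubic-spline SPH kernel shape (normalization 2/(3h)
    applied by the caller); support is |q| < 2."""
    w = np.zeros_like(q)
    m1 = q < 1.0
    m2 = (q >= 1.0) & (q < 2.0)
    w[m1] = 1.0 - 1.5 * q[m1] ** 2 + 0.75 * q[m1] ** 3
    w[m2] = 0.25 * (2.0 - q[m2]) ** 3
    return w

n = 200                               # particles filling [0, 1]
dx = 1.0 / n
h = 2.0 * dx                          # smoothing length (illustrative)
x = (np.arange(n) + 0.5) * dx
mass = 1.0 * dx                       # unit-density fluid

def density(xi):
    """SPH density estimate rho(xi) = sum_j m_j W(|xi - x_j|, h)."""
    q = np.abs(xi - x) / h
    return np.sum(mass * cubic_spline_1d(q)) * (2.0 / (3.0 * h))

rho_bulk = density(x[n // 2])         # full kernel support
rho_surf = density(x[0])              # particle at the free surface
print("bulk density:", round(rho_bulk, 3))
print("surface density:", round(rho_surf, 3))
```

The bulk estimate recovers the unit density, while the surface particle's estimate is markedly low; the paper's point is that a consistent formulation must account for what happens to such truncated-support terms at the free surface.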
Abstract:
In this paper, the authors introduce a novel mechanism for data management in a middleware for smart home control, in which a relational database and semantic ontology storage are used at the same time in a data warehouse. An annotation system has been designed for specifying the storage format and location, registering new ontology concepts and, most importantly, guaranteeing data consistency between the two storage methods. To ease the data persistence process, the Data Access Object (DAO) pattern is applied and optimized to strengthen the data consistency assurance. This mechanism also simplifies the development of applications and their integration with BATMP. Finally, an application named "Parameter Monitoring Service" is presented as an example to assess the feasibility of the system.
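The dual-storage DAO idea described here can be sketched as a single data-access object that hides both backends and keeps them consistent by writing to both or to neither. This is a minimal sketch of the general pattern, not the paper's implementation; all class and method names are hypothetical.

```python
class RelationalStore:
    """Stand-in for the relational database backend."""
    def __init__(self):
        self.rows = {}
    def save(self, key, value):
        self.rows[key] = value
    def delete(self, key):
        self.rows.pop(key, None)

class OntologyStore:
    """Stand-in for the semantic ontology storage backend."""
    def __init__(self):
        self.triples = {}
    def save(self, key, value):
        self.triples[key] = value

class DeviceDAO:
    """Write-through DAO: persist to both stores, or roll back so the
    two storage methods never diverge."""
    def __init__(self, relational, ontology):
        self.relational = relational
        self.ontology = ontology
    def save(self, key, value):
        self.relational.save(key, value)
        try:
            self.ontology.save(key, value)
        except Exception:
            # undo the first write to keep the stores consistent
            self.relational.delete(key)
            raise

rel, ont = RelationalStore(), OntologyStore()
dao = DeviceDAO(rel, ont)
dao.save("thermostat.setpoint", 21.5)
print(rel.rows["thermostat.setpoint"], ont.triples["thermostat.setpoint"])
```

Applications talk only to the DAO, which is what lets the paper centralize (and optimize) the consistency guarantee in one place.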
Abstract:
Classical Guitar Music in Printed Collections is a new, open-access, online index to the contents of published score collections for classical guitar. Its interlinked, alphabetized lists allow one to find a composition by title or composer, to discover what score collections include that piece, to see what other works are included in each collection identified, and to locate a copy in a library collection. Accuracy of identification is guaranteed by incipit images of each work. The article discusses how this index differs from existing bibliographies of the classical guitar literature, its structure and design, and technical details of its publication.
Abstract:
The thermodynamic consistency of almost 90 VLE data series, including isothermal and isobaric conditions for systems of both total and partial miscibility in the liquid phase, has been examined by means of the area and point-to-point tests. In addition, the Gibbs energy of mixing function calculated from these experimental data has been inspected, with some rather surprising results: certain data sets exhibiting high dispersion, or leading to Gibbs energy of mixing curves inconsistent with the total or partial miscibility of the liquid phase, nevertheless pass the tests. Several possible inconsistencies in the tests themselves, or in their application, are discussed. Related to this is a very interesting and ambitious initiative that arose within NIST: the development of an algorithm to assess the quality of experimental VLE data. The present paper questions the applicability of two of the five tests combined in that algorithm. It further shows that the deviation of experimental VLE data from the correlation obtained with a given model, the basis of some point-to-point tests, should not be used to evaluate the quality of those data.
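For reference, the area (Redlich–Kister) test the abstract mentions checks that, for a consistent binary isothermal data set, the integral of ln(γ1/γ2) over x1 from 0 to 1 vanishes. The sketch below applies it to activity coefficients generated from a one-parameter Margules model, which satisfies the Gibbs–Duhem equation by construction; the parameter value is illustrative.

```python
import numpy as np

A = 1.2                                  # Margules parameter (illustrative)
x1 = np.linspace(1e-6, 1.0 - 1e-6, 501)  # liquid mole fraction grid
ln_g1 = A * (1.0 - x1) ** 2              # ln(gamma1), one-parameter Margules
ln_g2 = A * x1 ** 2                      # ln(gamma2)

# Area test: trapezoidal integration of ln(gamma1/gamma2) over x1.
d = ln_g1 - ln_g2
dx = np.diff(x1)
signed_area = np.sum((d[:-1] + d[1:]) / 2.0 * dx)      # should be ~0
total_area = np.sum((np.abs(d)[:-1] + np.abs(d)[1:]) / 2.0 * dx)
index = abs(signed_area) / total_area    # normalized consistency index

print("signed area:", round(signed_area, 6))
print("consistency index:", round(index, 4))
```

A near-zero index means the data pass the area test; the paper's caution is that passing such a test does not by itself guarantee that the underlying Gibbs energy of mixing curves are physically sensible.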
Abstract:
"UILU-ENG 83-1724."--Cover.