955 results for "Graphic consistency"
Abstract:
CELADE prepares national population projections for the 20 countries of the region, in most cases in collaboration with government offices, which gives those projections an official character. The Centre also advises countries on the preparation of specific projections and provides technical assistance; it has recently designed a software package for the preliminary preparation and analysis of population projections. The most frequent problems arising from the joint work with the countries concern the turnover of administrative and professional staff, the shortage of qualified personnel, and the formulation of targets that are not consistent with the country's demographic reality.
Abstract:
Image-capture techniques have advanced along with information and communication technologies, and an unprecedented volume of information and imagery is now stored in digital environments. The objective of this study is to point out the difficulties found in constructing image-based representations of digital resources using the instruments available for the treatment of descriptive information. As a result, we present a mapping of descriptive elements for digital images, derived from an analysis of the schemes that guide the construction of descriptive records (AACR2R, ISBD, Graphic Materials, RDA, CDWA, CCO) and of the FRBRer conceptual model. This analysis led to the conceptual model Functional Requirements for Digital Imagetic Data (RFDID), intended to support more efficient ways of representing imagery so as to make it available, accessible and retrievable, with persistence of descriptive data, flexibility, consistency and integrity as essential requirements for the representation of the digital image.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
From a theoretical enunciative-discursive perspective, this article discusses the process of text consistency in a digital context, in particular by problematizing the semiotic modes and resources deployed in the academic production of students who use internet-connected computers in a blended (semi-presential) Distance Learning (DL) process. We are interested in investigating how the academic literacies model may be linked to the study of multimodality, considering that, in principle, in an electronic environment the student has "unlimited" access to every kind of text, not just the "verbal" one (the name generally given to the graphic component). The collected material consists of texts produced in 2010 by students of the semi-presential Pedagogy Course of the Virtual University of São Paulo (UNIVESP).
Abstract:
A comparative study is presented of the primary properties of six cocoa butter samples, representative of industrial blends and of cocoa butter extracted from fruits cultivated in different geographical areas of Brazil. The samples were evaluated according to fatty acid composition, triacylglycerol composition, regiospecific distribution, melting point, solid fat content and consistency. The results allowed the samples to be differentiated according to their chemical compositions, thermal resistance properties and hardness characteristics, as well as their technological adequacy and potential use in regions with tropical climates.
Abstract:
Since there was no Portuguese-language questionnaire to evaluate cutaneous allodynia, which has been pointed out as a risk factor for migraine, we aimed to perform a cross-cultural adaptation of the 12-item Allodynia Symptom Checklist for the Brazilian population and to test its measurement properties. The process consisted of six stages: translation, synthesis, back-translation, revision by a specialist committee, pretest, and submission of the documents to the committee. In the pretest stage the questionnaire was applied to 30 migraineurs of both sexes, who reported some difficulty in understanding it; a second version was therefore applied to 30 additional subjects, with no difficulties being reported. The mean completion time was 3 min 36 s, and the internal consistency was 0.76. To test reproducibility, 15 further subjects filled out the questionnaire on two occasions; reproducibility was classified as moderate (weighted kappa = 0.58). We have thus made available to the Brazilian population an easy, quick and reliable questionnaire.
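As an illustration of the two reliability statistics reported above, the sketch below shows how an internal-consistency coefficient and a test-retest weighted kappa could be computed from item-level data in Python. It assumes that the internal consistency reported is Cronbach's alpha and uses hypothetical response matrices; it is not the authors' analysis code.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    def cronbach_alpha(items):
        """items: (n_subjects, n_items) matrix of item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical data: 30 subjects x 12 items scored 0-3 (ASC-12 style).
    rng = np.random.default_rng(0)
    scores = rng.integers(0, 4, size=(30, 12))
    print("Cronbach's alpha:", round(cronbach_alpha(scores), 2))

    # Test-retest agreement of a categorised total score on two occasions,
    # using a linearly weighted kappa (hypothetical categories).
    test = rng.integers(0, 3, size=15)
    retest = np.clip(test + rng.integers(-1, 2, size=15), 0, 2)
    print("Weighted kappa:", round(cohen_kappa_score(test, retest, weights="linear"), 2))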
Abstract:
We address the spherical accretion of generic fluids onto black holes. We show that, if the black hole metric satisfies certain conditions, then in the presence of a test fluid it is possible to derive a fully relativistic prescription for the variation of the black hole mass. Although the resulting equation may seem obvious, since a form of it appears as a step in the derivation of the Schwarzschild metric, this geometrical argument is necessary to fix the additional degree of freedom introduced by allowing the mass to vary with time. The result has applications to cosmological accretion models and provides a first-principles derivation to serve as a basis for the accretion equations already in use in the literature.
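For context, a commonly used quasi-stationary prescription in the spherical-accretion literature relates the mass growth to the radial energy flux of the fluid through a sphere. The form below is quoted as an assumption about the standard convention (Schwarzschild-like coordinates, perfect test fluid with stress-energy tensor T^{\mu}{}_{\nu}) and is not necessarily the exact equation derived in the paper:

    \frac{dM}{dt} = -4\pi r^{2}\, T^{r}{}_{t} = -4\pi r^{2}\,(\rho + p)\, u^{r} u_{t},

where \rho and p are the fluid's energy density and pressure and u^{\mu} its four-velocity.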
Abstract:
In this paper we have quantified the consistency of word usage in written texts represented by complex networks, where words were taken as nodes, by measuring the degree of preservation of the node neighborhood. Words were considered highly consistent if the authors used them with the same neighborhood. When ranked according to the consistency of use, the words obeyed a log-normal distribution, in contrast to Zipf's law that applies to the frequency of use. Consistency correlated positively with the familiarity and frequency of use, and negatively with ambiguity and age of acquisition. An inspection of some highly consistent words confirmed that they are used in very limited semantic contexts. A comparison of consistency indices for eight authors indicated that these indices may be employed for author recognition. Indeed, as expected, authors of novels could be distinguished from those who wrote scientific texts. Our analysis demonstrated the suitability of the consistency indices, which can now be applied in other tasks, such as emotion recognition.
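The neighborhood-preservation idea can be illustrated with a short sketch: build a co-occurrence network for each text by the same author and score a word by how much its neighborhood is preserved across texts, here via Jaccard overlap. The windowing choice and the Jaccard measure are illustrative assumptions, not necessarily the paper's exact consistency index.

    from collections import defaultdict

    def cooccurrence_neighbors(tokens, window=2):
        """Map each word to the set of words co-occurring within +/- window positions."""
        neighbors = defaultdict(set)
        for i, w in enumerate(tokens):
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    neighbors[w].add(tokens[j])
        return neighbors

    def consistency(word, texts, window=2):
        """Jaccard overlap of a word's neighborhoods across two tokenised texts."""
        n1, n2 = (cooccurrence_neighbors(t, window) for t in texts)
        a, b = n1.get(word, set()), n2.get(word, set())
        return len(a & b) / len(a | b) if (a | b) else 0.0

    text_a = "the black cat sat on the mat".split()
    text_b = "the black dog sat on the rug".split()
    print(consistency("sat", [text_a, text_b]))   # 0.6 for this toy example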
Abstract:
A twisted generalized Weyl algebra A of degree n depends on a base algebra R, n commuting automorphisms σ_i of R, n central elements t_i of R, and some additional scalar parameters. In a paper by Mazorchuk and Turowska, it is claimed that certain consistency conditions on the σ_i and t_i are sufficient for the algebra to be nontrivial. However, in this paper we give an example which shows that this is false. We also correct the statement by finding a new set of consistency conditions and prove that the old and new conditions together are necessary and sufficient for the base algebra R to map injectively into A. In particular, they are sufficient for the algebra A to be nontrivial. We speculate that these consistency relations may play a role in other areas of mathematics, analogous to the role played by the Yang-Baxter equation in the theory of integrable systems.
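To make the role of the σ_i and t_i concrete, the defining relations of the underlying twisted generalized Weyl construction are recalled below in the usual conventions (an assumption about the paper's notation), with the scalars μ_{ij} playing the role of the additional parameters:

    X_i r = \sigma_i(r)\, X_i, \quad Y_i r = \sigma_i^{-1}(r)\, Y_i, \quad
    Y_i X_i = t_i, \quad X_i Y_i = \sigma_i(t_i), \quad
    X_i Y_j = \mu_{ij}\, Y_j X_i \ (i \neq j),

for all r in R; the twisted generalized Weyl algebra A is then the quotient of this construction by its maximal graded ideal intersecting R trivially.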
Abstract:
The current cosmological dark sector (dark matter plus dark energy) is challenging our comprehension of the physical processes taking place in the Universe. Recently, some authors have tried to falsify the basic underlying assumptions of this dark matter-dark energy paradigm. In this Letter, we show that oversimplifications of the measurement process may produce false positives for any consistency test based on the globally homogeneous and isotropic Λ cold dark matter (ΛCDM) model and its expansion history based on distance measurements. In particular, when local inhomogeneity effects due to clumped matter or voids are taken into account, an apparent violation of the basic assumptions (the Copernican Principle) seems to be present. Conversely, the amplitude of the deviations also probes the degree of reliability of the phenomenological Dyer-Roeder procedure, by confronting its predictions with the accuracy of the weak-lensing approach. Finally, a new method is devised to reconstruct the effects of the inhomogeneities in a ΛCDM model, and some suggestions on how to distinguish clumpiness (or void) effects from different cosmologies are discussed.
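For reference, the phenomenological Dyer-Roeder (Zeldovich-Kantowski-Dyer-Roeder) procedure mentioned above is usually implemented through a second-order equation for the angular-diameter distance D_A(z) with a smoothness parameter α, where α = 1 recovers the homogeneous FLRW distance. The flat-ΛCDM form below is quoted from the standard literature as an assumption and need not match the Letter's exact parametrization:

    \frac{d^{2}D_A}{dz^{2}}
    + \left(\frac{2}{1+z} + \frac{1}{E(z)}\frac{dE}{dz}\right)\frac{dD_A}{dz}
    + \frac{3\,\alpha\,\Omega_{m0}\,(1+z)}{2\,E^{2}(z)}\, D_A = 0,
    \qquad E(z) = \sqrt{\Omega_{m0}(1+z)^{3} + 1 - \Omega_{m0}},

with initial conditions D_A(0) = 0 and dD_A/dz|_{z=0} = c/H_0.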
Abstract:
The self-consistency of a thermodynamical theory for hadronic systems based on non-extensive statistics is investigated. We show that it is possible to obtain a self-consistent theory, in accordance with the asymptotic bootstrap principle, if the mass spectrum and the energy density increase q-exponentially. A direct consequence is the existence of a limiting effective temperature for the hadronic system. We show that this result is in agreement with experiment.
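"Increase q-exponentially" refers to growth governed by the Tsallis q-exponential function, whose textbook definition is recalled below; the specific sign convention and spectrum parametrization used in the paper are not reproduced here:

    e_q(x) = \bigl[\,1 + (1-q)\,x\,\bigr]^{\frac{1}{1-q}}, \qquad \lim_{q \to 1} e_q(x) = e^{x},

so that in the limit q → 1 one recovers the ordinary exponential mass spectrum of the original Hagedorn bootstrap.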
Abstract:
This PhD thesis was entirely developed at the Telescopio Nazionale Galileo (TNG, Roque de los Muchachos, La Palma, Canary Islands) with the aim of designing, developing and implementing a new Graphical User Interface (GUI) for the Near Infrared Camera Spectrometer (NICS) installed at the Nasmyth A focus of the telescope. The idea of a new GUI for NICS arose from the wish to optimize the astronomers' work through a set of powerful tools not present in the existing GUI, such as the possibility of automatically moving an object onto the slit or of performing a preliminary image analysis and spectrum extraction. The new GUI also provides a wide and versatile image display, an automatic procedure to locate astronomical objects, and a facility for automatic image crosstalk correction. In order to test the overall functioning of the new GUI for NICS, and to provide some information on the atmospheric extinction at the TNG site, two telluric standard stars, Hip031303 and Hip031567, were observed spectroscopically during engineering time. The NICS set-up used was as follows: Large Field (0.25''/pixel) mode, 0.5'' slit, spectral dispersion through the AMICI prism (R~100), and the higher-resolution (R~1000) JH and HK grisms.
Abstract:
Single-processor microprocessors (CPUs) saw rapid performance growth and falling costs for roughly twenty years. These microprocessors brought computing power of the order of GFLOPS (Giga Floating Point Operations per Second) to desktop PCs and hundreds of GFLOPS to server clusters. This rise enabled new program functionality, better user interfaces and many other advantages. However, the growth slowed abruptly in 2003 because of ever higher power consumption and heat-dissipation problems, which prevented further increases in clock frequency: the physical limits of silicon were drawing ever closer. To work around the problem, CPU (Central Processing Unit) manufacturers began designing multicore microprocessors, a choice that had a considerable impact on the developer community, accustomed to thinking of software as a series of sequential commands. Programs that had always enjoyed performance improvements with each new CPU generation therefore stopped getting faster, because, running on a single core, they could not benefit from the full power of the CPU. To fully exploit the new CPUs, concurrent programming, previously used only on expensive systems or supercomputers, became an increasingly common practice among developers. At the same time, the videogame industry captured a substantial share of the market: in 2013 alone, nearly 100 billion dollars will be spent on gaming hardware and software. To make their titles more attractive, the software houses developing videogames rely on ever more powerful and often poorly optimized graphics engines, which makes them extremely demanding in terms of performance. For this reason GPU (Graphics Processing Unit) manufacturers, especially in the last decade, have engaged in a genuine performance race that has led to products with staggering computing capabilities. But unlike the CPUs, which at the beginning of the 2000s took the multicore road in order to keep supporting sequential programs, GPUs have become manycore, that is, equipped with hundreds and hundreds of small cores that carry out computations in parallel. Can this immense computing capacity be used in other application fields? The answer is yes, and the objective of this thesis is precisely to assess, as things currently stand, in what way and with what efficiency generic software can make use of the GPU instead of the CPU.
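To illustrate the general idea of offloading data-parallel work from generic code to the GPU, a minimal Python sketch using the CuPy library is given below; CuPy is only an illustrative choice and is not the framework examined in the thesis, which addresses GPU computing in general.

    import numpy as np
    import cupy as cp   # NumPy-like API that runs on an NVIDIA GPU

    n = 10_000_000
    a_cpu = np.random.rand(n).astype(np.float32)
    b_cpu = np.random.rand(n).astype(np.float32)

    # CPU version: element-wise multiply-add executed sequentially.
    c_cpu = a_cpu * b_cpu + 1.0

    # GPU version: copy the arrays to device memory, run the same
    # element-wise operation across many GPU cores, copy the result back.
    a_gpu = cp.asarray(a_cpu)
    b_gpu = cp.asarray(b_cpu)
    c_gpu = a_gpu * b_gpu + 1.0
    cp.cuda.Stream.null.synchronize()   # wait for the GPU work to finish

    # Check that both paths agree.
    print(np.allclose(c_cpu, cp.asnumpy(c_gpu), atol=1e-6))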