976 results for Technical Report, Computer Science
Abstract:
We extend all elementary functions from the real to the transreal domain so that they are defined on division by zero. Our method applies to a much wider class of functions, so it may be of general interest.
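As background, total division in the transreal style can be sketched as follows. This is a minimal illustration assuming the usual transreal conventions (positive infinity, negative infinity, and nullity for 0/0, modelled here with IEEE NaN); it is not code from the report itself.

```python
import math

def transreal_div(x: float, y: float) -> float:
    """Total division: defined for every pair of reals, including y = 0."""
    if y != 0.0:
        return x / y            # ordinary real division
    if x > 0.0:
        return math.inf         # x/0 = +infinity for positive x
    if x < 0.0:
        return -math.inf        # x/0 = -infinity for negative x
    return math.nan             # 0/0 = nullity (modelled as NaN)
```

With this convention every call returns a value, so downstream code never raises a division-by-zero error.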
Abstract:
The notion that learning can be enhanced when a teaching approach matches a learner's learning style has been widely accepted in classroom settings, since learning style is a predictor of a student's attitudes and preferences. As such, the traditional 'one-size-fits-all' approach to teaching delivery in Educational Hypermedia Systems (EHSs) has to be replaced by an approach that responds to users' needs by exploiting their individual differences. However, establishing and implementing reliable approaches for matching teaching delivery and modalities to learning styles remains an innovation challenge to be tackled. In this paper, seventy-six studies are objectively analysed with several goals. In order to reveal the value of integrating learning styles in EHSs, different perspectives in this context are discussed, and the most effective learning style models incorporated within adaptive EHSs (AEHSs) are identified. Investigating the effectiveness of different approaches for modelling students' individual learning traits is another goal of this study. The paper thus highlights a number of theoretical and technical issues of learning-style-based adaptive EHSs (LS-BAEHSs) to serve as comprehensive guidance for researchers interested in this area.
Abstract:
Transreal numbers provide a total semantics containing classical truth values, as well as dialetheic, fuzzy and gap values. A paraconsistent Sheffer Stroke generalises all classical logics to a paraconsistent form. We introduce logical spaces of all possible worlds and all propositions. We operate on a proposition, in all possible worlds, at the same time. We define logical transformations, possibility and necessity relations, in proposition space, and give a criterion to determine whether a proposition is classical. We show that proofs, based on the conditional, infer gaps only from gaps and that negative and positive infinity operate as bottom and top values.
Abstract:
Nonlinear data assimilation is high on the agenda in all fields of the geosciences: with ever-increasing model resolution, the inclusion of more physical (biological, etc.) processes, and more complex observation operators, the data-assimilation problem becomes more and more nonlinear. The suitability of particle filters for solving the nonlinear data-assimilation problem in high-dimensional geophysical problems is discussed. Several existing and new schemes are presented, and it is shown that at least one of them, the Equivalent-Weights Particle Filter, does indeed beat the curse of dimensionality and provides a way forward for solving the problem of nonlinear data assimilation in high-dimensional systems.
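For readers new to the topic, the baseline scheme the abstract builds on can be sketched as a standard bootstrap particle filter on a toy one-dimensional nonlinear model. This is illustrative background only, not the Equivalent-Weights Particle Filter itself; the model, noise levels, and function names are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(obs, n_particles=500, obs_std=1.0, proc_std=0.5):
    """Bootstrap particle filter for x_{t+1} = sin(x_t) + noise, y_t = x_t + noise."""
    x = rng.normal(0.0, 1.0, n_particles)        # initial ensemble
    estimates = []
    for y in obs:
        # Propagate each particle through the nonlinear model.
        x = np.sin(x) + rng.normal(0.0, proc_std, n_particles)
        # Weight by the Gaussian observation likelihood (log scale for stability).
        logw = -0.5 * ((y - x) / obs_std) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        estimates.append(np.sum(w * x))          # weighted posterior-mean estimate
        # Resample to fight weight degeneracy (the step that fails in high dimension).
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return np.array(estimates)
```

In high-dimensional systems the weights of this basic scheme collapse onto a single particle, which is precisely the curse of dimensionality that equivalent-weights-style schemes are designed to avoid.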
Abstract:
This paper presents an approach for assisting low-literacy readers in accessing online Web information. The Educational FACILITA tool is a Web content adaptation tool that provides innovative features and follows more intuitive interaction models with regard to accessibility concerns. Specifically, we propose an interaction model and a Web application that explore the natural language processing tasks of lexical elaboration and named entity labeling to improve Web accessibility. We report the results of a pilot usability study carried out with low-literacy users. The preliminary results show that Educational FACILITA improves the comprehension of text elements, although the assistance mechanisms may also confuse users when word sense ambiguity is introduced, by gathering, for a complex word, a list of synonyms with multiple meanings. This fact points to a future solution in which the correct sense of a complex word in a sentence is identified, addressing this pervasive characteristic of natural languages. The pilot study also showed that experienced computer users find the tool more useful than novice computer users do.
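The lexical-elaboration step, and the ambiguity problem it raises, can be illustrated with a toy sketch. The tiny synonym dictionary below is invented for illustration; the real tool draws on NLP resources for Portuguese.

```python
# Hand-made toy dictionary: maps a "complex" word to candidate simpler synonyms.
# Note the ambiguous entry: listing all senses is what can confuse readers.
SYNONYMS = {
    "utilize": ["use", "employ"],
    "bank": ["riverbank", "financial institution"],  # multiple senses
}

def elaborate(sentence: str) -> str:
    """Annotate complex words in-place with their candidate synonyms."""
    out = []
    for word in sentence.split():
        key = word.lower().strip(".,")
        if key in SYNONYMS:
            out.append(f"{word} ({' / '.join(SYNONYMS[key])})")
        else:
            out.append(word)
    return " ".join(out)
```

Disambiguating "bank" before annotation, rather than listing both senses, is the future solution the abstract points to.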
Abstract:
In this paper, we propose a new two-parameter lifetime distribution with increasing failure rate, the complementary exponential geometric distribution, which is complementary to the exponential geometric model proposed by Adamidis and Loukas (1998). The new distribution arises in a latent complementary risks scenario, in which the lifetime associated with a particular risk is not observable; rather, we observe only the maximum lifetime value among all risks. The properties of the proposed distribution are discussed, including a formal proof of its probability density function and explicit algebraic formulas for its reliability and failure rate functions, moments, including the mean and variance, coefficient of variation, and modal value. The parameter estimation is based on the usual maximum likelihood approach. We report the results of a misspecification simulation study performed in order to assess the extent of misspecification errors when testing the exponential geometric distribution against our complementary one in the presence of different sample sizes and censoring percentages. The methodology is illustrated on four real datasets; we also make a comparison between both modeling approaches. (C) 2011 Elsevier B.V. All rights reserved.
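The latent complementary-risks construction can be checked by simulation: the observed lifetime is the maximum of a geometric number Z of i.i.d. exponential lifetimes. The parameterisation below (theta for the geometric on {1, 2, ...}, lam for the exponential rate) is assumed for illustration and may differ from the paper's; the closed-form CDF follows by summing the geometric series.

```python
import math
import random

random.seed(1)

def ceg_sample(lam: float, theta: float) -> float:
    """Draw one lifetime: max of Z ~ Geometric(theta) exponential risks."""
    z = 1
    while random.random() > theta:      # P(Z = z) = theta * (1 - theta)**(z - 1)
        z += 1
    return max(random.expovariate(lam) for _ in range(z))

def ceg_cdf(x: float, lam: float, theta: float) -> float:
    """Closed-form CDF under this parameterisation: theta*p / (1 - (1-theta)*p)."""
    p = 1.0 - math.exp(-lam * x)        # exponential CDF at x
    return theta * p / (1.0 - (1.0 - theta) * p)

# Empirical CDF at one point should approximate the closed form.
lam, theta, x0 = 1.0, 0.3, 1.5
n = 20000
emp = sum(ceg_sample(lam, theta) <= x0 for _ in range(n)) / n
```

Agreement between the Monte Carlo estimate and the algebraic CDF is a quick sanity check on the maximum-of-latent-risks reading of the model.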
Abstract:
Mixed linear models are commonly used in repeated measures studies. They account for the dependence amongst observations obtained from the same experimental unit. Often, the number of observations is small, and it is thus important to use inference strategies that incorporate small sample corrections. In this paper, we develop modified versions of the likelihood ratio test for fixed effects inference in mixed linear models. In particular, we derive a Bartlett correction to such a test, and also to a test obtained from a modified profile likelihood function. Our results generalize those in [Zucker, D.M., Lieberman, O., Manor, O., 2000. Improved small sample inference in the mixed linear model: Bartlett correction and adjusted likelihood. Journal of the Royal Statistical Society B, 62, 827-838] by allowing the parameter of interest to be vector-valued. Additionally, our Bartlett corrections allow for random effects nonlinear covariance matrix structure. We report simulation results which show that the proposed tests display superior finite sample behavior relative to the standard likelihood ratio test. An application is also presented and discussed. (C) 2008 Elsevier B.V. All rights reserved.
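The idea behind a Bartlett correction can be illustrated generically: rescale the likelihood-ratio statistic so that its null mean matches the chi-squared degrees of freedom. The exponential-mean test below is purely illustrative (the paper derives analytic corrections for mixed linear models, not a Monte Carlo one).

```python
import math
import random

random.seed(2)

def lr_stat_exp(sample, lam0=1.0):
    """LR statistic for H0: rate = lam0 in an i.i.d. exponential sample."""
    n = len(sample)
    s = sum(sample)
    lam_hat = n / s                      # maximum likelihood estimate of the rate
    return 2.0 * (n * math.log(lam_hat / lam0) - (lam_hat - lam0) * s)

# Simulate the null distribution with a small sample size (df = 1 here).
n, reps = 10, 5000
lrs = [lr_stat_exp([random.expovariate(1.0) for _ in range(n)]) for _ in range(reps)]
bartlett = sum(lrs) / reps               # Monte Carlo estimate of E[LR] under H0
corrected = [lr / bartlett for lr in lrs]  # corrected statistic: null mean = df = 1
```

In the paper the factor playing the role of `bartlett` is derived analytically rather than simulated, which is what makes the correction usable in practice.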
Abstract:
The main objective of this paper is to study a logarithmic extension of the bimodal skew normal model introduced by Elal-Olivero et al. [1]. The model can then be seen as an alternative to the log-normal model typically used for fitting positive data. We study some basic properties, such as the distribution function and moments, and discuss maximum likelihood estimation of the parameters. We report results of an application to a real data set on nickel concentration in soil samples. Model fitting comparisons with several alternative models indicate that the proposed model presents the best fit, and so it can be quite useful in real applications for chemical data on substance concentration. Copyright (C) 2011 John Wiley & Sons, Ltd.
Abstract:
We present a Bayesian approach for modeling heterogeneous data and estimating multimodal densities using mixtures of Skew Student-t-Normal distributions [Gomez, H.W., Venegas, O., Bolfarine, H., 2007. Skew-symmetric distributions generated by the distribution function of the normal distribution. Environmetrics 18, 395-407]. A stochastic representation that is useful for implementing an MCMC-type algorithm, and results on the existence of posterior moments, are obtained. Marginal likelihood approximations are obtained in order to compare mixture models with different numbers of component densities. Data sets concerning the Gross Domestic Product per capita (Human Development Report) and body mass index (National Health and Nutrition Examination Survey), previously studied in the related literature, are analyzed. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
In this paper, we present a Bayesian approach for estimation in the skew-normal calibration model, as well as the conditional posterior distributions that are useful for implementing the Gibbs sampler. Data transformation is thus avoided by using the proposed methodology. Model fitting is assessed by proposing the asymmetric deviance information criterion, ADIC, a modification of the ordinary DIC. We also report an application of the model to a real data set on the relationship between the resistance and the elasticity of a sample of concrete beams. Copyright (C) 2008 John Wiley & Sons, Ltd.
Abstract:
Chagas disease, caused by the protozoan Trypanosoma cruzi, is one of the most serious of the so-called neglected diseases in Latin America, especially in Brazil. So far there has been no effective treatment for the chronic phase of this disease. Cruzain is a major cysteine protease of T. cruzi and is recognized as a valid target for Chagas disease chemotherapy. The mechanism of cruzain action is associated with the nucleophilic attack of an activated sulfur atom on electrophilic groups. In this report, features of a putative pharmacophore model of the enzyme, developed as a virtual screening tool for the selection of potential cruzain inhibitors, are described. The final proposed model was applied to the ZINC v.7 database and afterwards experimentally validated by an enzymatic inhibition assay. One of the compounds selected by the model showed cruzain inhibition in the low micromolar range.
Abstract:
This study aimed to determine the extent to which the decentralization process adopted by the Technical Police Department of Bahia (DPT-BA) was efficient in meeting the demand for Computer Forensics examinations generated by the Regional Technical Police Coordination units (CRPTs) in the interior of the state. The DPT-BA was restructured according to the principles of administrative decentralization, following the progressive current. With decentralization, it took on the commitment to coordinate actions to give autonomy to the units in the interior of the state, creating minimal structures in all the spheres involved, with broad capacity for articulation among them and with services oriented towards a high-performance model of public organization. In addressing the relationship between decentralization and efficiency in meeting the demand for forensic examinations from the interior of the state of Bahia, the study, owing to instrumental limitations, was restricted to the field of Computer Forensics examinations, which expressively reflects and illustrates the scenario found in the other forensic areas. The theoretical approaches to decentralization were first identified, highlighting the distinct dimensions of the concept, followed by those concerning Computer Forensics. Documentary research was carried out at the Afrânio Peixoto Institute of Criminalistics (Icap), together with field research based on semi-structured interviews with judges assigned to the criminal courts of districts related to the research setting and with forensic experts from the Regional Coordination units, from the CRPTs, and from Icap's Computer Forensics Coordination. By correlating the turnaround times that satisfy the concept of efficiency defined by the interviewed judges, the final clients of the forensic work, with the actual times obtained through the documentary research, the data revealed a high degree of inefficiency, slowness and backlog, as well as discrepant realities between the capital and the interior.
The analysis of the interviews with the forensic experts revealed a scenario of widespread dissatisfaction and demotivation, with an almost absolute centralization of decision-making power, showing that the decentralization process as practised served, paradoxically, as a tool for enabling and camouflaging centralization.
Abstract:
Brazilian Portuguese needs a Wordnet that is open access, downloadable and changeable, so that it can be improved by the community interested in using it for knowledge representation and automated deduction. This kind of resource is also very valuable to linguists and computer scientists interested in extracting and representing knowledge obtained from texts. We briefly discuss the reasons for a Brazilian Portuguese Wordnet and the process we used to obtain a preliminary version of such a resource. We then discuss possible steps to improve our preliminary version.
Abstract:
Faced with a new profile of employment and of the labor market, which is changing markedly under the influence of information and communication technologies (ICT), there is a demand for professionals equipped with new skills and competences. The general objective of this study is thus to analyze the new skills demanded of the information professional by the current labor market. Specifically, it seeks to characterize the determining factors of the current context; to verify the impact of ICT on the information professional's labor market; and to understand the transformations in this professional's profile in the face of these changes. To meet the proposed objectives, bibliographical research was carried out, together with an analysis of job advertisements published on the Catho On-line website, comparing the years 2003 and 2005. The analysis of the data showed that the skills currently required of the information professional concern, beyond technical knowledge, fluency in a foreign language, computer literacy, managerial knowledge and, above all, interpersonal skills.