948 results for generic finiteness
Abstract:
We calculate the equilibrium thermodynamic properties, percolation threshold, and cluster distribution functions for a model of associating colloids, which consists of hard spherical particles having on their surfaces three short-ranged attractive sites (sticky spots) of two different types, A and B. The thermodynamic properties are calculated using Wertheim's perturbation theory of associating fluids. This also allows us to find the onset of self-assembly, which can be quantified by the maxima of the specific heat at constant volume. The percolation threshold is derived, under the no-loop assumption, for the correlated bond model: in all cases it is two percolated phases that become identical at a critical point, when one exists. Finally, the cluster size distributions are calculated by mapping the model onto an effective model, characterized by a state-dependent functionality f̄ and a unique bonding probability p̄. The mapping is based on the asymptotic limit of the cluster distribution functions of the generic model, and the effective parameters are defined through the requirement that the equilibrium cluster distributions of the true and effective models have the same number-averaged and weight-averaged sizes at all densities and temperatures. We also study the model numerically in the case where BB interactions are missing. In this limit, AB bonds either provide branching between A-chains (Y-junctions) if ε_AB/ε_AA is small, or drive the formation of a hyperbranched polymer if ε_AB/ε_AA is large. We find that the theoretical predictions describe the numerical data quite accurately, especially in the region where Y-junctions are present. There is fairly good agreement between theoretical and numerical results both for the thermodynamic properties (number of bonds and phase coexistence) and for the connectivity properties of the model (cluster size distributions and percolation locus).
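Under the no-loop assumption, an effective model with functionality f̄ and bonding probability p̄ follows classical Flory-Stockmayer statistics, so its averaged cluster sizes and gel point have closed forms. The sketch below is an illustration of those standard formulas, not code from the paper; the function names are ours:

```python
# Flory-Stockmayer statistics for an f-functional model with bonding
# probability p (no-loop assumption), valid below the gel point.
def number_avg_size(f, p):
    # number-averaged cluster size: X_n = 1 / (1 - f*p/2)
    return 1.0 / (1.0 - f * p / 2.0)

def weight_avg_size(f, p):
    # weight-averaged cluster size: X_w = (1 + p) / (1 - (f - 1)*p)
    return (1.0 + p) / (1.0 - (f - 1) * p)

def percolation_threshold(f):
    # loopless (Flory) gel point: p_c = 1 / (f - 1)
    return 1.0 / (f - 1)
```

Matching both averages of the true model at every density and temperature, as the abstract describes, fixes the two effective parameters f̄ and p̄.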
Abstract:
We provide an agent with the capability to infer the relations (assertions) entailed by the rules that describe the formal semantics of an RDFS knowledge-base. The proposed inferencing process formulates each semantic restriction as a rule implemented within a SPARQL query statement. The process expands the original RDF graph into a fuller graph that explicitly captures the rules' described semantics. The approach is currently being explored in order to support descriptions that follow the generic Semantic Web Rule Language. An experiment using the Fire-Brigade domain, a small-scale knowledge-base, is adopted to illustrate the agent modeling method and the inferencing process.
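As a hedged illustration of this kind of rule-based graph expansion (not the paper's actual implementation), two standard RDFS entailment rules, rdfs11 (subClassOf transitivity) and rdfs9 (type propagation), can be forward-chained over a triple set until a fixed point; the SPARQL formulation of one rule is shown in the comments:

```python
# Forward-chaining of two RDFS entailment rules over a set of triples.
# Each rule corresponds to a SPARQL update, e.g. rdfs11:
#   INSERT { ?a rdfs:subClassOf ?c }
#   WHERE  { ?a rdfs:subClassOf ?b . ?b rdfs:subClassOf ?c }
SUBCLASS, TYPE = "rdfs:subClassOf", "rdf:type"

def expand(triples):
    g = set(triples)
    while True:
        new = set()
        for (a, p1, b) in g:
            for (c, p2, d) in g:
                if b != c:
                    continue
                if p1 == SUBCLASS and p2 == SUBCLASS:
                    new.add((a, SUBCLASS, d))  # rdfs11: transitivity
                if p1 == TYPE and p2 == SUBCLASS:
                    new.add((a, TYPE, d))      # rdfs9: type propagation
        if new <= g:
            return g                           # fixed point reached
        g |= new
```

The expanded set is the "fuller graph" the abstract refers to: it makes the entailed assertions explicit alongside the original ones.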
Abstract:
Multiple sclerosis (MS) is the chronic neurological disease that most affects young adults; 80% of patients experience a transition towards persistent disability, hence the need to assess their quality of life (QoL). The aim of this study was to review studies that assess QoL in patients with multiple sclerosis, inquiring into the instruments used and their psychometric features. Articles published from 1997 through 2007 were searched for by means of the keywords 'multiple sclerosis' and 'quality of life' in the databases Psycinfo, Psycarticles, Psycbooks, Psychology & Behavioral Science Collection, EJS E-Journal, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effects, Medline, and Academic Search Complete. From the 1,376 studies found, after abstract reading, those that reported on instruments with poor psychometric properties and/or were little referenced were excluded. A total of 461 articles were selected, of which 267 reported using generic instruments and 194, MS-specific ones. Among the 7 instruments reported by the studies as having good psychometric characteristics (2 generic, 5 MS-specific), the most used is the SF-36 (by 237 studies). All instruments have shown adequate psychometric properties and a high degree of reliability, and hence may be used to assess QoL in subjects with multiple sclerosis both in the clinic and in research.
Abstract:
HENRE II (Higher Education Network for Radiography in Europe)
Abstract:
The work presented in this document addresses the problems that arise from the need to integrate applications developed at different points in time, by different teams, which must communicate with one another in order to enrich business processes. The integration has to be transparent to the applications, provided by a generic, robust piece of software at no cost to the development teams at integration time, and it must allow the applications to communicate using whatever protocols they choose. This work proposes a message-oriented middleware as the solution to the identified problem. The proposed solution provides communication between applications that use different protocols, and also decouples the communicating applications in time, space, and synchronization. The implementation is based on a content-based publish/subscribe system and has to cope with the greater computational demands this type of system entails, a cost justified by the richer event-subscription semantics it offers. The implementation uses a semi-distributed architecture with the goal of increasing the scalability of the system, which implies that it must handle event routing and the dissemination of subscriptions across the various event servers. The implementation provides persistence guarantees, transactional processing, and fault tolerance, as well as event transformation between the supported protocols. Extensibility is achieved through a plugin system that allows support for new communication protocols to be added. The protocols supported by the final implementation are RestMS and TCP.
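The core of a content-based publish/subscribe system can be sketched minimally in Python. This is an illustration of the general technique, with a hypothetical `Broker` API, not the dissertation's implementation: subscribers register a predicate over event content, and publishers never address subscribers directly, which is the space decoupling the work describes.

```python
# Minimal content-based publish/subscribe broker: subscriptions are
# predicates over event content, so matching is driven by the event's
# data rather than by topic names or subscriber addresses.
class Broker:
    def __init__(self):
        self.subscriptions = []  # list of (predicate, callback) pairs

    def subscribe(self, predicate, callback):
        # register interest in all events for which predicate(event) is true
        self.subscriptions.append((predicate, callback))

    def publish(self, event):
        # deliver the event to every subscriber whose predicate matches
        for predicate, callback in self.subscriptions:
            if predicate(event):
                callback(event)
```

A real broker of the kind described would add queuing for time decoupling, routing between event servers, and persistence; this sketch only shows the content-based matching step.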
Abstract:
Tribimaximal leptonic mixing is a mass-independent mixing scheme consistent with the present solar and atmospheric neutrino data. By conveniently decomposing the effective neutrino mass matrix associated with it, we derive generic predictions in terms of the parameters governing the neutrino masses. We extend this phenomenological analysis to other mass-independent mixing schemes which are related to the tribimaximal form by a unitary transformation. We classify models that produce tribimaximal leptonic mixing through the group structure of their family symmetries in order to point out that there is often a direct connection between the group structure and the phenomenological analysis. The type of seesaw mechanism responsible for neutrino masses plays a role here, as it restricts the choices of family representations and affects the viability of leptogenesis. We also present a recipe to generalize a given tribimaximal model to an associated model with a different mass-independent mixing scheme, which preserves the connection between the group structure and phenomenology as in the original model. This procedure is explicitly illustrated by constructing toy models with the transpose tribimaximal, bimaximal, golden ratio, and hexagonal leptonic mixing patterns.
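The tribimaximal (TBM) pattern referred to here is a fixed, mass-independent mixing matrix; its entries determine the mixing angles sin²θ12 = 1/3, sin²θ23 = 1/2, θ13 = 0 regardless of the neutrino mass values. The following snippet writes it out and checks orthogonality (the helper function is ours, for illustration):

```python
import math

# The tribimaximal mixing matrix in the standard sign convention.
# Its columns give sin^2(theta12) = |U_e2|^2 = 1/3,
# sin^2(theta23) = |U_mu3|^2 = 1/2, and theta13 = 0 (U_e3 = 0).
U_TBM = [
    [ math.sqrt(2.0 / 3.0), 1 / math.sqrt(3),  0.0             ],
    [-1 / math.sqrt(6),     1 / math.sqrt(3), -1 / math.sqrt(2)],
    [-1 / math.sqrt(6),     1 / math.sqrt(3),  1 / math.sqrt(2)],
]

def is_unitary(U, tol=1e-12):
    # check U^T U = identity (U is real, so unitary means orthogonal)
    n = len(U)
    for i in range(n):
        for j in range(n):
            dot = sum(U[k][i] * U[k][j] for k in range(n))
            if abs(dot - (1.0 if i == j else 0.0)) > tol:
                return False
    return True
```

The "mass-independent" property means any unitary rotation in flavour space that leaves this matrix invariant constrains the model's family symmetry, which is the connection the abstract exploits.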
Abstract:
The main aims of this work are the development and validation of a generic algorithm for the optimal control of small wind generators, that is, up to 40 kW with fixed-pitch blades. The algorithm allows the development of controllers that drive the wind generators to the desired operating point under variable operating conditions. The problems posed by variable wind intensity are solved by the proposed algorithm without any explicit measurement of the wind velocity, so no special equipment or anemometer is required.
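One standard way to track the optimal operating point without measuring wind speed is perturb-and-observe hill climbing on the measured electrical power. The sketch below illustrates that generic technique under our own assumptions (the paper's actual algorithm is not specified in the abstract); `hill_climb_step` and its arguments are hypothetical names:

```python
# One perturb-and-observe step: nudge the speed reference and keep the
# perturbation direction while measured power rises, reversing it when
# power falls. Only electrical power is needed, no anemometer.
def hill_climb_step(omega_ref, step, p_now, p_prev, direction):
    if p_now < p_prev:
        direction = -direction      # power dropped: reverse the search
    return omega_ref + direction * step, direction
```

Repeated at the controller's sampling rate, the reference settles into a small oscillation around the maximum-power speed; the step size trades tracking speed against that residual oscillation.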
Abstract:
This paper describes an architecture conceived to integrate Power Systems tools in a Power System Control Centre, based on an Ambient Intelligence (AmI) paradigm. This architecture is an instantiation of the generic architecture proposed in [1] for developing systems that interact with AmI environments, and was proposed as a consequence of a methodology for the inclusion of Artificial Intelligence in AmI environments (ISyRAmI - Intelligent Systems Research for Ambient Intelligence). The architecture presented in the paper is able to integrate two applications in the control room of a power system transmission network. The first is the SPARSE expert system, used to diagnose incidents and to support power restoration. The second is an Intelligent Tutoring System (ITS) incorporating two training tools: one used to train operators to diagnose incidents, and another used to train operators to perform restoration procedures.
Abstract:
OBJECTIVE: Pharmaceutical assistance is essential in health care and a right of citizens according to Brazilian law and drug policies. The study purpose was to evaluate aspects of pharmaceutical assistance in public primary health care. METHODS: A cross-sectional study using WHO drug indicators was carried out in Brasília in 2001. From a random sample of 15 out of 62 centers, thirty exiting patients per center were interviewed. RESULTS: Only 18.7% of the patients fully understood the prescription, 56.3% could read it, 61.2% of the prescribed drugs were actually dispensed, and the mean duration of pharmaceutical dispensing was 53.2 seconds. Each visit lasted on average 9.4 minutes. Of prescribed and non-dispensed drugs, 85.3% and 60.6%, respectively, were on the local essential drug list (EDL). On average 83.2% of 40 essential drugs were in stock, and only two centers had a pharmacist in charge of the pharmacy. The mean number of drugs per prescription was 2.3, 85.3% of prescribed drugs were on the EDL, 73.2% were prescribed using the generic denomination, 26.4% of prescriptions included antibiotics and 7.5% included injectables. The most prescribed groups were: cardiovascular drugs (26.8%), anti-infective drugs (13.1%), analgesics (8.9%), anti-asthmatic drugs (5.8%), anti-diabetic drugs (5.3%), psychoactive drugs (3.7%), and combination drugs (2.7%). CONCLUSIONS: Essential drugs were only moderately available almost 30 years after the first Brazilian EDL was formulated. While physician use of essential drugs and generic names was fairly high, efficiency was impaired by the poor quality of pharmaceutical care, resulting in very low patient understanding and insufficient guarantee of supply, particularly for chronic diseases.
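The prescribing indicators reported above (drugs per prescription, percentage generic, percentage on the EDL, percentage of encounters with an antibiotic) are simple ratios over the sampled prescriptions. A sketch of their computation, with hypothetical record field names of our own choosing, might look like:

```python
# WHO-style prescribing indicators from a list of prescriptions, where
# each prescription is a list of drug records with boolean fields
# "generic", "on_edl" and "antibiotic" (field names are illustrative).
def prescribing_indicators(prescriptions):
    drugs = [d for rx in prescriptions for d in rx]
    n_rx, n_drugs = len(prescriptions), len(drugs)
    return {
        "drugs_per_prescription": n_drugs / n_rx,
        "pct_generic": 100.0 * sum(d["generic"] for d in drugs) / n_drugs,
        "pct_on_edl": 100.0 * sum(d["on_edl"] for d in drugs) / n_drugs,
        # fraction of encounters in which at least one antibiotic appears
        "pct_rx_with_antibiotic": 100.0 * sum(
            any(d["antibiotic"] for d in rx) for rx in prescriptions) / n_rx,
    }
```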
Abstract:
The remote management system is a tool that allows real-time management of the entire water supply system of Empresa Portuguesa das Águas Livres, S.A. (EPAL). This management spans from water abstraction to delivery to the final customer, through the monitoring means required for the command operations that allow the system's assets (pumping stations, reservoirs, valves, ...) to be controlled and operated remotely. This dissertation aims to disseminate and compile the fundamental elements for optimising the potential that remote management offers, thereby addressing, given its specificity, a topic that is rarely covered but of extreme importance to anyone working, or intending to work, in a similar utility. The dissertation comprises six chapters covering the characterisation of EPAL's abstraction, transport and distribution system; a general overview of the tools supporting system operation; a historical review of remote management at EPAL; and information on the current remote management system, namely its architecture and main functionalities, such as remote control of operating devices and real-time analysis of water quality parameters. Finally, some conclusions and recommendations for future work are presented. This document is thus intended to bring together information on remote management systems for water supply, their advantages and associated functionalities, and to identify weaknesses of the system that can be improved or even eliminated.
Abstract:
In the present study we focus on the interaction between the acquisition of new words and text organisation. In the acquisition of new words we emphasise paradigmatic relations such as hyponymy, meronymy and semantic sets. We worked with a group of girls attending a private school for adolescents in serious difficulties. The subjects come from disadvantaged families and their writing skills were very poor: when asked to describe a garden, they wrote a short text of a single paragraph in which the lexical items were generic, there were no adjectives, and mainly existential verbs were used. The intervention plan assumed that subjects must be exposed to new words while working out their meaning. In the presence of referents, subjects were taught new words, making explicit the intended relation of the new term to a term already known. In the classroom, subjects were asked to write all the words they knew, drawing the relationships among them, and to talk about the words, specifying each relation with explicit pragmatic directions such as 'is a kind of', 'is a part of' or 'are all x'. After that, subjects were given the task of choosing a perspective. The work presented in this paper accounts for significant differences in the subjects' texts before and after the intervention. While working on new words, subjects were organising their lexicon and learning to present a whole entity in perspective.
Abstract:
Because the information processed on a corporate computer network is increasingly confidential, it must be protected as much as possible. At the same time, in an ever more globalised world, this information must be available quickly to the right partners. This work aims to study and implement security on a small, generic test network that can easily be extrapolated to a network the size of a large company's, with potential branches in several locations. The goal is to implement and monitor security both externally (at the Internet service provider, ISP) and internally (network assets, workstations/users). The analysis is based on location (local, wireless, or remote) and, whenever an anomaly is detected, its location is identified and protective actions are taken automatically. These anomalies can be managed using open source or commercial tools that collect all the necessary information and take corrective or alerting actions according to the type of anomaly.
Abstract:
We investigate, via numerical simulations, mean field, and density functional theories, the magnetic response of a dipolar hard sphere fluid at low temperatures and densities, in the region of strong association. The proposed parameter-free theory is able to capture both the density and temperature dependence of the ring-chain equilibrium and the contribution to the susceptibility of a chain of generic length. The theory predicts a nonmonotonic temperature dependence of the initial (zero field) magnetic susceptibility, arising from the competition between magnetically inert particle rings and magnetically active chains. Monte Carlo simulation results closely agree with the theoretical findings. DOI: 10.1103/PhysRevLett.110.148306
Abstract:
In the two Higgs doublet model there is the possibility that the vacuum in which the universe resides is metastable. We present the tree-level bounds on the scalar potential parameters which have to be obeyed to prevent that situation. Analytical expressions for those bounds are given for the most used potential, that with a softly broken Z(2) symmetry. The impact of those bounds on the model's phenomenology is discussed in detail, as well as the importance of the current LHC results in determining whether the vacuum we live in is or is not stable. We demonstrate how the vacuum stability bounds can be obtained for the most generic CP-conserving potential, and provide a simple method to implement them.
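A prerequisite for any such vacuum analysis is that the potential be bounded from below. For the softly broken Z(2) potential with real λ5, the standard tree-level boundedness conditions are well known and easy to check numerically; the sketch below implements only those necessary conditions, not the paper's full metastability discriminant:

```python
import math

# Tree-level boundedness-from-below conditions for the softly broken
# Z2-symmetric two Higgs doublet potential with real lambda_5:
#   l1 > 0,  l2 > 0,
#   l3 > -sqrt(l1*l2),
#   l3 + l4 - |l5| > -sqrt(l1*l2)
def bounded_from_below(l1, l2, l3, l4, l5):
    if l1 <= 0 or l2 <= 0:
        return False
    root = math.sqrt(l1 * l2)
    return l3 > -root and l3 + l4 - abs(l5) > -root
```

Parameter points failing this check have a potential unbounded from below, so the question of metastability versus stability of the electroweak vacuum only arises for points that pass it.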
Abstract:
We present the first version of a new tool to scan the parameter space of generic scalar potentials, SCANNERS (Coimbra et al., SCANNERS project, 2013). The main goal of SCANNERS is to help distinguish between different patterns of symmetry breaking for each scalar potential. In this work we use it to investigate the possibility of excluding, with future LHC results, regions of the phase diagram of several versions of a complex singlet extension of the Standard Model. We find that if another scalar is found, one can exclude a phase with a dark matter candidate in definite regions of the parameter space, while predicting whether a third scalar, if found, must be lighter or heavier. The first version of the code is publicly available and contains various generic core routines for tree-level vacuum stability analysis, as well as implementations of collider bounds, dark matter constraints, electroweak precision constraints and tree-level unitarity.