920 results for distributional equivalence
Abstract:
Indicators of gender inequality, poverty and human development in Kenya are examined. There is a significant and rising incidence of absolute poverty in Kenya, and women are more likely to be in poverty than men. Female/male ratios in Kenyan decision-making institutions are highly skewed against women, and women experience unfavourable enrolment ratios in primary, secondary and tertiary institutions. The share of income earned by women is much lower than men's share. General Kenyan indicators highlight declining GDP per capita, increased poverty rates (especially for women), reduced life expectancy, a narrowing of the difference in female/male life expectancy rates, increased child mortality rates and an increase in female child mortality rates. This deterioration results in an increased socio-economic burden on women that is not adequately captured in the HPI, HDI, GDI and GEM. This paper advocates the use of household-level gender-disaggregated data, because much gender inequality occurs in and emanates from the household, where culture plays a very important role in the allocation of resources and decision-making. Because most human development indicators are aggregates or averages, they can be misleading; they need to be supplemented by distributional and disaggregated data, as demonstrated in the Kenyan case. The importance of studying the coping mechanisms of households and families for dealing with economic hardship and other misfortunes, such as AIDS, is emphasised.
Abstract:
This four-experiment series sought to evaluate the potential of children with neurosensory deafness and cochlear implants to exhibit auditory-visual and visual-visual stimulus equivalence relations within a matching-to-sample format. Twelve children who became deaf prior to acquiring language (prelingual) and four who became deaf afterwards (postlingual) were studied. All children learned auditory-visual conditional discriminations and nearly all showed emergent equivalence relations. Naming tests, conducted with a subset of the children, showed no consistent relationship to the equivalence-test outcomes. This study makes several contributions to the literature on stimulus equivalence. First, it demonstrates that both pre- and postlingually deaf children can acquire auditory-visual equivalence relations after cochlear implantation, thus demonstrating symbolic functioning. Second, it directs attention to a population that may be especially interesting for researchers seeking to analyze the relationship between speaker and listener repertoires. Third, it demonstrates the feasibility of conducting experimental studies of stimulus control processes within the limitations of a hospital, which these children must visit routinely for the maintenance of their cochlear implants.
Abstract:
In this work, we present a systematic approach to the representation of modelling assumptions. Modelling assumptions form the fundamental basis for the mathematical description of a process system. These assumptions can be translated into either additional mathematical relationships or constraints between model variables, equations, balance volumes or parameters. In order to analyse the effect of modelling assumptions in a formal, rigorous way, a syntax of modelling assumptions has been defined. The smallest indivisible syntactical element, the so-called assumption atom, has been identified as a triplet. With this syntax a modelling assumption can be described either as an elementary assumption, i.e. an assumption consisting of only an assumption atom, or as a composite assumption consisting of a conjunction of elementary assumptions. The above syntax of modelling assumptions enables us to represent modelling assumptions as transformations acting on the set of model equations. The notion of syntactical correctness and semantical consistency of sets of modelling assumptions is defined, and necessary conditions for checking them are given. These transformations can be used in several ways and their implications can be analysed by formal methods. The modelling assumptions define model hierarchies, that is, a series of model families, each belonging to a particular equivalence class. These model equivalence classes can be related to primal assumptions regarding the definition of mass, energy and momentum balance volumes, and to secondary and tertiary assumptions regarding the presence or absence and the form of mechanisms within the system. Within equivalence classes there are many model members, these being related to algebraic model transformations for the particular model. We show how these model hierarchies are driven by the underlying assumption structure and indicate some implications for system dynamics and complexity issues. (C) 2001 Elsevier Science Ltd. All rights reserved.
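A minimal sketch in Python of the triplet idea described in this abstract; the names (AssumptionAtom, apply_assumptions) and the annotation behaviour are illustrative assumptions, not the paper's formalism:

from dataclasses import dataclass

@dataclass(frozen=True)
class AssumptionAtom:
    subject: str   # model element the assumption refers to, e.g. "reactor.energy_balance" (hypothetical name)
    relation: str  # keyword naming the kind of assumption, e.g. "is"
    value: str     # the asserted property, e.g. "steady_state"

# A composite assumption is a conjunction of elementary assumptions (assumption atoms).
composite = [
    AssumptionAtom("reactor.energy_balance", "is", "steady_state"),
    AssumptionAtom("reactor.heat_loss", "is", "negligible"),
]

def apply_assumptions(equations, atoms):
    """View each assumption as a transformation acting on the set of model equations.
    Here the transformation merely annotates the equations it would simplify."""
    out = []
    for eq in equations:
        hits = [a.value for a in atoms if a.subject.split(".")[0] in eq]
        out.append(eq + ("   # simplified under: " + ", ".join(hits) if hits else ""))
    return out

print("\n".join(apply_assumptions(
    ["d(reactor.T)/dt = Q_in - Q_loss", "d(tank.V)/dt = F_in - F_out"], composite)))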
Abstract:
Objective-To test the hypothesis that telemedicine for new patient referrals to neurological outpatients is as efficient and acceptable as conventional face to face consultation. Methods-A randomised controlled trial between two groups: face to face (FF) and telemedicine (TM). The study was carried out between a neurological centre and outlying clinics at two distant hospitals linked by identical medium-cost commercial interactive video conferencing equipment with ISDN lines transmitting information at 384 kbit/s. The same two neurologists carried out both arms of the study. Of the 168 patients who were suitable for the study, 86 were randomised into the telemedicine group and 82 into the face to face group. Outcome measures were (1) consultation process: (a) number of investigations; (b) number of drugs prescribed; (c) number of patient reviews; and (2) patient satisfaction: (a) confidence in the consultation; (b) technical aspects of the consultation; (c) aspects surrounding confidentiality. Diagnostic categories were also measured to check equivalence between the groups: these were structural neurological, structural non-neurological, non-structural, and uncertain. Results-Diagnostic categories were similar (p>0.5) between the two groups. Patients in the telemedicine group had significantly more investigations (p=0.001). There was no difference in the number of drugs prescribed (p>0.5). Patients were generally satisfied with both types of consultation except for concerns about confidentiality and embarrassment in the telemedicine group (p=0.017 and p=0.005 respectively). Conclusion-Telemedicine for new neurological outpatients is possible and feasible, but generates more investigations and is less well accepted than face to face examination.
Abstract:
Genetic research on risk of alcohol, tobacco or drug dependence must make allowance for the partial overlap between risk factors for initiation of use and risk factors for dependence or other outcomes in users. Except in the extreme cases where genetic and environmental risk factors for initiation and dependence overlap completely or are uncorrelated, there is no consensus about how best to estimate the magnitude of genetic or environmental correlations between Initiation and Dependence in twin and family data. We explore by computer simulation the biases in estimates of genetic and environmental parameters caused by model misspecification when Initiation can only be defined as a binary variable. For plausible simulated parameter values, the two-stage genetic models that we consider yield estimates of genetic and environmental variances for Dependence that, although biased, are not very discrepant from the true values. However, estimates of genetic (or environmental) correlations between Initiation and Dependence may be seriously biased, and may differ markedly under different two-stage models. Such estimates may have little credibility unless external data favor selection of one particular model. These problems can be avoided if Initiation can be assessed as a multiple-category variable (e.g. never versus early-onset versus later-onset user), with at least two categories measurable in users at risk for dependence. Under these conditions, and under certain distributional assumptions, recovery of the simulated genetic and environmental correlations becomes possible. Illustrative application of the model to Australian twin data on smoking confirmed substantial heritability of smoking persistence (42%) with minimal overlap with genetic influences on initiation.
Abstract:
Comparative phylogeography has proved useful for investigating biological responses to past climate change and is strongest when combined with extrinsic hypotheses derived from the fossil record or geology. However, the rarity of species with sufficient, spatially explicit fossil evidence restricts the application of this method. Here, we develop an alternative approach in which spatial models of predicted species distributions under serial paleoclimates are compared with a molecular phylogeography, in this case for a snail endemic to the rainforests of North Queensland, Australia. We also compare the phylogeography of the snail to those from several endemic vertebrates and use consilience across all of these approaches to enhance biogeographical inference for this rainforest fauna. The snail mtDNA phylogeography is consistent with predictions from paleoclimate modeling in relation to the location and size of climatic refugia through the late Pleistocene-Holocene and broad patterns of extinction and recolonization. There is general agreement between quantitative estimates of population expansion from sequence data (using likelihood and coalescent methods) vs. distributional modeling. The snail phylogeography represents a composite of both common and idiosyncratic patterns seen among vertebrates, reflecting the geographically finer scale of persistence and subdivision in the snail. In general, this multifaceted approach, combining spatially explicit paleoclimatological models and comparative phylogeography, provides a powerful approach to locating historical refugia and understanding species' responses to them.
Abstract:
Mangroves are often described as a group of plants with common features and common origins based mostly on their broad distributional patterns, together with an erroneous view of comparable abilities in long-distance dispersal. However, whilst mangroves have common needs to adapt to rigorous environmental constraints associated with regular seawater inundation, individual taxa have developed different strategies and characteristics. Since mangroves are a genetically diverse group of mostly flowering plants, they may also have evolved at quite different geological periods, dispersed at different rates from different locations and developed different adaptive strategies. Current distributions of individual taxa show numerous instances of unusual extant distribution which demonstrate finite dispersal limitations, especially across open water. Our preliminary assessment of broad distribution and discontinuities reveals important patterns. Discontinuities, in the absence of current dispersal barriers, may be explained by persistent past barriers. As we learn more about discontinuities, we are beginning to appreciate their immense implications and what they might tell us about past geological conditions and how these might have influenced the distribution and evolution of mangroves. In this article, we describe emerging patterns in genetic relationships and distributions based on both current knowledge and preliminary results of our studies of molecular and morphometric characteristics of Rhizophora species in the Indo West Pacific region.
Abstract:
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints. (C) 2002 Elsevier Science B.V. All rights reserved.
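For orientation, the standard weakest-precondition reading of refinement and equivalence referred to in this abstract can be written as follows (a generic sketch; the paper's real-time ordering adds timing constraints not reproduced here):

$$ S \sqsubseteq T \;\iff\; \forall Q:\ \mathrm{wp}(S,Q) \Rightarrow \mathrm{wp}(T,Q), \qquad S \equiv T \;\iff\; S \sqsubseteq T \,\wedge\, T \sqsubseteq S. $$

On this reading, a compilation step is correct when the generated code T refines its source S, and equivalence-preserving when the refinement holds in both directions.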
Abstract:
Objective. Outcome assessment in clinical trials using the Western Ontario and McMaster University (WOMAC 3.0) Osteoarthritis Index is traditionally achieved through self-administration of the Index. However, in other areas of clinical measurement, telephone administration has been shown to be a reliable method of acquiring data that are both accurate and complete. To address this issue in knee osteoarthritis (OA), we conducted a comparative study of interviewer-administered telephone completion of the WOMAC LK3.0 versus onsite self-completion at the hospital. Methods. Fifty consenting patients with knee OA were randomized to complete the WOMAC LK3.0 Index by telephone interview one day, followed by onsite completion the following day, or vice versa. Neither patients nor interviewers had access to any prior scores. Results. The mean age of the 50 patients was 66.3 years (range 44-82); 34 (68%) were female and 16 (32%) male. There was excellent agreement between the mean office and telephone scores, with mean differences for the WOMAC LK3.0 pain, stiffness, and function subscale scores and total score of 0.09, 0.12, 0.78, and 0.98, respectively. These differences were well within the respective protocol-defined equivalence criteria of +/- 1.7, +/- 0.9, +/- 6.4, and +/- 9.1, and represented differences from office scores of 0.9, 2.6, 2.4, and 2.2%, respectively. Conclusion. The use of telephone interviews for the WOMAC LK3.0 Index is a valid method of obtaining OA outcome measurements. These observations have important implications for designing data acquisition strategies for future OA clinical trials and for long-term observational studies.
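A small Python sketch of the protocol-defined equivalence check implied by these numbers; the mean differences and margins are those quoted in the abstract, while the variable names are illustrative:

# Mean telephone-vs-office differences and protocol-defined equivalence margins quoted above.
subscales = {
    "pain":      (0.09, 1.7),
    "stiffness": (0.12, 0.9),
    "function":  (0.78, 6.4),
    "total":     (0.98, 9.1),
}

for name, (diff, margin) in subscales.items():
    # Equivalence is declared when the observed difference lies within +/- margin.
    verdict = "within" if abs(diff) <= margin else "outside"
    print(f"{name:9s}: |{diff:.2f}| vs +/-{margin} -> {verdict} the equivalence margin")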
Abstract:
The retinoid-related orphan receptor-alpha (RORalpha) is a member of the ROR subfamily of orphan receptors and acts as a constitutive activator of transcription in the absence of exogenous ligands. To understand the basis of this activity, we constructed a homology model of RORalpha using the closely related TRbeta as a template. Molecular modeling suggested that bulky hydrophobic side chains occupy the RORalpha ligand cavity, leaving a small but distinct cavity that may be involved in receptor stabilization. This model was subjected to docking simulation with a receptor-interacting peptide from the steroid receptor coactivator GR-interacting protein-1, which delineated a coactivator binding surface consisting of the signature motif spanning helices 3-5 and helix 12 [activation function 2 (AF2)]. Probing this surface with scanning alanine mutagenesis showed structural and functional equivalence between homologous residues of RORalpha and TRbeta. This was surprising (given that RORalpha is a ligand-independent activator, whereas TRbeta has an absolute requirement for ligand) and prompted us to use molecular modeling to identify differences between RORalpha and TRbeta in the way that the AF2 helix interacts with the rest of the receptor. Modeling highlighted a nonconserved amino acid in helix 11 of RORalpha (Phe491) and a short length of 3(10) helix at the N terminus of AF2, which we suggest 1) ensures that AF2 is locked permanently in the holoconformation described for other liganded receptors and thus 2) enables ligand-independent recruitment of coactivators. Consistent with this, mutation of RORalpha Phe491 to either methionine or alanine (methionine is the homologous residue in TRbeta) reduced and ablated transcriptional activation and recruitment of coactivators, respectively. Furthermore, we were able to reconstitute transcriptional activity for both a deletion mutant of RORalpha lacking AF2 and the Phe491Met mutant by overexpression of a GAL-AF2 fusion protein, demonstrating ligand-independent recruitment of AF2 and a role for Phe491 in recruiting AF2.
Abstract:
Let X and Y be Hausdorff topological vector spaces, K a nonempty, closed, and convex subset of X, and C : K --> 2^Y a point-to-set mapping such that for any x in K, C(x) is a pointed, closed, and convex cone in Y with nonempty interior (int C(x) not equal to the empty set). Given a mapping g : K --> K and a vector-valued bifunction f : K x K --> Y, we consider the implicit vector equilibrium problem (IVEP) of finding x* in K such that f(g(x*), y) is not an element of -int C(x) for all y in K. This problem generalizes the (scalar) implicit equilibrium problem and the implicit variational inequality problem. We propose the dual of the implicit vector equilibrium problem (DIVEP) and establish the equivalence between (IVEP) and (DIVEP) under certain assumptions. We also give characterizations of the set of solutions of (IVEP) in the cases of nonmonotonicity, weak C-pseudomonotonicity, C-pseudomonotonicity, and strict C-pseudomonotonicity, respectively. Under these assumptions, we conclude that the sets of solutions are nonempty, closed, and convex. Finally, we give some applications of (IVEP) to vector variational inequality problems and vector optimization problems. (C) 2003 Elsevier Science Ltd. All rights reserved.
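In display form, and reading the cone as evaluated at the solution point (i.e. taking C(x) above to mean C(x*)), the problem stated in this abstract is:

$$ \text{(IVEP)}\qquad \text{find } x^{*} \in K \text{ such that } f\big(g(x^{*}), y\big) \notin -\operatorname{int} C(x^{*}) \quad \text{for all } y \in K. $$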
Abstract:
The influence of near-bed sorting processes on heavy mineral content in suspension is discussed. Sediment concentrations above a rippled bed of mixed quartz and heavy mineral sand were measured under regular nonbreaking waves in the laboratory. Under the traditional gradient diffusion model, settling velocity would be expected to strongly affect the sediment distribution; this was not observed in the present trials. In fact, the vertical gradients of the time-averaged suspension concentrations were found to be similar for the light and heavy minerals, despite their different settling velocities. This behavior implies a convective rather than diffusive distribution mechanism. Between the nonmoving bed and the lowest suspension sampling point, light and heavy mineral concentrations differ by two orders of magnitude. This discrimination against the heavy minerals in the pickup process is due largely to selective entrainment at the ripple face. Bed-form dynamics and the nature of the quartz suspension profiles are found to be little affected by the trialed proportion of heavy minerals in the bed (3.8-22.1%).
Abstract:
This work presents the concepts of translation in a broad sense, breaking with the view of translation as the transformation of a message from one linguistic code into another in which only literal equivalence is sought. Drawing on theorists of postcolonial and deconstructionist theory such as Bhabha (2010), Hall (2006), Derrida (2006), Ottoni (2005), Orlandi (2008) and Niranjana (2011), it aims to discuss the possibility of repositioning translation, understanding it as a tool capable of deconstructing dominant paradigms and thus retelling histories, constituting plastic bridges. Translation is therefore considered in its plasticity, promoting bridges that manifest the crossings that exist between languages and showing the human being as a crossing of multiple subjects. As the empirical object of analysis, we use three entries from the Wikipédia site in which traces of Brazilian identity are involved, and we investigate, through stereoscopic reading and the Relevance Theory proposed by Sperber & Wilson (2001), the nuances of the translation processes that may contribute to how Brazil is represented by both local and foreign communities, leading to the formation of stereotypes. The research proposed here is relevant because it studies a corpus that has been little explored academically but is socially in vogue, since Wikipédia is currently among the ten most visited websites, with more than 300 million unique visits, in addition to its importance as a source of collective construction of thought that reinvents the concept of the encyclopaedia. The research is also relevant from the perspective of translation studies, since it discusses the importance of the translator as a tool for the formation of national, cultural and social identities, and proposes the hybrid concept of translation-review (tradução-resenha) as a tendency in the contemporary multilingual world.
Abstract:
In everyday life, people know that there are various financial investment options, but most do not have the knowledge needed to choose the best investment for their money. As a result, they often end up investing only in savings accounts, perhaps because of their simplicity or popularity. In this work we aim to show the importance of Financial Mathematics for understanding investments. Through an approach based on Geometric and Arithmetic Progressions and, consequently, on Financial Mathematics, several financial investments are presented, with their concepts, calculations and activities showing how to compare them. The work therefore describes concepts and properties of Geometric and Arithmetic Progressions and of Financial Mathematics and, as an application of this content, presents the concept and calculation of several rates (rate equivalence, effective and nominal rates, pre-fixed and post-fixed rates, variable rates, the reference rate), as well as some financial investments (savings accounts, CDB, LCI). To consolidate this study, at the end of the work we present a proposal of classroom activities that involve calculating the returns of these investments, showing the student how to calculate and compare the returns.
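As a worked illustration of the rate-equivalence concept mentioned in this abstract (a standard compound-interest identity, not a result specific to the dissertation), two rates are equivalent when they yield the same accumulated value over the same time span:

$$ (1 + i_{\text{month}})^{12} = 1 + i_{\text{year}}, \qquad \text{e.g. } i_{\text{month}} = 1\% \;\Rightarrow\; i_{\text{year}} = (1.01)^{12} - 1 \approx 12.68\%. $$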
Abstract:
INTRODUCTION: Morbidity information is easily available from medical records, but its scope is limited to the population attended by the health services. Information on the prevalence of diseases requires community surveys, which are not always feasible. These two sources of information represent two alternative assessments of disease occurrence, namely demand morbidity and perceived morbidity. The present study was conceived so as to elicit a potential relationship between them, so that the former could be used in the absence of the latter. METHODS: A community of 13,365 families on the outskirts of S. Paulo, Brazil, was studied during the period from 15/Nov/1994 to 15/Jan/1995. Data regarding children less than 5 years old were collected from a household survey and from the 2 basic health units in the area. Prevalence of diseases was ascertained from perceived morbidity and compared to estimates computed from demand morbidity. RESULTS: Data analysis distinguished 2 age groups: infants less than 1 year old and children aged 1 to less than 5. The most important groups of diseases were respiratory diseases, diarrhoea, skin problems, and infectious and parasitic diseases. Basic health units presented better coverage for infants. Though disease frequencies were not different within or outside these units, better coverage was found for diarrhoea and infectious and parasitic diseases in the infant group, and for diarrhoea in the older age group. Equivalence between the two types of morbidity was found to be limited to the infant group and concerned only the best covered diseases. The odds of a disease being seen at the health service should be at least 4:10 to ensure this equivalence. CONCLUSION: It was concluded that, provided that health service coverage is good, demand morbidity can be taken as a reliable estimate of community morbidity.
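For reference, if the 4:10 threshold is read as odds of a case being seen at the health service, the corresponding probability is

$$ p = \frac{4}{4 + 10} \approx 0.29, $$

i.e. roughly three in ten cases in the community would need to reach the service for demand morbidity to track community morbidity.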