995 results for Universal model
Abstract:
This thesis presents a universal model of documents and deltas. The model formalizes what it means to find differences between documents and provides a single shared formalization that any algorithm can use to describe the differences found between any kind of comparable documents. The main scientific contribution of this thesis is a universal delta model that can be used to represent the changes found by an algorithm. The main parts of this model are the formal definitions of changes (the pieces of information that record that something has changed), operations (the definitions of the kinds of change that happened) and deltas (coherent summaries of what has changed between two documents). The fundamental mechanism that makes the universal delta model a very expressive tool is the use of encapsulation relations between changes. In the universal delta model, changes are not always simple records of what has changed; they can also be combined into more complex changes that reflect the detection of more meaningful modifications. In addition to the main entities (i.e., changes, operations and deltas), the model also describes and defines documents and the concept of equivalence between documents. As a corollary to the model, there is also an extensible catalog of possible operations that algorithms can detect, used to create a common library of operations, and a UML serialization of the model, useful as a reference when implementing APIs that deal with deltas. The universal delta model presented in this thesis acts as the formal groundwork upon which algorithms can be based and libraries can be implemented. It removes the need to create a new delta model and terminology whenever a new algorithm is devised. It also alleviates the problems that toolmakers face when adapting their software to new diff algorithms.
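The entities the abstract names (changes, operations, deltas, encapsulation) can be sketched as a small data model. This is my own illustrative rendering, not the thesis's formal definitions or its UML serialization; all names and fields here are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Operation:
    # the kind of change that happened (e.g. text inserted, subtree moved)
    name: str

@dataclass
class Change:
    # a record that something changed: which operation, where, and which
    # finer-grained changes this one encapsulates (empty for atomic changes)
    operation: Operation
    location: str
    encapsulates: List["Change"] = field(default_factory=list)

@dataclass
class Delta:
    # a coherent summary of what changed between two documents
    source: str
    target: str
    changes: List[Change] = field(default_factory=list)

# encapsulation example: a detected "move" built from a delete and an insert
delete = Change(Operation("delete-text"), "/body/p[1]")
insert = Change(Operation("insert-text"), "/body/p[3]")
move = Change(Operation("move-text"), "/body", encapsulates=[delete, insert])
delta = Delta("doc-v1", "doc-v2", changes=[move])
```

The point of the encapsulation relation is visible in the example: the two atomic changes survive inside the more meaningful composite change rather than being discarded.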
Abstract:
We derive a universal model for atom pairs interacting with non-resonant light via the polarizability anisotropy, based on the long-range properties of the scattering. The corresponding dynamics can be obtained using a nodal line technique to solve the asymptotic Schrödinger equation, which consists of imposing physical boundary conditions at long range and requiring the wavefunction to vanish at a position separating the inner zone from the asymptotic region. We show that nodal lines which depend on the intensity of the non-resonant light can satisfactorily account for the effect of the polarizability at short range. The approach allows the resonance structure, energy, width, channel mixing and hybridization to be determined even for narrow resonances.
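The essence of the nodal line boundary condition can be shown with a deliberately simplified toy (my construction, not the paper's coupled-channel calculation): a free radial equation u'' = -E u is shot outward from a node at r0, and the energy levels shift as the node position (which in the paper depends on the light intensity) moves:

```python
import math

def shoot(E, r0, rmax, n=4000):
    # integrate u'' = -E u outward from the nodal line, u(r0)=0, u'(r0)=1
    h = (rmax - r0) / n
    u, du = 0.0, 1.0
    for _ in range(n):
        # midpoint (RK2) step for the coupled first-order system
        ku = du + 0.5 * h * (-E * u)
        kd = -E * (u + 0.5 * h * du)
        u += h * ku
        du += h * kd
    return u  # u(rmax); an eigenvalue makes this vanish

def lowest_level(r0, rmax, lo=1.0, hi=20.0):
    # bisection on the energy for a sign change of u(rmax)
    flo = shoot(lo, r0, rmax)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        fmid = shoot(mid, r0, rmax)
        if flo * fmid <= 0:
            hi = mid
        else:
            lo, flo = mid, fmid
    return 0.5 * (lo + hi)
```

Moving the node inward (larger inner zone absorbed) lowers the level, mimicking how an intensity-dependent nodal line shifts the resonance spectrum.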
Abstract:
The transcriptome is the readout of the genome. Identifying common features in it across distant species can reveal fundamental principles. To this end, the ENCODE and modENCODE consortia have generated large amounts of matched RNA-sequencing data for human, worm and fly. Uniform processing and comprehensive annotation of these data allow comparison across metazoan phyla, extending beyond earlier within-phylum transcriptome comparisons and revealing ancient, conserved features. Specifically, we discover co-expression modules shared across animals, many of which are enriched in developmental genes. Moreover, we use expression patterns to align the stages in worm and fly development and find a novel pairing between worm embryo and fly pupae, in addition to the embryo-to-embryo and larvae-to-larvae pairings. Furthermore, we find that the extent of non-canonical, non-coding transcription is similar in each organism, per base pair. Finally, we find in all three organisms that the gene-expression levels, both coding and non-coding, can be quantitatively predicted from chromatin features at the promoter using a 'universal model' based on a single set of organism-independent parameters.
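The closing "universal model" result can be illustrated with a toy sketch (my own construction, not the consortium's pipeline): a single regression with one shared, organism-independent parameter set is fit on promoter chromatin signals pooled across organisms. The organisms' data here are synthetic and the single-feature form is an assumption for brevity:

```python
import random

def fit_line(xs, ys):
    # ordinary least squares for y = a + b*x (one shared parameter set)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

rng = random.Random(0)
pooled_x, pooled_y = [], []
for organism in ("human", "worm", "fly"):
    # synthetic promoter chromatin signal and log-expression per gene
    xs = [rng.uniform(0.0, 5.0) for _ in range(200)]
    ys = [0.5 + 2.0 * x + rng.gauss(0.0, 0.2) for x in xs]
    pooled_x += xs
    pooled_y += ys

a, b = fit_line(pooled_x, pooled_y)  # one organism-independent fit
```

The same (a, b) pair would then be applied to each organism separately, which is what "a single set of organism-independent parameters" amounts to operationally.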
Abstract:
In this work the G_A^0 distribution is assumed as the universal model for amplitude Synthetic Aperture Radar (SAR) image data under the multiplicative model. The observed data are therefore assumed to obey a G_A^0(alpha, gamma, n) law, where the parameter n is related to the speckle noise, and (alpha, gamma) are related to the ground truth, giving information about the background. Maps generated by estimating (alpha, gamma) at each coordinate can therefore be used as input for classification methods. Maximum likelihood estimators are derived and used to form estimated parameter maps. This estimation can be hampered by the presence of corner reflectors, man-made objects used to calibrate SAR images that produce large return values. To alleviate this contamination, robust (M) estimators are also derived for the universal model. Gaussian maximum likelihood classification is used to obtain maps from challenging simulated data, and the superiority of robust estimation is quantitatively assessed.
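The actual G_A^0 maximum likelihood and M-estimators involve special-function likelihoods beyond an abstract's scope; the sketch below only illustrates, on a toy windowed image, why a robust local estimator resists corner-reflector contamination (the median stands in for a proper M-estimator, the mean for a non-robust one):

```python
import statistics

def parameter_map(image, win=3, robust=False):
    # slide a win x win window over the image and estimate a local
    # scale parameter from the amplitudes inside the window
    h, w, half = len(image), len(image[0]), win // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [image[a][b]
                    for a in range(max(0, i - half), min(h, i + half + 1))
                    for b in range(max(0, j - half), min(w, j + half + 1))]
            out[i][j] = statistics.median(vals) if robust else statistics.mean(vals)
    return out
```

On a homogeneous patch with one bright corner reflector, the mean-based map is inflated everywhere near the reflector while the median-based map recovers the background value.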
Abstract:
The shifts in the four-body recombination peaks, due to an effective range correction to the zero-range model close to the unitary limit, are obtained and used to extract the corresponding effective range of a given atomic system. The approach is applied to an ultracold gas of cesium atoms close to broad Feshbach resonances, where deviations of experimental values from universal model predictions are associated with effective range corrections. The effective range correction is extracted as a weighted average of 3.9 ± 0.8 R_vdW, where R_vdW is the van der Waals length scale, which is consistent with the van der Waals potential tail of the Cs2 system. The method can be applied generally to other cold atom experimental setups to determine the contribution of the effective range to the tetramer dissociation position. © 2013 American Physical Society.
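A weighted average of the kind quoted (3.9 ± 0.8 R_vdW) is conventionally an inverse-variance combination. A minimal sketch, with per-resonance input values that are hypothetical placeholders, not the paper's data:

```python
def weighted_average(values):
    # inverse-variance weighted mean of (value, sigma) pairs, with the
    # usual 1/sqrt(sum of weights) uncertainty on the combination
    weights = [1.0 / sigma ** 2 for _, sigma in values]
    mean = sum(w * v for (v, _), w in zip(values, weights)) / sum(weights)
    return mean, sum(weights) ** -0.5
```

Each extraction contributes in proportion to 1/sigma^2, so the better-determined resonances dominate the combined effective range.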
Abstract:
Competency-based management rose to prominence among Brazilian public organizations through Decree No. 5.707/2006, as an instrument of the development policy for civil servants of the direct, autarchic and foundational public administration, emphasizing training oriented toward developing the set of knowledge, skills and attitudes needed for servants to perform their duties, with a view to achieving the institution's objectives. However, since 2001, according to a report by the National School of Public Administration (ENAP), sixteen public organizations had already been applying precepts of competency-based management, using, beyond training, other people-management processes foreseen in the theory of competency-based management. Using Organizational Development Theory, with a comparative method, a multiple case study strategy and documentary analysis covering a ten-year period, of three organizations from the ENAP report group, CEF and TCU, the following hypotheses were confirmed: each organization chose the procedure best suited to its structure and organizational culture to implement competency-based management; the implementation proceeded in step with the respective planning areas; and, to implement changes using competency-based management, it is not necessary to complete the mapping of the individual competencies of all of an organization's servants. Changes can begin once the mission, values, strategic objectives and vision of the future (the mapping of organizational competencies) have been defined. The conclusion is that there is no universal model for implementing competency-based management, since each organization is influenced differently by politics, climate and organizational culture.
Abstract:
Pumping tests are, without doubt, among the most reliable and most valuable tests performed in the physical environment. They are not strictly point-scale tests: since pumping draws flow from distances far from the well, the test has excellent spatial representativeness. Interpretation methods based on pumping tests began to be developed in the first half of the last century. Pumping tests allow the transmissivity and storage coefficient of aquifer formations to be calculated, and they provide information on the type of aquifer, the construction quality of the extraction well, and the existence of nearby impermeable barriers or recharge boundaries; in some circumstances they even allow the area of the underground reservoir to be calculated. Since the mid-20th century there has been an effective and abundant range of analytical methods for interpreting pumping tests, both in steady-state and in transient regimes. These methods are widely known and have been extensively applied in many countries; nowadays, however, flow models could be used for the interpretation, achieving the same reliability and even better analysis possibilities. Many tests that cannot be interpreted because the configuration of the medium is too complex, and for which analytical methods are not available or cannot be developed, adapt well and are sometimes very easily solved using numerical flow simulation methods. This thesis has sought a way to interpret pumping tests using flow simulation models. The general-purpose MODFLOW model of the United States Geological Survey is used, in which a simulation cell and mesh particularly suited to the problem at hand are configured and validated against the existing analytical methods. With the cell suitably validated, other cases are simulated for which no analytical methods have been developed, given the complexity of the physical medium, and the appropriate conclusions are drawn. Finally, a specific model and the corresponding general-purpose application are developed for the numerical interpretation of pumping tests, with both ordinary and complex configurations of the physical medium.
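The classical transient analytical method such a workflow validates against is the Theis solution, s = Q/(4*pi*T) * W(u) with u = r^2*S/(4*T*t). A minimal sketch using the standard series for the well function W(u); the parameter values in the test are hypothetical, not from the thesis:

```python
import math

def well_function(u, terms=30):
    # series for the Theis well function, valid for moderate u:
    # W(u) = -gamma - ln(u) + sum_{k>=1} (-1)^(k+1) * u^k / (k * k!)
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = -gamma - math.log(u)
    term = 1.0
    for k in range(1, terms + 1):
        term *= -u / k        # term = (-u)^k / k!
        total -= term / k     # adds (-1)^(k+1) * u^k / (k * k!)
    return total

def theis_drawdown(Q, T, S, r, t):
    # drawdown at radius r, time t, for pumping rate Q,
    # transmissivity T and storage coefficient S
    u = r * r * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)
```

Fitting observed drawdowns to this curve yields T and S; a numerical model reproduces the same curve for simple configurations and keeps working where the analytical assumptions break down.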
Abstract:
Networked Learning, e-Learning and Technology Enhanced Learning have each been defined in different ways, as people's understanding about technology in education has developed. Yet each could also be considered as a terminology competing for a contested conceptual space. Theoretically this can be a ‘fertile trans-disciplinary ground for represented disciplines to affect and potentially be re-orientated by others’ (Parchoma and Keefer, 2012), as differing perspectives on terminology and subject disciplines yield new understandings. Yet when used in government policy texts to describe connections between humans, learning and technology, terms tend to become fixed in less fertile positions linguistically. A deceptively spacious policy discourse that suggests people are free to make choices conceals an economically-based assumption that implementing new technologies, in themselves, determines learning. Yet it actually narrows choices open to people as one route is repeatedly in the foreground and humans are not visibly involved in it. An impression that the effective use of technology for endless improvement is inevitable cuts off critical social interactions and new knowledge for multiple understandings of technology in people's lives. This paper explores some findings from a corpus-based Critical Discourse Analysis of UK policy for educational technology during the last 15 years, to help to illuminate the choices made. This is important when through political economy, hierarchical or dominant neoliberal logic promotes a single ‘universal model’ of technology in education, without reference to a wider social context (Rustin, 2013). Discourse matters, because it can ‘mould identities’ (Massey, 2013) in narrow, objective economically-based terms which 'colonise discourses of democracy and student-centredness' (Greener and Perriton, 2005:67). 
This undermines subjective social, political, material and relational (Jones, 2012: 3) contexts for those learning when humans are omitted. Critically confronting these structures is not considered a negative activity. Whilst deterministic discourse for educational technology may leave people unconsciously restricted, I argue that, through a close analysis, it offers a deceptively spacious theoretical tool for debate about the wider social and economic context of educational technology. Methodologically it provides insights about ways technology, language and learning intersect across disciplinary borders (Giroux, 1992), as powerful, mutually constitutive elements, ever-present in networked learning situations. In sharing a replicable approach for linguistic analysis of policy discourse I hope to contribute to visions others have for a broader theoretical underpinning for educational technology, as a developing field of networked knowledge and research (Conole and Oliver, 2002; Andrews, 2011).
Abstract:
A stylized macroeconomic model is developed with an indebted, heterogeneous investment banking sector funded by borrowing from a retail banking sector. The government guarantees retail deposits. Investment banks choose how risky their activities should be. We compare the benefits of separated versus universal banking, modelled as a vertical integration of the retail and investment banks. The incidence of banking default is considered under different constellations of shocks and degrees of competitiveness. The benefits of universal banking rise with the volatility of idiosyncratic shocks to trading strategies and remain positive even for very bad common shocks, even though government bailouts, which are costly, are larger than in the case of separated banking entities. The welfare assessment of the structure of banks may depend crucially on the kinds of shock hitting the economy as well as on the efficiency of government intervention.
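The diversification mechanism behind the idiosyncratic-volatility result can be sketched in a toy Monte Carlo (my own caricature, not the paper's model): pooling two trading books averages out independent shocks, so the merged bank breaches its capital buffer less often, and the advantage vanishes when only common shocks remain:

```python
import random

def default_rates(sigma_idio, sigma_common=0.1, buffer=0.2,
                  trials=20000, seed=1):
    # Monte Carlo: two trading books hit by a common shock plus
    # independent idiosyncratic shocks; a bank defaults when its
    # return eats through its capital buffer
    rng = random.Random(seed)
    separated = universal = 0
    for _ in range(trials):
        common = rng.gauss(0.0, sigma_common)
        r1 = common + rng.gauss(0.0, sigma_idio)
        r2 = common + rng.gauss(0.0, sigma_idio)
        separated += (r1 < -buffer) + (r2 < -buffer)  # each stands alone
        universal += 2 * (0.5 * (r1 + r2) < -buffer)  # pooled books
    return separated / (2 * trials), universal / (2 * trials)
```

All parameter values here are arbitrary; the qualitative point is only that the universal default rate falls below the separated one as idiosyncratic volatility grows.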
Abstract:
Agro-hydrological models have been widely used for optimizing resource use and minimizing environmental consequences in agriculture. SMCRN is a recently developed sophisticated model which simulates crop response to nitrogen fertilizer for a wide range of crops, and the associated leaching of nitrate from arable soils. In this paper, we describe an improvement to this model: the existing approximate hydrological cascade algorithm is replaced with a new simple and explicit algorithm for the basic soil water flow equation, which not only enhances the model's hydrological simulation performance but is also essential for extending the model to situations where capillary flow is important. As a result, the updated SMCRN model can be used for more accurate study of water dynamics in the soil-crop system. The success of the update is demonstrated by the simulation results: the updated model consistently out-performed the original model in drainage simulations and in predicting the time course of soil water content in different layers of the soil-wheat system. Tests of the updated SMCRN model against data from four field crop experiments showed that crop nitrogen offtakes and soil mineral nitrogen in the top 90 cm were in good agreement with the measured values, indicating that the model can make more reliable predictions of nitrogen fate in the crop-soil system, and thus provides a useful platform for assessing the impacts of nitrogen fertilizer on crop yield and nitrogen leaching in different production systems. (C) 2010 Elsevier B.V. All rights reserved.
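The abstract does not give the new explicit algorithm, but the general shape of an explicit soil water update can be sketched with a toy diffusion-form step (my simplification; the real equation has moisture-dependent conductivity and sink terms). Explicit schemes of this kind are stable only when dt <= dz^2 / (2*D):

```python
def step_soil_water(theta, D, dz, dt):
    # one explicit finite-difference update of a diffusion-form soil
    # water equation on a 1-D profile with zero-flux boundaries
    n = len(theta)
    new = list(theta)
    for i in range(n):
        left = theta[i - 1] if i > 0 else theta[i]        # zero-flux top
        right = theta[i + 1] if i < n - 1 else theta[i]   # zero-flux bottom
        new[i] = theta[i] + dt * D * (left - 2.0 * theta[i] + right) / dz ** 2
    return new
```

With zero-flux boundaries the scheme conserves total water exactly while smoothing moisture gradients between layers, which is the behaviour a drainage test would check.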
Abstract:
Future land cover will have a significant impact on climate and is strongly influenced by the extent of agricultural land use. Differing assumptions of crop yield increase and carbon pricing mitigation strategies affect projected expansion of agricultural land in future scenarios. In the representative concentration pathway 4.5 (RCP4.5) from phase 5 of the Coupled Model Intercomparison Project (CMIP5), the carbon effects of these land cover changes are included, although the biogeophysical effects are not. The afforestation in RCP4.5 has important biogeophysical impacts on climate, in addition to the land carbon changes, which are directly related to the assumption of crop yield increase and the universal carbon tax. To investigate the biogeophysical climatic impact of combinations of agricultural crop yield increases and carbon pricing mitigation, five scenarios of land-use change based on RCP4.5 are used as inputs to an earth system model [Hadley Centre Global Environment Model, version 2-Earth System (HadGEM2-ES)]. In the scenario with the greatest increase in agricultural land (as a result of no increase in crop yield and no climate mitigation) there is a significant -0.49 K worldwide cooling by 2100 compared to a control scenario with no land-use change. Regional cooling is up to -2.2 K annually in northeastern Asia. Including carbon feedbacks from the land-use change gives a small global cooling of -0.067 K. This work shows that there are significant impacts from biogeophysical land-use changes caused by assumptions of crop yield and carbon mitigation, which mean that land carbon is not the whole story. It also elucidates the potential conflict between cooling from biogeophysical climate effects of land-use change and wider environmental aims.
Abstract:
We discuss the thermal dependence of the zero-bias electrical conductance for a quantum dot embedded in a quantum wire, or side-coupled to it. In the Kondo regime, the temperature-dependent conductances map linearly onto the conductance for the symmetric Anderson Hamiltonian. The mapping fits accurately numerical renormalization-group results for the conductance in each geometry. In the side-coupled geometry, the conductance is markedly affected by a gate potential applied to the wire; in the embedded geometry, it is not. © 2010 IOP Publishing Ltd.
Abstract:
Within the general characteristics of low-energy few-body systems, we review some well-known correlations found in nuclear physics, and the properties of low-mass halo nuclei in a three-body neutron-neutron-core model. In this context, near the critical conditions for the occurrence of an Efimov state, we report some results obtained for neutron-19C elastic scattering. © 2010 American Institute of Physics.