57 results for ETL Conceptual and Logical Modeling
in CentAUR: Central Archive University of Reading - UK
Abstract:
Investigation of preferred structures of planetary wave dynamics is addressed using multivariate Gaussian mixture models. The number of components in the mixture is obtained using order statistics of the mixing proportions, hence avoiding previous difficulties related to sample sizes and independence issues. The method is first applied to a few low-order stochastic dynamical systems and data from a general circulation model. The method is next applied to winter daily 500-hPa heights from 1949 to 2003 over the Northern Hemisphere. A spatial clustering algorithm is first applied to the leading two principal components (PCs) and shows significant clustering. The clustering is particularly robust for the first half of the record and less so for the second half. The mixture model is then used to identify the clusters. Two highly significant extratropical planetary-scale preferred structures are obtained within the state space spanned by the first two to four EOFs. The first pattern shows a Pacific-North American (PNA) pattern and a negative North Atlantic Oscillation (NAO), and the second pattern is nearly opposite to the first one. It is also observed that some subspaces show multivariate Gaussianity, compatible with linearity, whereas others show multivariate non-Gaussianity. The same analysis is also applied to two subperiods, before and after 1978, and shows a similar regime behavior, with slightly stronger support in the first subperiod. In addition, a significant regime shift is observed between the two periods, as well as a change in the shape of the distribution. The patterns associated with the regime shifts reflect essentially a PNA pattern and an NAO pattern, consistent with the observed global warming effect on climate and the observed shift in sea surface temperature around the mid-1970s.
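A minimal sketch of the pipeline this abstract describes, assuming the height field has already been assembled into an anomaly matrix. The placeholder data, the use of scikit-learn, and the BIC-based order selection (a common, simpler stand-in for the paper's order-statistics-of-mixing-proportions criterion) are all illustrative, not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Placeholder stand-in for daily 500-hPa height anomalies, shape (days, gridpoints).
rng = np.random.default_rng(0)
heights = rng.standard_normal((5000, 200))

# Reduce to the leading two principal components, as in the abstract.
pcs = PCA(n_components=2).fit_transform(heights)

# Fit mixtures of increasing order; the paper selects the number of components
# from order statistics of the mixing proportions, for which BIC serves here
# as a simpler stand-in.
for k in (1, 2, 3, 4):
    gm = GaussianMixture(n_components=k, n_init=10, random_state=0).fit(pcs)
    print(k, gm.bic(pcs), gm.weights_.round(3))
```

With real height data, the fitted component means in PC space would then be mapped back to physical space to display the preferred circulation structures.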
Abstract:
We investigate the impact of past climates on plant diversification by tracking the "footprint" of climate change on a phylogenetic tree. Diversity within the cosmopolitan carnivorous plant genus Drosera (Droseraceae) is focused within Mediterranean climate regions. We explore whether this diversity is temporally linked to Mediterranean-type climatic shifts of the mid-Miocene and whether climate preferences are conservative over phylogenetic timescales. Phyloclimatic modeling combines environmental niche (bioclimatic) modeling with phylogenetics in order to study evolutionary patterns in relation to climate change. We present the largest and most complete such example to date, using Drosera. The bioclimatic models of extant species demonstrate clear phylogenetic patterns; this is particularly evident for the tuberous sundews from southwestern Australia (subgenus Ergaleium). We employ a method for establishing confidence intervals of node ages on a phylogeny using replicates from a Bayesian phylogenetic analysis. This chronogram shows that many clades, including subgenus Ergaleium and section Bryastrum, diversified during the establishment of the Mediterranean-type climate. Ancestral reconstructions of bioclimatic models demonstrate a pattern of preference for this climate type within these groups. Ancestral bioclimatic models are projected into palaeoclimate reconstructions for the time periods indicated by the chronogram. We present two such examples that each generate plausible estimates of ancestral lineage distribution, which are similar to their current distributions. This is the first study to attempt bioclimatic projections on evolutionary time scales. The sundews appear to have diversified in response to local climate development; some groups are specialized for Mediterranean climates, while others show wide-ranging generalism. This demonstrates that phyloclimatic modeling could be repeated for other plant groups and is fundamental to the understanding of evolutionary responses to climate change.
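As a loose illustration of the bioclimatic-modeling ingredient only, the sketch below fits a simple climate envelope for a single species and "projects" it onto another climate grid. The logistic-regression envelope, the two climate variables, and all data are hypothetical stand-ins for the niche models actually used in such studies.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical presence/absence records for one species against two
# bioclimatic variables (e.g. annual mean temperature, summer rainfall).
rng = np.random.default_rng(1)
climate = rng.normal(size=(300, 2))
presence = (climate[:, 0] + 0.5 * climate[:, 1]
            + rng.normal(0, 0.5, 300)) > 0

# Fit a simple climate envelope: occurrence probability as a function of climate.
envelope = LogisticRegression().fit(climate, presence)

# "Projection": evaluate the fitted envelope on another climate grid,
# standing in for a palaeoclimate reconstruction.
palaeo_climate = rng.normal(loc=0.3, size=(100, 2))
suitability = envelope.predict_proba(palaeo_climate)[:, 1]
print(suitability.mean())
```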
Abstract:
Polycondensation of 2,6-dihydroxynaphthalene with 4,4'-bis(4"-fluorobenzoyl)biphenyl affords a novel, semicrystalline poly(ether ketone) with a melting point of 406 °C and a glass transition temperature (onset) of 168 °C. Molecular modeling and diffraction-simulation studies of this polymer, coupled with data from the single-crystal structure of an oligomer model, have enabled the crystal and molecular structure of the polymer to be determined from X-ray powder data. This structure, the first for any naphthalene-containing poly(ether ketone), is fully ordered, in the monoclinic space group P2(1)/b, with two chains per unit cell. Rietveld refinement against the experimental powder data gave a final agreement factor (R-wp) of 6.7%.
Abstract:
McDaniel, Robinson-Riegler, and Einstein (1998) recently reported findings in support of the proposal that prospective remembering is largely conceptually driven. In each of the three experiments they reported, however, the task in which the prospective memory target was encountered at test had a predominantly conceptual focus, thereby potentially facilitating retrieval of conceptually encoded features of the studied target event. We report two experiments in which we manipulated the dimension (perceptual or conceptual) along which a target event varied between study and test while using a processing task, at both study and test, compatible with the relevant dimension of target change. When the target was encountered in a sentence validity task at study and test, and the semantic context in which a target was encountered was changed between these two occasions, prospective remembering declined (Experiment 1). A similar decline occurred, using a readability rating task, when the perceptual context (font in which the word was printed) was altered (Experiment 2). These results indicate that both perceptual and conceptual processes can support prospective remembering.
Abstract:
This paper examines the interaction of spatial and dynamic aspects of resource extraction from forests by local people. Highly cyclical and varied across space and time, the patterns of resource extraction resulting from the spatial–temporal model bear little resemblance to the patterns drawn from focusing either on spatial or temporal aspects of extraction alone. Ignoring this variability inaccurately depicts villagers’ dependence on different parts of the forest and could result in inappropriate policies. Similarly, the spatial links in extraction decisions imply that policies imposed in one area can have unintended consequences in other areas. Combining the spatial–temporal model with a measure of success in community forest management—the ability to avoid open-access resource degradation—characterizes the impact of incomplete property rights on patterns of resource extraction and stocks.
Abstract:
This paper explores the place of aims in the early years foundation stage outdoor environment in England. Through examining the writing of academics, various themes are identified and constructed into possible aims. These themes/aims are compared to an empirical study of early years teachers' attitudes. Data were collected by questionnaire from schools within the University of Reading partnership. There was general agreement between experts and teachers as to the aims. While some respondents were able to explain what the aims of outdoor activity were, a significant number were unable to identify aims; further, a significant number did not distinguish between approach/practice and aims. A lack of understanding and agreement as to what the aims are may indicate that teachers are unsure about the purpose of outdoor education for young children. An outcome of this study is a call to agree and make explicit the aims of outdoor education in the early years.
Abstract:
Novel imaging techniques are playing an increasingly important role in drug development, providing insight into the mechanism of action of new chemical entities. The data sets obtained by these methods can be large, with complex inter-relationships, but the most appropriate statistical analysis for handling these data is often uncertain, precisely because of the exploratory nature of the way the data are collected. We present an example from a clinical trial using magnetic resonance imaging to assess changes in atherosclerotic plaques following treatment with a tool compound with established clinical benefit. We compared two specific approaches to handling the correlations due to physical location and repeated measurements: two-level and four-level multilevel models. The two methods identified similar structural variables, but the higher-level multilevel models had the advantage of explaining a greater proportion of the variation, and their modeling assumptions appeared to be better satisfied.
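A minimal sketch of this kind of model comparison, assuming measurements nested within vessel locations within patients. The column names, the simulated data, and the use of statsmodels are hypothetical, and the variance-component formulation below is only an approximation to the four-level models compared in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical plaque data: wall-thickness measurements at several vessel
# locations per patient, repeated over visits.
rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "patient": rng.integers(0, 40, n),
    "location": rng.integers(0, 4, n),
    "visit": rng.integers(0, 3, n),
    "treated": rng.integers(0, 2, n),
})
df["thickness"] = 1.0 - 0.1 * df["treated"] + rng.normal(0, 0.2, n)

# Two-level model: random intercepts for patients only.
m2 = smf.mixedlm("thickness ~ treated + visit", df, groups="patient").fit()

# A richer model: add a variance component for locations within patients
# (an approximation to the higher-level models in the paper).
m4 = smf.mixedlm("thickness ~ treated + visit", df, groups="patient",
                 vc_formula={"location": "0 + C(location)"}).fit()
print(m2.params["treated"], m4.params["treated"])
```

The comparison of interest is whether the richer random-effects structure absorbs more of the residual variation while leaving the treatment estimate stable.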
Abstract:
It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where blood oxygen-level-dependent signals are recorded, the understanding and accurate modeling of the hemodynamic relationship between CBF and CBV becomes increasingly important. This study presents an empirical, data-based modeling framework for model identification from experimental CBF and CBV data. It is shown that the relationship between the changes in CBF and CBV can be described using a parsimonious autoregressive with exogenous input (ARX) model structure. It is observed that neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method can produce accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is thus introduced and extended to solve this errors-in-variables problem. Quantitative results show that the RTLS method works very well on the noisy CBF and CBV data. Finally, combining RTLS with a filtering method leads to a parsimonious but very effective model that characterizes the relationship between the changes in CBF and CBV.
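A minimal sketch of the estimation problem: a first-order ARX relation fitted by ordinary least squares, classical TLS, and a simple Tikhonov-regularized TLS stand-in. The paper's actual RTLS scheme and its filtering step are not reproduced here, and all signals and parameter values are synthetic.

```python
import numpy as np

# Synthetic stand-ins for noisy, normalized CBF (input) and CBV (output).
rng = np.random.default_rng(3)
T = 500
cbf = np.cumsum(rng.normal(0, 0.1, T))
cbv = np.zeros(T)
for t in range(1, T):  # first-order ARX "truth"
    cbv[t] = 0.8 * cbv[t - 1] + 0.15 * cbf[t - 1]
cbf_obs = cbf + rng.normal(0, 0.05, T)   # noise on the input...
cbv_obs = cbv + rng.normal(0, 0.05, T)   # ...and on the output

# ARX(1, 1) regression built from the noisy data.
A = np.column_stack([cbv_obs[:-1], cbf_obs[:-1]])
b = cbv_obs[1:]

# Ordinary least squares (biased here, since A itself is noisy).
x_ls = np.linalg.lstsq(A, b, rcond=None)[0]

# Classical TLS via the SVD of the augmented matrix [A b].
v = np.linalg.svd(np.column_stack([A, b]))[2][-1]
x_tls = -v[:-1] / v[-1]

# Tikhonov-regularized TLS stand-in: the TLS normal equations
# (A'A - s_min^2 I) x = A'b, shrunk by an illustrative ridge term lam.
s_min2 = np.linalg.svd(np.column_stack([A, b]), compute_uv=False)[-1] ** 2
lam = 0.1
x_rtls = np.linalg.solve(A.T @ A - s_min2 * np.eye(2) + lam * np.eye(2), A.T @ b)
print(x_ls, x_tls, x_rtls)
```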
Abstract:
Stochastic methods are a crucial area in contemporary climate research and are increasingly being used in comprehensive weather and climate prediction models as well as reduced order climate models. Stochastic methods are used as subgrid-scale parameterizations (SSPs) as well as for model error representation, uncertainty quantification, data assimilation, and ensemble prediction. The need to use stochastic approaches in weather and climate models arises because we still cannot resolve all necessary processes and scales in comprehensive numerical weather and climate prediction models. In many practical applications one is mainly interested in the largest and potentially predictable scales and not necessarily in the small and fast scales. For instance, reduced order models can simulate and predict large-scale modes. Statistical mechanics and dynamical systems theory suggest that in reduced order models the impact of unresolved degrees of freedom can be represented by suitable combinations of deterministic and stochastic components and non-Markovian (memory) terms. Stochastic approaches in numerical weather and climate prediction models also lead to the reduction of model biases. Hence, there is a clear need for systematic stochastic approaches in weather and climate modeling. In this review, we present evidence for stochastic effects in laboratory experiments. Then we provide an overview of stochastic climate theory from an applied mathematics perspective. We also survey the current use of stochastic methods in comprehensive weather and climate prediction models and show that stochastic parameterizations have the potential to remedy many of the current biases in these comprehensive models.
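A minimal sketch of the basic idea, assuming a single resolved variable whose unresolved fast scales are replaced by a red-noise (Ornstein-Uhlenbeck) term integrated with Euler-Maruyama. The drift and all parameter values are illustrative, not taken from any specific model in the review.

```python
import numpy as np

# Euler-Maruyama integration of a toy reduced-order model: one resolved slow
# variable x whose unresolved fast scales are represented by a red-noise
# (Ornstein-Uhlenbeck) subgrid term eta. All parameters are illustrative.
rng = np.random.default_rng(4)
dt, n_steps = 0.01, 20000
a, gamma, sigma = 1.0, 0.5, 0.3   # drift strength, memory decay, noise level

x = np.zeros(n_steps)
eta = 0.0
for t in range(1, n_steps):
    drift = a * x[t - 1] * (1.0 - x[t - 1] ** 2)  # deterministic double-well core
    x[t] = x[t - 1] + (drift + eta) * dt
    # OU update: the finite decorrelation time of eta carries memory effects
    eta += -gamma * eta * dt + sigma * np.sqrt(dt) * rng.standard_normal()

print(x.mean(), x.std())
```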
Abstract:
Key Performance Indicators (KPIs) are the main instruments of Business Performance Management. KPIs are measures that link strategy to business processes. These measures are often designed for an industry sector with assumptions about the business processes in organizations. However, these assumptions can be too incomplete to guarantee the required properties of KPIs. This raises the need to validate the properties of KPIs prior to their application in performance measurement. This paper applies the method called EXecutable Requirements Engineering Management and Evolution (EXTREME) to the validation of KPI definitions. EXTREME semantically relates the goal modeling, conceptual modeling and protocol modeling techniques into one methodology. The synchronous composition built into protocol modeling enables traceability of goals in protocol models and constructive definitions of a KPI. The application of the method clarifies the meaning of KPI properties and the procedures for their assessment and validation.
Abstract:
In this paper, I seek to undermine G.A. Cohen’s polemical use of a metaethical claim he makes in his article, ‘Facts and Principles’, by arguing that that use requires an unsustainable equivocation between epistemic and logical grounding. I begin by distinguishing three theses that Cohen has offered during the course of his critique of Rawls and contractualism more generally, the foundationalism about grounding thesis, the justice as non-regulative thesis, and the justice as all-encompassing thesis, and briefly argue that they are analytically independent of each other. I then offer an outline of the foundationalism about grounding thesis, characterising it, as Cohen does, as a demand of logic. That thesis claims that whenever a normative principle is dependent on a fact, it is so dependent in virtue of some other principle. I then argue that although this is true as a matter of logic, it, as Cohen admits, cannot be true of actual justifications, since logic cannot tell us anything about the truth as opposed to the validity of arguments. Facts about a justification cannot then be decisive for whether or not a given argument violates the foundationalism about grounding thesis. As long as, independently of actual justifications, theorists can point to plausible logically grounding principles, as I argue contractualists can, Cohen’s thesis lacks critical bite.
Abstract:
This edited collection brings together international experts from the vibrant and growing field of geographies of children, youth and families. The book provides an overview of current conceptual and theoretical debates, and gives a wide range of examples of cutting-edge research from a variety of national contexts across the globe. The theme of 'disentangling the socio-spatial contexts of young people and/or their families' advances debates in geographies and social studies of young people and families by emphasising the context of young people's social agency. The book is designed to provide an introduction to the topic of geographies of children, youth and families and is an invaluable course text for undergraduate and postgraduate students of geography and the social sciences. This interdisciplinary text is also of likely interest to students and practitioners of education, youth work, social policy and social work.
Abstract:
The prediction of climate variability and change requires the use of a range of simulation models. Multiple climate model simulations are needed to sample the inherent uncertainties in seasonal to centennial prediction. Because climate models are computationally expensive, there is a trade-off between complexity, spatial resolution, simulation length, and ensemble size. The methods used to assess climate impacts are examined in the context of this trade-off. An emphasis on complexity allows simulation of coupled mechanisms, such as the carbon cycle and feedbacks between agricultural land management and climate. In addition to improving skill, greater spatial resolution increases relevance to regional planning. Greater ensemble size improves the sampling of probabilities. Research from major international projects is used to show the importance of synergistic research efforts. The primary climate impact examined is crop yield, although many of the issues discussed are relevant to hydrology and health modeling. Methods used to bridge the scale gap between climate and crop models are reviewed. Recent advances include large-area crop modeling, quantification of uncertainty in crop yield, and fully integrated crop–climate modeling. The implications of trends in computer power, including supercomputers, are also discussed.