900 results for ETL Conceptual and Logical Modeling
Abstract:
Improved udder health requires consistent application of appropriate management practices by those involved in managing dairy herds and the milking process. Designing effective communication requires that we understand why dairy herd managers behave in the way they do and also how the means of communication can be used both to inform and to influence. Social sciences, ranging from economics to anthropology, have been used to shed light on the behaviour of those who manage farm animals. Communication science tells us that influencing behaviour is not simply a question of 'getting the message across' but of addressing the complex of factors that influence an individual's behavioural decisions. A review of recent studies in the animal health literature shows that different social science frameworks and methodologies offer complementary insights into livestock managers' behaviour, but that the diversity of conceptual and methodological frameworks presents a challenge for animal health practitioners and policy makers who seek to make sense of the findings, and for researchers looking for helpful starting points. Data from a recent study in England illustrate the potential of 'home-made' conceptual frameworks to help unravel the complexity of farmer behaviour. At the same time, though, the data indicate the difficulties facing those designing communication strategies in a context where farmers believe strongly that they are already doing all they can reasonably be expected to do to minimise animal health risks.
Abstract:
We know surprisingly little about whether the content of European Union legislation reflects the preferences of some Member States more than others. The few studies that have examined national bargaining success rates for EU legislation have conceptual and methodological weaknesses. To redress these problems I use a salience-weighted measure to gauge the relative success of Member States in translating their national preferences into legislation, and test two plausible, competing hypotheses about how the EU works: that no state consistently achieves more of what it really wants than any other, and that large Member States tend to beat small ones. Neither hypothesis receives empirical support. Not only do states differ far more significantly in their respective levels of bargaining success than previously recognised, but some of the smaller states are the ones that do especially well. The paper's main contribution, demonstrating that the EU does not work as most people think it does, sets the stage for new research questions, both positive and normative. In the last section I make a tentative start on answering two of the most important: which factors explain the surprising empirical results, and whether differential national bargaining success might undermine the legitimacy of the integration process.
Abstract:
The necessity and benefits for establishing the international Earth-system Prediction Initiative (EPI) are discussed by scientists associated with the World Meteorological Organization (WMO) World Weather Research Programme (WWRP), World Climate Research Programme (WCRP), International Geosphere–Biosphere Programme (IGBP), Global Climate Observing System (GCOS), and natural-hazards and socioeconomic communities. The proposed initiative will provide research and services to accelerate advances in weather, climate, and Earth system prediction and the use of this information by global societies. It will build upon the WMO, the Group on Earth Observations (GEO), the Global Earth Observation System of Systems (GEOSS) and the International Council for Science (ICSU) to coordinate the effort across the weather, climate, Earth system, natural-hazards, and socioeconomic disciplines. It will require (i) advanced high-performance computing facilities, supporting a worldwide network of research and operational modeling centers, and early warning systems; (ii) science, technology, and education projects to enhance knowledge, awareness, and utilization of weather, climate, environmental, and socioeconomic information; (iii) investments in maintaining existing and developing new observational capabilities; and (iv) infrastructure to transition achievements into operational products and services.
Abstract:
The Metafor project has developed a common information model (CIM) using the ISO19100 series formalism to describe numerical experiments carried out by the Earth system modelling community, the models they use, and the simulations that result. Here we describe the mechanism by which the CIM was developed, and its key properties. We introduce the conceptual and application versions and the controlled vocabularies developed in the context of supporting the fifth Coupled Model Intercomparison Project (CMIP5). We describe how the CIM has been used in experiments to describe model coupling properties and describe the near-term expected evolution of the CIM.
Abstract:
We discuss the challenge to truth-conditional semantics presented by apparent shifts in extension of predicates such as ‘red’. We propose an explicit indexical semantics for ‘red’ and argue that our account is preferable to the alternatives on conceptual and empirical grounds.
Abstract:
An overview is provided of the current understanding of transport in the middle atmosphere. Over the past quarter century this subject has evolved from a basic recognition of the Brewer-Dobson circulation to a detailed appreciation of many key features of transport such as the stratospheric surf zone, mixing barriers and the dynamics of filamentation. Whilst the elegant theoretical framework for middle atmosphere transport that emerged roughly twenty years ago never fulfilled its promise, useful phenomenological models have been developed together with innovative diagnostic methods. These advances were made possible by the advent of abundant satellite and aircraft observations of long-lived chemical species together with developments in data assimilation and numerical modeling, and have been driven in large measure by the problem of stratospheric ozone depletion. This review is primarily focused on the stratosphere, where both the interest and the knowledge are the greatest, but a few remarks are also made on the mesosphere.
Abstract:
Older people increasingly want to remain living independently in their own homes. The aim of the ENABLE project is to develop a wearable device that can be used to support older people in their daily lives and which can monitor their health status, detect potential problems, provide activity reminders and offer communication and alarm services. In order to determine the specifications and functionality required for the development of the device, user surveys and focus groups were undertaken, and use-case analysis and scenario modeling were carried out. The project has resulted in the development of a wrist-worn device and mobile phone combination that can support and assist older and vulnerable wearers with a range of activities and services both inside their home and as they move around their local environment. The device is currently undergoing pilot trials in five European countries. The aim of this paper is to describe the ENABLE device, its features and services, and the infrastructure within which it operates.
Abstract:
We consider forecasting with factors, variables and both, modeling in-sample using Autometrics so all principal components and variables can be included jointly, while tackling multiple breaks by impulse-indicator saturation. A forecast-error taxonomy for factor models highlights the impacts of location shifts on forecast-error biases. Forecasting US GDP over 1-, 4- and 8-step horizons using the dataset from Stock and Watson (2009) updated to 2011:2 shows factor models are more useful for nowcasting or short-term forecasting, but their relative performance declines as the forecast horizon increases. Forecasts for GDP levels highlight the need for robust strategies, such as intercept corrections or differencing, when location shifts occur as in the recent financial crisis.
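The impulse-indicator saturation mentioned in the abstract adds a dummy variable for each observation to detect outliers and location shifts. The following is a minimal split-half sketch of the idea in NumPy; the function names, critical value, and simulated data are illustrative assumptions, not the Autometrics implementation used by the authors.

```python
import numpy as np

def ols_t(X, y):
    """OLS coefficients and their t-statistics."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = X.shape[0] - X.shape[1]
    sigma2 = resid @ resid / dof
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return beta, beta / se

def iis_split_half(X, y, tcrit=2.5):
    """Split-half impulse-indicator saturation: saturate each half of the
    sample with one dummy per observation, retain dummies whose |t| exceeds
    tcrit, and return the retained (outlier / break) indices."""
    n, p = X.shape
    retained = []
    for idx in (np.arange(0, n // 2), np.arange(n // 2, n)):
        D = np.zeros((n, idx.size))
        D[idx, np.arange(idx.size)] = 1.0  # impulse dummy per observation
        _, t = ols_t(np.hstack([X, D]), y)
        retained.extend(int(i) for i in idx[np.abs(t[p:]) > tcrit])
    return sorted(retained)

# Illustrative use: a regression with one large one-off shift at t = 70.
rng = np.random.default_rng(42)
n = 100
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + rng.standard_normal(n)
y[70] += 10.0  # a location shift / outlier
print(iis_split_half(X, y))  # index 70 should be among the retained dummies
```

Saturating one half at a time keeps the regressor matrix full rank: the non-dummy coefficients are effectively estimated from the unsaturated half, so each dummy's t-statistic tests whether its observation is an outlier relative to that fit.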
Abstract:
In the present research, a 3 × 2 model of achievement goals is proposed and tested. The model is rooted in the definition and valence components of competence, and encompasses 6 goal constructs: task-approach, task-avoidance, self-approach, self-avoidance, other-approach, and other-avoidance. The results from 2 studies provided strong support for the proposed model, most notably the need to separate task-based and self-based goals. Studies 1 and 2 yielded data establishing the 3 × 2 structure of achievement goals, and Study 2 documented the antecedents and consequences of each of the goals in the 3 × 2 model. Terminological, conceptual, and applied issues pertaining to the 3 × 2 model are discussed.
Abstract:
A one-dimensional, thermodynamic, and radiative model of a melt pond on sea ice is presented that explicitly treats the melt pond as an extra phase. A two-stream radiation model, which allows albedo to be determined from bulk optical properties, and a parameterization of the summertime evolution of optical properties are used. Heat transport within the sea ice is described using a mushy-layer equation for a binary alloy (salt water). The model is tested by comparing numerical simulations with SHEBA data and previous modeling. The presence of melt ponds on the sea ice surface is demonstrated to have a significant effect on the heat and mass balance. Sensitivity tests indicate that the maximum melt pond depth is highly sensitive to optical parameters and drainage. INDEX TERMS: 4207 Oceanography: General: Arctic and Antarctic oceanography; 4255 Oceanography: General: Numerical modeling; 4299 Oceanography: General: General or miscellaneous. KEYWORDS: sea ice, melt pond, albedo, Arctic Ocean, radiation model, thermodynamic
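The link the abstract draws between bulk optical properties and albedo can be illustrated with the standard two-stream reflectance of a semi-infinite homogeneous layer. This is a generic textbook sketch, not the paper's melt-pond model, and the parameter values below are hypothetical.

```python
import numpy as np

def two_stream_albedo(omega, b):
    """Albedo of a semi-infinite homogeneous layer in the two-stream
    approximation, given single-scattering albedo `omega` and
    backscatter fraction `b` (diffuse incidence)."""
    gamma1 = 1.0 - omega * (1.0 - b)    # attenuation of the downward stream
    gamma2 = omega * b                  # coupling into the upward stream
    k = np.sqrt(gamma1**2 - gamma2**2)  # two-stream eigenvalue
    return gamma2 / (gamma1 + k)

# Hypothetical values: strongly scattering ice vs. a more absorbing pond.
print(two_stream_albedo(0.9999, 0.5))  # bright, ice-like surface
print(two_stream_albedo(0.90, 0.5))    # darker, pond-like surface
```

The formula shows why albedo is so sensitive to absorption: as `omega` approaches 1, the eigenvalue `k` goes to zero and the albedo climbs steeply toward 1, so small changes in optical properties (as during summertime pond evolution) produce large changes in reflected shortwave flux.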
Abstract:
Fire is an important component of the Earth System that is tightly coupled with climate, vegetation, biogeochemical cycles, and human activities. Observations of how fire regimes change on seasonal to millennial timescales are providing an improved understanding of the hierarchy of controls on fire regimes. Climate is the principal control on fire regimes, although human activities have had an increasing influence on the distribution and incidence of fire in recent centuries. Understanding of the controls and variability of fire also underpins the development of models, both conceptual and numerical, that allow us to predict how future climate and land-use changes might influence fire regimes. Although fires in fire-adapted ecosystems can be important for biodiversity and ecosystem function, positive effects are being increasingly outweighed by losses of ecosystem services. As humans encroach further into the natural habitat of fire, social and economic costs are also escalating. The prospect of near-term rapid and large climate changes, and the escalating costs of large wildfires, necessitates a radical re-thinking and the development of approaches to fire management that promote the more harmonious co-existence of fire and people.
Abstract:
Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When regression problems involve covariates of more complex form, such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model reduces the model's dimensionality to a manageable level, leading to efficient estimation. Most existing tensor-based methods estimate each individual regression problem independently, based on tensor decompositions that allow simultaneous projections of an input tensor onto more than one direction along each mode. In practice, multi-dimensional data are collected under the same or very similar conditions, so that the data share some common latent components while each regression task can also have its own independent parameters. It is therefore beneficial to analyse regression parameters across all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker decomposition, which simultaneously identifies both the components of the parameters common to all regression tasks and the independent factors contributing to each particular task. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modeling further reduce the total number of parameters, with lower memory cost than tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
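The Tucker decomposition underlying the proposed regression model can be sketched via higher-order SVD (HOSVD) in NumPy. This illustrates only the decomposition itself under assumed ranks, not the authors' linked, sparsity-regularised regression; all function names are illustrative.

```python
import numpy as np

def unfold(T, mode):
    """Mode-`mode` unfolding of a tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, np.moveaxis(T, mode, 0), axes=1), 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: factor matrices from the leading left singular
    vectors of each unfolding; core tensor by projecting onto them."""
    U = [np.linalg.svd(unfold(T, m), full_matrices=False)[0][:, :r]
         for m, r in enumerate(ranks)]
    G = T
    for m, Um in enumerate(U):
        G = mode_multiply(G, Um.T, m)
    return G, U

def tucker_reconstruct(G, U):
    """Rebuild the full tensor from core G and factor matrices U."""
    T = G
    for m, Um in enumerate(U):
        T = mode_multiply(T, Um, m)
    return T

# A tensor with exact multilinear rank (2, 2, 2) is recovered exactly.
rng = np.random.default_rng(0)
G0 = rng.standard_normal((2, 2, 2))
U0 = [rng.standard_normal((d, 2)) for d in (5, 6, 7)]
T = tucker_reconstruct(G0, U0)
G, U = hosvd(T, (2, 2, 2))
print(np.allclose(T, tucker_reconstruct(G, U)))  # True
```

In the regression setting described by the abstract, factor matrices along some modes would be shared across tasks while others stay task-specific; the sketch above shows only the multilinear projection machinery that makes such sharing possible.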
Abstract:
Adult or somatic stem cells are tissue-resident cells with the ability to proliferate, exhibit self-maintenance, and generate new cells with the principal phenotypes of the tissue in response to injury or disease. Due to their easy accessibility and their potential use in regenerative medicine, adult stem cells raise hopes for future personalisable therapies. After infection or during injury, they are exposed to a broad range of pathogen- or damage-associated molecules leading to changes in their proliferation, migration and differentiation. The sensing of such damage and infection signals is mostly achieved by Toll-Like Receptors (TLRs), with Toll-like receptor 4 being responsible for recognition of bacterial lipopolysaccharides (LPS) and endogenous danger-associated molecular patterns (DAMPs). In this review, we examine the current state of knowledge on TLR4-mediated signalling in different adult stem cell populations. Specifically, we elaborate on the role of TLR4 and its ligands in the proliferation, differentiation and migration of mesenchymal stem cells, hematopoietic stem cells and neural stem cells. Finally, we discuss conceptual and technical pitfalls in the investigation of TLR4 signalling in stem cells.
Abstract:
Debate about the definition of “small state” has produced more fragmentation than consensus, even as the literature has demonstrated its subjects’ roles in joining international organizations, propagating norms, executing creative diplomacy, influencing allies, avoiding and joining conflicts, and building peace. However, work on small states has struggled to identify commonalities in these states’ international relations, to cumulate knowledge, or to impact broader IR theory. This paper advocates a changed conceptual and definitional framework. Analysis of “small states” should pivot to examine the dynamics of the asymmetrical relationships in which these states are engaged. Instead of seeking an overall metric for size as the relevant variable, which falls victim in a different way to Dahl’s “lump-of-power fallacy,” we can recognize the multifaceted, variegated nature of power, whether in war or peacetime.
Abstract:
Techniques devoted to generating triangular meshes from intensity images either take as input a segmented image or generate a mesh without distinguishing individual structures contained in the image. These facts may cause difficulties in using such techniques in some applications, such as numerical simulations. In this work we reformulate a previously developed technique for mesh generation from intensity images called Imesh. This reformulation makes Imesh more versatile due to a unified framework that allows an easy change of refinement metric, rendering it effective for constructing meshes for applications with varied requirements, such as numerical simulation and image modeling. Furthermore, a deeper study of the point-insertion problem and the development of a geometric criterion for segmentation are also reported in this paper. Meshes with theoretical guarantees of quality can also be obtained for each individual image structure as a post-processing step, a characteristic not usually found in other methods. The tests demonstrate the flexibility and the effectiveness of the approach.