861 results for Multi Domain Information Model


Relevance:

100.00%

Publisher:

Abstract:

Recent coordinated observations of interplanetary scintillation (IPS) from EISCAT, MERLIN, and STELab, together with stereoscopic white-light imaging from the two heliospheric imagers (HIs) onboard the twin STEREO spacecraft, make it possible to continuously track the propagation and evolution of solar eruptions throughout interplanetary space. To better understand the observational signatures of these two remote-sensing techniques, the magnetohydrodynamics of the macro-scale interplanetary disturbance and the radio-wave scattering of the micro-scale electron-density fluctuations are coupled and investigated using a newly constructed multi-scale numerical model. This model is then applied to a case of interplanetary shock propagation within the ecliptic plane. The shock can be nearly invisible to an HI once it enters the Thomson-scattering sphere of that HI. The asymmetry in the optical images between the western and eastern HIs suggests shock propagation off the Sun–Earth line. Meanwhile, an IPS signal, which depends strongly on the local electron density, is insensitive to the density cavity far downstream of the shock front. When this cavity (or the shock nose) is cut through by an IPS ray-path, a single speed component at the flank (or the nose) of the shock can be recorded; when an IPS ray-path penetrates the sheath between the shock nose and this cavity, two speed components, at the sheath and at the flank, can be detected. Moreover, once a shock front touches an IPS ray-path, the derived position and speed at the irregularity source of the IPS signal, together with the assumption of radial propagation at constant speed, can be used to estimate the later appearance of the shock front in the elongation of the HI field of view. The results of synthetic measurements from forward modelling are helpful in inferring the in-situ properties of coronal mass ejections from real observational data via an inverse approach.
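The elongation estimate at the end of this abstract can be sketched with simple plane geometry. The snippet below is a minimal illustration, not the paper's model: assuming radial propagation at constant speed, it converts a shock's heliocentric distance and direction into the elongation angle seen by an observer; all function names and the numerical values are invented for illustration.

```python
import math

def elongation_deg(r_au, beta_deg, d_obs_au=1.0):
    """Elongation angle (Sun-observer-target) for a point at heliocentric
    distance r_au, whose radial direction makes angle beta_deg with the
    observer-Sun line, seen by an observer d_obs_au from the Sun."""
    beta = math.radians(beta_deg)
    # Put the Sun at the origin and the observer on the +x axis.
    x, y = r_au * math.cos(beta), r_au * math.sin(beta)
    return math.degrees(math.atan2(y, d_obs_au - x))

def shock_arrival_elongation(r0_au, v_kms, hours, beta_deg):
    """Assume radial, constant-speed propagation (as in the abstract) and
    return the elongation after a given travel time."""
    au_km = 1.495978707e8
    r = r0_au + v_kms * hours * 3600.0 / au_km
    return elongation_deg(r, beta_deg)

# Illustrative numbers: a shock nose first located at 0.3 AU, 60 deg off the
# observer-Sun line, travelling at 600 km/s; its elongation 24 hours later.
eps = shock_arrival_elongation(0.3, 600.0, 24.0, 60.0)
```

The same geometry run in reverse is the inverse approach the abstract mentions: an observed elongation-time track constrains the assumed radial speed and direction.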

Relevance:

100.00%

Publisher:

Abstract:

Much consideration is rightly given to the design of metadata models to describe data. At the other end of the data-delivery spectrum, much thought has also been given to the design of geospatial delivery interfaces such as the Open Geospatial Consortium standards: Web Coverage Service (WCS), Web Map Service (WMS) and Web Feature Service (WFS). Our recent experience with the Climate Science Modelling Language shows that an implementation gap exists where many challenges remain unsolved. Bridging this gap requires transposing information and data from one world view of geospatial climate data to another. The issues include the loss of information in mapping to a common information model, the need to create ‘views’ onto file-based storage, and the need to map onto an appropriate delivery interface (as with the choice between WFS and WCS for feature types with coverage-valued properties). Here we summarise the approaches we have taken in facing up to these problems.

Relevance:

100.00%

Publisher:

Abstract:

Opportunistic land encroachment occurs in many low-income countries, gradually yet pervasively, until discrete areas of common land disappear. This paper, motivated by field observations in Karnataka, India, demonstrates that such an evolution of property rights from common to private may be efficient when the boundaries between common and private land are poorly defined, or "fuzzy." Using a multi-period optimization model, and introducing the concept of stock and flow enforcement, I show how the effectiveness of enforcement effort, the reversibility of encroachment, and punitive fines influence whether an area of common land is fully defined and protected, or gradually or rapidly encroached.
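The stock/flow distinction described above can be illustrated with a deliberately crude toy simulation. This is not the paper's optimization model: the decision rule, parameters and numbers below are all invented to show the three qualitative outcomes (deterred, partially protected, fully encroached).

```python
# Illustrative toy (not the paper's model): a common of area 1 is encroached
# by a flow each period while encroaching pays; enforcement claws back a
# fraction of the accumulated stock only when encroachment is reversible.
def simulate(periods, flow, stock_enforcement, fine, benefit, reversible=True):
    """Return the encroached share of the common after `periods` periods."""
    stock = 0.0
    for _ in range(periods):
        if benefit > fine:                      # expected payoff of encroaching
            stock = min(1.0, stock + flow)      # gradual, pervasive encroachment
        if reversible:
            stock *= (1.0 - stock_enforcement)  # stock enforcement claws back
    return stock

deterred = simulate(50, 0.05, 0.0, fine=2.0, benefit=1.0)               # fine deters entirely
protected = simulate(50, 0.05, 0.2, fine=0.5, benefit=1.0)              # enforcement caps the stock
lost = simulate(50, 0.05, 0.0, fine=0.5, benefit=1.0, reversible=False) # common fully encroached
```

With reversible encroachment and steady stock enforcement the encroached share settles at an interior level; with irreversibility and no deterrent fine the common is lost outright.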

Relevance:

100.00%

Publisher:

Abstract:

Recently major processor manufacturers have announced a dramatic shift in their paradigm to increase computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift for the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high-performance computing systems, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism. The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared-memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed-memory systems. This approach is based on a hierarchical communication topology to solve issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVMs), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labelling sequential data. The last contribution focuses on very efficient feature selection: it describes a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
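The data-parallel pattern behind the first contribution's problem, distributed association rule mining, can be sketched in a few lines. This is a generic illustration, not any workshop paper's algorithm: transactions are partitioned, candidate-itemset support is counted locally in each partition, and only the partial counts (not the raw data) are merged.

```python
# Count-then-merge sketch of distributed itemset-support counting.
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

def local_counts(partition, k=2):
    """Support counts of all k-itemsets within one data partition."""
    counts = Counter()
    for transaction in partition:
        counts.update(combinations(sorted(set(transaction)), k))
    return counts

def parallel_support(partitions, k=2):
    """Run the local counting concurrently and merge the partial counts."""
    with ThreadPoolExecutor() as pool:
        partials = pool.map(lambda p: local_counts(p, k), partitions)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

transactions = [["a", "b", "c"], ["a", "b"], ["b", "c"], ["a", "c"], ["a", "b", "c"]]
support = parallel_support([transactions[:3], transactions[3:]])
# ("a", "b") occurs in 3 of the 5 transactions
```

The same merge step is where communication efficiency matters in the distributed setting: each node ships one `Counter`, whose size depends on the number of candidates rather than on the number of transactions.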

Relevance:

100.00%

Publisher:

Abstract:

Climate modelling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce this data. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models as it will provide an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team’s development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change’s (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
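The pairing of a generic XML structure with a controlled vocabulary can be illustrated as follows. The element names and the vocabulary below are invented for illustration and are not the actual METAFOR CIM schema.

```python
# Hypothetical sketch: a generic "activity produces data" record serialised
# as XML, with one attribute restricted by a controlled vocabulary.
import xml.etree.ElementTree as ET

MODEL_TYPES = {"atmosphere", "ocean", "coupled"}  # invented controlled vocabulary

def make_record(model_name, model_type, experiment):
    """Build a metadata record, rejecting values outside the vocabulary."""
    if model_type not in MODEL_TYPES:
        raise ValueError(f"{model_type!r} not in controlled vocabulary")
    record = ET.Element("simulationRecord")
    ET.SubElement(record, "model", name=model_name, type=model_type)
    ET.SubElement(record, "experiment").text = experiment
    return ET.tostring(record, encoding="unicode")

xml_doc = make_record("HadGEM2", "coupled", "CMIP5 historical")
```

The generic schema stays stable across modelling groups; tightening or swapping the vocabulary is what restricts the range of valid instances.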

Relevance:

100.00%

Publisher:

Abstract:

The Metafor project has developed a common information model (CIM) using the ISO19100 series formalism to describe numerical experiments carried out by the Earth system modelling community, the models they use, and the simulations that result. Here we describe the mechanism by which the CIM was developed, and its key properties. We introduce the conceptual and application versions and the controlled vocabularies developed in the context of supporting the fifth Coupled Model Intercomparison Project (CMIP5). We describe how the CIM has been used in experiments to describe model coupling properties and describe the near-term expected evolution of the CIM.

Relevance:

100.00%

Publisher:

Abstract:

Health care provision is significantly impacted by the ability of health providers to engineer a viable healthcare space that supports care stakeholders' needs. In this paper we discuss and propose the use of organisational semiotics as a set of methods to link stakeholders to systems, which allows us to capture clinician activity, information transfer, and building use; this in turn allows us to define the value of specific systems in the care environment to specific stakeholders, and the dependence between systems in a care space. We suggest the use of a semantically enhanced building information model (BIM) to support the linking of clinician activity to physical resource objects and space, and to facilitate the capture of quantifiable data, over time, concerning resource use by key stakeholders. Finally we argue for the inclusion of appropriate stakeholder feedback and persuasive mechanisms to incentivise building-user behaviour that supports organisational-level sustainability policy.

Relevance:

100.00%

Publisher:

Abstract:

Methods for recombinant production of eukaryotic membrane proteins that yield sufficient quantity and quality of protein for structural biology remain a challenge. We describe here the optimisation of expression and purification of the human SERCA2a cardiac isoform of the Ca2+-translocating ATPase, using Saccharomyces cerevisiae as the heterologous expression system of choice. Two different expression vectors were utilised, allowing expression of C-terminal fusion proteins with a biotinylation domain or a GFP-His8 tag. Solubilised membrane fractions containing the protein of interest were purified on Streptavidin-Sepharose, Ni-NTA or Talon resin, depending on the fusion tag present. Biotinylated protein was detected using a specific antibody directed against SERCA2 and, advantageously, the GFP-His8 fusion protein was easily traced during the purification steps using in-gel fluorescence. Importantly, Talon resin affinity purification proved more specific than Ni-NTA resin for the GFP-His8-tagged protein, providing better separation of the oligomers present during size-exclusion chromatography. The optimised method for expression and purification of human cardiac SERCA2a reported herein yields purified protein (>90%) that displays calcium-dependent, thapsigargin-sensitive activity and is suitable for further biophysical, structural and physiological studies. This work provides support for the use of Saccharomyces cerevisiae as a suitable expression system for recombinant production of multi-domain eukaryotic membrane proteins.

Relevance:

100.00%

Publisher:

Abstract:

The human ROCO proteins are a family of multi-domain proteins sharing a conserved ROC-COR supra-domain. The family has four members: leucine-rich repeat kinase 1 (LRRK1), leucine-rich repeat kinase 2 (LRRK2), death-associated protein kinase 1 (DAPK1) and malignant fibrous histiocytoma amplified sequences with leucine-rich tandem repeats 1 (MASL1). Previous studies of LRRK1/2 and DAPK1 have shown that the ROC (Ras of complex proteins) domain can bind and hydrolyse GTP, but the cellular consequences of this activity are still unclear. Here, the first biochemical characterization of MASL1 and the impact of GTP binding on MASL1 complex formation are reported. The results demonstrate that MASL1, similar to other ROCO proteins, can bind guanosine nucleotides via its ROC domain. Furthermore, MASL1 exists in two distinct cellular complexes associated with heat shock protein 60, and the formation of a low molecular weight pool of MASL1 is modulated by GTP binding. Finally, loss of GTP enhances MASL1 toxicity in cells. Taken together, these data point to a central role for the ROC/GTPase domain of MASL1 in the regulation of its cellular function.

Relevance:

100.00%

Publisher:

Abstract:

Concentrated solar power systems are expected to be sited in desert locations where the direct normal irradiation is above 1800 kWh/m² per year. These systems include large solar collector assemblies, which account for a significant share of the investment cost. Solar reflectors are the main components of these solar collector assemblies, and dust/sand storms may affect their reflectance properties, either by soiling or by surface abrasion. While soiling can be reverted by cleaning, surface abrasion is a non-reversible degradation.

The aim of this project was to study the accelerated aging of second-surface silvered thick-glass solar reflectors under simulated sandstorm conditions and to develop a multi-parametric model relating the specular reflectance loss to the dust/sand storm parameters: wind velocity, dust concentration and time of exposure. This project focused on the degradation caused by surface abrasion.

Sandstorm conditions were simulated in a prototype environmental test chamber. Material samples (6 cm × 6 cm) were exposed to Arizona coarse test dust. The dust stream impacted these material samples at a perpendicular angle. Both wind velocity and dust concentration were maintained at a stable level for each accelerated aging test. The total exposure time in the test chamber was limited to 1 hour. Each accelerated aging test was interrupted every 4 minutes to measure the specular reflectance of the material sample after cleaning.

The accelerated aging test campaign had to be aborted prematurely due to contamination of the dust concentration sensor. A robust multi-parametric degradation model could thus not be derived. The experimental data showed that the specular reflectance decreased either linearly or exponentially with exposure time, so that a degradation rate could be defined as a single modelling parameter. A correlation should be derived to relate this degradation rate to control parameters such as wind velocity and dust/sand concentration.

The sandstorm chamber design would have to be updated before performing further accelerated aging test campaigns. The design upgrade should improve both the reliability of the test equipment and the repeatability of accelerated aging tests. An outdoor exposure test campaign should be launched in deserts to learn more about the intensity, frequency and duration of dust/sand storms. This campaign would also serve to correlate the results of outdoor exposure tests with accelerated exposure tests in order to develop a robust service lifetime prediction model for different types of solar reflector materials.
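The single-parameter fitting step described above can be sketched as follows. This is an illustration of the modelling idea, not the project's analysis: it fits the reflectance-versus-time series both directly (linear trend) and on a log scale (exponential trend), and reports the better-fitting model's rate. The data are synthetic.

```python
# Stdlib least-squares sketch: pick linear vs exponential degradation and
# report the rate as the single modelling parameter.
import math

def linfit(xs, ys):
    """Ordinary least squares y = a + b*x; returns (a, b, sum of sq. residuals)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, sse

def degradation_rate(minutes, reflectance):
    """Fit reflectance directly (linear loss) and log-reflectance
    (exponential loss); return the better parameterisation's rate.
    NB: comparing residuals across the two scales is a simplification."""
    _, b_lin, sse_lin = linfit(minutes, reflectance)
    _, b_exp, sse_exp = linfit(minutes, [math.log(r) for r in reflectance])
    return ("linear", -b_lin) if sse_lin <= sse_exp else ("exponential", -b_exp)

t = [0, 4, 8, 12, 16, 20]                       # every-4-minute measurements
r = [0.95 * math.exp(-0.01 * ti) for ti in t]   # synthetic exponential decay
model, rate = degradation_rate(t, r)
```

The correlation the abstract calls for would then regress this fitted rate against wind velocity and dust concentration across test runs.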

Relevance:

100.00%

Publisher:

Abstract:

Most water distribution systems (WDS) need rehabilitation due to aging infrastructure, which leads to decreasing capacity, increasing leakage and consequently low performance of the WDS. An appropriate strategy specifying the location and timing of pipeline rehabilitation in a WDS within a limited budget is the main challenge, and has been addressed frequently by researchers and practitioners. On the other hand, the selection of an appropriate rehabilitation technique and material type is another main issue which has yet to be addressed properly. The latter can affect the environmental impacts of a rehabilitation strategy meeting the challenges of global warming mitigation and consequent climate change. This paper presents a multi-objective optimization model for rehabilitation strategy in WDS addressing the abovementioned criteria, mainly focused on greenhouse gas (GHG) emissions, either directly from fossil fuel and electricity or indirectly from the embodied energy of materials. The objective functions are to minimise: (1) the total cost of rehabilitation, including capital and operational costs; (2) the leakage amount; and (3) GHG emissions. The Pareto-optimal front containing optimal solutions is determined using the Non-dominated Sorting Genetic Algorithm (NSGA-II). Decision variables in this optimisation problem are classified into two groups: (1) the percentage proportion of each rehabilitation technique each year; (2) the material types of new pipeline for rehabilitation each year. The rehabilitation techniques used here include replacement, rehabilitation and lining, cleaning, and pipe duplication. The developed model is demonstrated through its application to the Mahalat WDS, located in the central part of Iran. The rehabilitation strategy is analysed for a 40-year planning horizon. A number of conventional techniques for selecting pipes for rehabilitation are analysed in this study. The results show that the optimal rehabilitation strategy considering GHG emissions is able to successfully save total expenses and efficiently decrease the leakage amount from the WDS whilst meeting environmental criteria.
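The Pareto machinery underlying NSGA-II can be sketched compactly. The snippet below is a minimal illustration, not the paper's implementation: it extracts the non-dominated front from candidate strategies scored on the three objectives (cost, leakage, GHG emissions), all to be minimised; the numbers are invented.

```python
# Pareto-dominance filter: the core relation behind non-dominated sorting.
def dominates(a, b):
    """True if solution a is at least as good as b on every objective and
    strictly better on at least one (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Solutions not dominated by any other candidate."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# (cost, leakage, GHG emissions) for five hypothetical rehabilitation strategies
candidates = [(10, 5, 3), (8, 6, 4), (12, 4, 2), (9, 7, 5), (11, 5, 3)]
front = pareto_front(candidates)
# (9, 7, 5) is dominated by (8, 6, 4); (11, 5, 3) by (10, 5, 3)
```

NSGA-II applies this relation repeatedly, ranking the population into successive fronts and using crowding distance to keep the retained front well spread.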

Relevance:

100.00%

Publisher:

Abstract:

The goal of this paper is to show the possibility of a non-monotone relation between coverage and risk, which has been considered in the literature on insurance models since the work of Rothschild and Stiglitz (1976). We present an insurance model where the insured agents have heterogeneity in risk aversion and in lenience (a prevention cost parameter). Risk aversion is described by a continuous parameter which is correlated with lenience and, for the sake of simplicity, we assume perfect correlation. In the case of positive correlation, the more risk-averse agent has a higher cost of prevention, leading to a higher demand for coverage. Equivalently, the single crossing property (SCP) is valid and implies a positive correlation between coverage and risk in equilibrium. On the other hand, if the correlation between risk aversion and lenience is negative, not only may the SCP be broken, but also the monotonicity of contracts, i.e., the prediction that high (low) risk-averse types choose full (partial) insurance. In both cases riskiness is monotonic in risk aversion, but in the latter case there are some coverage levels associated with two different risks (low and high), which implies that the ex-ante (with respect to the risk aversion distribution) correlation between coverage and riskiness may have any sign (even though the ex-post correlation is always positive). Moreover, using another instrument (a proxy for riskiness), we give a testable implication to disentangle single crossing and non-single crossing under an ex-post zero correlation result: the monotonicity of coverage as a function of riskiness. Since, by controlling for risk aversion (no asymmetric information), coverage is a monotone function of riskiness, this also gives a test for asymmetric information. Finally, we relate these theoretical results to empirical tests in the recent literature, especially the Dionne, Gouriéroux and Vanasse (2001) work. In particular, they found empirical evidence that seems to be compatible with asymmetric information and non-single crossing in our framework. More generally, we build a hidden information model showing how omitted variables (asymmetric information) can bias the sign of the correlation of equilibrium variables conditioning on all observable variables. We show that this may be the case when the omitted variables have a non-monotonic relation with the observable ones. Moreover, because this non-monotonic relation is deeply related to the failure of the SCP in one-dimensional screening problems, the existing literature on asymmetric information does not capture this feature. Hence, our main result is to point out the importance of the SCP in testing predictions of hidden information models.

Relevance:

100.00%

Publisher:

Abstract:

The goal of this paper is to show the possibility of a non-monotone relation between coverage and risk, which has been considered in the literature on insurance models since the work of Rothschild and Stiglitz (1976). We present an insurance model where the insured agents have heterogeneity in risk aversion and in lenience (a prevention cost parameter). Risk aversion is described by a continuous parameter which is correlated with lenience and, for the sake of simplicity, we assume perfect correlation. In the case of positive correlation, the more risk-averse agent has a higher cost of prevention, leading to a higher demand for coverage. Equivalently, the single crossing property (SCP) is valid and implies a positive correlation between coverage and risk in equilibrium. On the other hand, if the correlation between risk aversion and lenience is negative, not only may the SCP be broken, but also the monotonicity of contracts, i.e., the prediction that high (low) risk-averse types choose full (partial) insurance. In both cases riskiness is monotonic in risk aversion, but in the latter case there are some coverage levels associated with two different risks (low and high), which implies that the ex-ante (with respect to the risk aversion distribution) correlation between coverage and riskiness may have any sign (even though the ex-post correlation is always positive). Moreover, using another instrument (a proxy for riskiness), we give a testable implication to disentangle single crossing and non-single crossing under an ex-post zero correlation result: the monotonicity of coverage as a function of riskiness. Since, by controlling for risk aversion (no asymmetric information), coverage is a monotone function of riskiness, this also gives a test for asymmetric information. Finally, we relate these theoretical results to empirical tests in the recent literature, especially the Dionne, Gouriéroux and Vanasse (2001) work. In particular, they found empirical evidence that seems to be compatible with asymmetric information and non-single crossing in our framework. More generally, we build a hidden information model showing how omitted variables (asymmetric information) can bias the sign of the correlation of equilibrium variables conditioning on all observable variables. We show that this may be the case when the omitted variables have a non-monotonic relation with the observable ones. Moreover, because this non-monotonic relation is deeply related to the failure of the SCP in one-dimensional screening problems, the existing literature on asymmetric information does not capture this feature. Hence, our main result is to point out the importance of the SCP in testing predictions of hidden information models.
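The single crossing property invoked in this abstract has a standard Spence–Mirrlees formulation, which can be stated as follows (our notation, not necessarily the paper's): for utility u(c, p; θ) over coverage c and premium p, the marginal rate of substitution between coverage and premium is monotone in the type θ, so any two types' indifference curves cross at most once.

```latex
% Single crossing (Spence-Mirrlees) condition, one common statement:
% the willingness to pay for coverage is everywhere increasing in the type.
\[
\frac{\partial}{\partial \theta}
\left( \frac{\partial u/\partial c}{-\,\partial u/\partial p} \right) > 0
\qquad \text{for all } (c, p, \theta).
\]
```

What matters for the screening results is that this expression keeps a uniform sign; the paper's point is precisely what can fail when it does not.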

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to apply factor analysis to describe lactation curves in dairy buffaloes in order to estimate the phenotypic and genetic association between common latent factors and cumulative milk yield. A total of 31 257 monthly test-day milk yield records from buffaloes belonging to herds located in the state of São Paulo were used to estimate two common latent factors, which were then analysed in a multi-trait animal model for estimating genetic parameters. Estimates of (co)variance components for the two common latent factors and cumulative 270-d milk yield were obtained by Bayesian inference using a multiple-trait animal model. Contemporary group, number of milkings per day (two levels) and age of buffalo cow at calving (as a linear and quadratic covariate) were included in the model as fixed effects. The additive genetic, permanent environmental and residual effects were included as random effects. The first common latent factor (F1) was associated with persistency of lactation and the second common latent factor (F2) with the level of production in early lactation. Heritability estimates for F1 and F2 were 0.12 and 0.07, respectively. Genetic correlation estimates between F1 and F2 with cumulative milk yield were positive and moderate (0.63 and 0.52). Multivariate statistics employing factor analysis allowed the extraction of two variables (latent factors) that described the shape of the lactation curve. It is expected that the response to selection to increase lactation persistency is higher than the response obtained from selecting animals to increase lactation peak. Selection for higher total milk yield would result in a favourable correlated response, increasing both the level of production in early lactation and lactation persistency.
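The quantities the two latent factors summarise can be illustrated with a toy numeric sketch. This is not the paper's Bayesian multi-trait model: the latent factors are replaced here by two crude descriptive statistics (early-lactation level and a persistency ratio), and the test-day yields are invented.

```python
# Toy summaries of a nine-point monthly lactation curve (approx. 270 days).
def lactation_summary(test_days):
    """test_days: nine monthly test-day yields (kg/day)."""
    cumulative = sum(y * 30 for y in test_days)   # approx. cumulative 270-d yield
    early_level = sum(test_days[:3]) / 3          # crude analogue of factor F2
    persistency = test_days[-1] / max(test_days)  # crude analogue of factor F1
    return cumulative, early_level, persistency

# A persistent curve vs. a steeply declining one with a similar peak:
flat = [8, 9, 9, 9, 8, 8, 8, 7, 7]
steep = [9, 10, 9, 7, 6, 5, 4, 3, 2]
cum_flat, _, pers_flat = lactation_summary(flat)
cum_steep, _, pers_steep = lactation_summary(steep)
# The persistent curve accumulates more milk despite a slightly lower peak,
# consistent with persistency being favourably related to total yield.
```

The actual study estimates such curve-shape variables jointly with cumulative yield in an animal model, which is what yields the heritabilities and genetic correlations quoted above.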

Relevance:

100.00%

Publisher:

Abstract:

Based on cognitive semantics, we present in this work an analysis of the meaning, in context, of the modal auxiliaries poder, precisar and dever. We analysed 120 texts produced by university entrance (vestibular) candidates and by primary school pupils in answer to question three of the discursive Portuguese language exam of the 2005 UFRN vestibular, which asks candidates to explain the difference in meaning between three sentences by observing the use of these three verbs. We take the view that a lexical item is not tied to a fixed, limited and unique semantic linguistic representation, but rather to a flexible and open one that provides access to many conceptions and conceptual systems, depending on each given context. On the basis of its meaning, a lexical item evokes a group of cognitive domains, which in turn present a certain conceptual content. This implies that the network of lexical meanings varies according to each person's knowledge of the world (LANGACKER, 2000). The relevance of this work is that it contributes to the semantic description of Portuguese.