880 results for Functional Requirements for Authority Data (FRAD)


Relevance:

30.00%

Publisher:

Abstract:

Dendritic cells (DCs) play an important role in the clearance of apoptotic cells. The removal of apoptotic cells leads to peripheral tolerance, although their role in this process is still not clear. We show that the uptake of apoptotic thymocytes by DCs converts these cells into tolerogenic DCs resistant to maturation by lipopolysaccharide, modulating the production of interleukin-12 and up-regulating the expression of transforming growth factor-beta(1) latency-associated peptide. We also observed that DCs pulsed with apoptotic cells in the allogeneic context were more efficient in the expansion of regulatory T cells (Tregs), and that this expansion requires contact between DCs and T cells. The Tregs sorted from the in vitro culture suppressed the proliferation of splenocytes in vitro in both a specific and a non-specific manner. In the in vivo model, the transfer of CD4+ CD25- cells to nude mice induced autoimmunity, with cell infiltrates found in the stomach, colon, liver and kidneys. The co-transfer of CD4+ CD25- and CD4+ CD25+ cells prevented cell infiltrates in several organs and increased the total cell count in lymph nodes. Our data indicate that apoptotic cells have an important role in peripheral tolerance via the induction of tolerogenic DCs and of CD4+ CD25+ Foxp3+ cells with regulatory functions.

Relevance:

30.00%

Publisher:

Abstract:

The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as the programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, the requirements and specification solutions need to be flexible, modular, and independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using Extensible Markup Language (XML) technology. Communication between clients and servers uses remote procedure calls (RPC) based on XML (RPC-XML technology). The integration of the Java language with XML and RPC-XML technologies makes it easy to develop a standard data and communication access layer between users and laboratories, using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides simple graphical user interface (GUI) access. The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
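
As a rough illustration of the RPC-over-XML access layer described in this abstract, the sketch below shows a client-side call using Python's standard xmlrpc.client module. The endpoint URL, the get_signal method and its return structure are assumptions made for illustration only, not the actual TCABR interface.

```python
# Hypothetical XML-RPC client sketch; the endpoint and remote method names
# are illustrative assumptions, not the TCABR remote-participation API.
import xmlrpc.client

def fetch_signal(server_url: str, shot: int, signal_name: str):
    """Request one diagnostic signal for a given shot from a remote laboratory server."""
    proxy = xmlrpc.client.ServerProxy(server_url, allow_none=True)
    # The remote side is assumed to expose get_signal(shot, name) and return,
    # e.g., {"time": [...], "data": [...]} serialized via XML-RPC.
    return proxy.get_signal(shot, signal_name)

if __name__ == "__main__":
    # Requires a running XML-RPC server at this (hypothetical) endpoint.
    payload = fetch_signal("http://localhost:8080/RPC2", 12345, "plasma_current")
    print(len(payload["data"]), "samples received")
```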

Relevance:

30.00%

Publisher:

Abstract:

The structural and electronic properties of the perylene diimide liquid crystal PPEEB are studied using ab initio methods based on density functional theory (DFT). Using available experimental crystallographic data as a guide, we propose a detailed structural model for the packing of solid PPEEB. We find that, due to the localized nature of the band-edge wave function, theoretical approaches beyond the standard method, such as the hybrid functional PBE0, are required to correctly characterize the band structure of this material. Moreover, unlike previous assumptions, we observe the formation of hydrogen bonds between the side chains of different molecules, which leads to a dispersion of the energy levels. This result indicates that the side chains of the molecular crystal are not only responsible for its structural conformation but can also be used for tuning the electronic and optical properties of these materials.
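
As a loose illustration of the methodological point (a hybrid functional such as PBE0 versus a standard semilocal functional), the sketch below compares HOMO-LUMO gaps for a small molecule using the PySCF library. It is a toy example under assumed settings (water molecule, def2-SVP basis), not the periodic band-structure calculation performed for PPEEB.

```python
# Toy comparison of a GGA (PBE) and a hybrid (PBE0) frontier-orbital gap.
# Requires the PySCF library; molecule and basis are arbitrary illustrative choices.
from pyscf import gto, dft

mol = gto.M(atom="O 0 0 0; H 0 0.757 0.587; H 0 -0.757 0.587", basis="def2-svp")

def homo_lumo_gap(xc: str) -> float:
    mf = dft.RKS(mol)
    mf.xc = xc
    mf.kernel()
    occupied = mf.mo_occ > 0
    homo = mf.mo_energy[occupied].max()
    lumo = mf.mo_energy[~occupied].min()
    return (lumo - homo) * 27.2114  # Hartree -> eV

for functional in ("pbe", "pbe0"):
    print(functional, f"gap = {homo_lumo_gap(functional):.2f} eV")
```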

Relevance:

30.00%

Publisher:

Abstract:

Ground-state energies for antiferromagnetic Heisenberg models with exchange anisotropy are estimated by means of a local-spin approximation made in the context of density functional theory. The correlation energy is obtained using non-linear spin-wave theory for homogeneous systems, from which the spin functional is built. Although applicable to chains of any size, the results are shown for a small number of sites, to exhibit finite-size effects and to allow comparison with exact numerical data from direct diagonalization of small chains. (C) 2009 Elsevier B.V. All rights reserved.
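
For comparison with the exact numerical data mentioned above, the following sketch performs a direct diagonalization of a small anisotropic (XXZ-type) antiferromagnetic spin-1/2 Heisenberg chain with NumPy. The chain length, anisotropy value and open boundary conditions are illustrative assumptions, not the parameters used in the paper.

```python
# Exact diagonalization of a small spin-1/2 XXZ Heisenberg chain,
# H = J * sum_i [Sx_i Sx_{i+1} + Sy_i Sy_{i+1} + Delta * Sz_i Sz_{i+1}],
# returning the ground-state energy per site (illustrative parameters).
import numpy as np

sx = np.array([[0, 0.5], [0.5, 0]])
sy = np.array([[0, -0.5j], [0.5j, 0]])
sz = np.array([[0.5, 0], [0, -0.5]])
identity = np.eye(2)

def site_operator(op, site, n):
    """Embed a single-site operator at position `site` in an n-site chain."""
    mats = [identity] * n
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def xxz_hamiltonian(n=8, J=1.0, delta=0.5):
    dim = 2 ** n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):  # open boundary conditions
        H += J * (site_operator(sx, i, n) @ site_operator(sx, i + 1, n)
                  + site_operator(sy, i, n) @ site_operator(sy, i + 1, n)
                  + delta * site_operator(sz, i, n) @ site_operator(sz, i + 1, n))
    return H

if __name__ == "__main__":
    n = 8
    e0 = np.linalg.eigvalsh(xxz_hamiltonian(n=n)).min()
    print(f"ground-state energy per site (N={n}): {e0 / n:.4f}")
```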

Relevance:

30.00%

Publisher:

Abstract:

Lectins are classified into a structurally diverse group of proteins that bind carbohydrates and glycoconjugates with high specificity. They are extremely useful molecules in the characterization of saccharides, as drug delivery mediators, and even as cellular surface markers. In this study, we present camptosemin, a new lectin from Camptosema ellipticum. It was characterized as an N-acetyl-D-galactosamine-binding homo-tetrameric lectin, with a molecular weight of around 26 kDa per monomer. The monomers were stable over a wide range of pH values and exhibited pH-dependent oligomerization. Camptosemin promoted adhesion of breast cancer cells and hemagglutination, and both activities were inhibited by its binding sugar. The stability and unfolding/folding behavior of this lectin were characterized using fluorescence and far-UV circular dichroism spectroscopies. The results indicate that chemical unfolding of camptosemin proceeds as a two-state monomer-tetramer process. In addition, small-angle X-ray scattering shows that camptosemin behaves as a soluble and stable homo-tetramer in solution.

Relevance:

30.00%

Publisher:

Abstract:

We investigated the temporal dynamics and changes in connectivity in the mental rotation network through the application of spatio-temporal support vector machines (SVMs). The spatio-temporal SVM [Mourao-Miranda, J., Friston, K. J., et al. (2007). Dynamic discrimination analysis: A spatial-temporal SVM. Neuroimage, 36, 88-99] is a pattern recognition approach that is suitable for investigating dynamic changes in the brain network during a complex mental task. It does not require a model describing each component of the task or the precise shape of the BOLD impulse response. By defining a time window including a cognitive event, one can use spatio-temporal fMRI observations from two cognitive states to train the SVM. During training, the SVM finds the discriminating pattern between the two states and produces a discriminating weight vector encompassing both voxels and time (i.e., spatio-temporal maps). We showed that, by applying the spatio-temporal SVM to an event-related mental rotation experiment, it is possible to discriminate between different degrees of angular disparity (0 degrees vs. 20 degrees, 0 degrees vs. 60 degrees, and 0 degrees vs. 100 degrees), and that the discrimination accuracy is correlated with the difference in angular disparity between the conditions. For the comparison with the highest accuracy (0 degrees vs. 100 degrees), we evaluated how the most discriminating areas (visual regions, parietal regions, supplementary motor, and premotor areas) change their behavior over time. The frontal premotor regions became highly discriminating earlier than the superior parietal cortex. There seems to be a parcellation of the parietal regions, with an earlier discrimination of the inferior parietal lobe in mental rotation relative to the superior parietal lobe. The SVM also identified a network of regions that showed a decrease in BOLD responses during the 100 degrees condition relative to the 0 degrees condition (posterior cingulate, frontal, and superior temporal gyrus). This network was also highly discriminating between the two conditions. In addition, we investigated changes in functional connectivity between the most discriminating areas identified by the spatio-temporal SVM. We observed an increase in functional connectivity between almost all areas activated during the 100 degrees condition (bilateral inferior and superior parietal lobe, bilateral premotor area, and SMA), but not between the areas that showed a decrease in BOLD response during the 100 degrees condition.
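
A minimal sketch of the general idea follows, under the assumption of a linear SVM trained on flattened space-by-time windows, with synthetic data standing in for preprocessed fMRI volumes. This is not the authors' pipeline; it only illustrates training a classifier on spatio-temporal observations and reading back a spatio-temporal weight map.

```python
# Sketch: train a linear SVM on flattened spatio-temporal windows and
# reshape its weights into a (time x voxel) discriminating map.
# The data are synthetic placeholders, not fMRI volumes.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_timepoints, n_voxels = 40, 6, 500

# Two synthetic conditions (e.g., 0 deg vs. 100 deg angular disparity).
X = rng.normal(size=(2 * n_trials, n_timepoints, n_voxels))
X[n_trials:, 3:, :50] += 0.5          # a late, localized "activation" difference
y = np.repeat([0, 1], n_trials)

clf = LinearSVC(C=1.0, max_iter=10000)
clf.fit(X.reshape(len(y), -1), y)

weight_map = clf.coef_.reshape(n_timepoints, n_voxels)   # spatio-temporal map
print("time point with the largest discriminating weight:",
      np.abs(weight_map).sum(axis=1).argmax())
```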

Relevance:

30.00%

Publisher:

Abstract:

Raman and IR experiments have been carried out on formamide (FA) and pyridine (Py) mixtures at different compositions. The appearance of a new Raman band at 996 cm(-1) (nu(1) region of Py), whose intensity depends on the FA concentration, is assigned to an FA:Py adduct, and this result is in excellent agreement with those of other authors who employed noisy light-based coherent Raman scattering spectroscopy (I(2)CARS). Another band at 1587 cm(-1) (nu(8) region of Py) has been observed for the first time by using Raman and IR spectroscopies. Its intensity shows the same dependence on the FA concentration, which allows us to also attribute it to an FA:Py adduct. The good agreement between the Raman and IR data demonstrates the potential of vibrational spectroscopy for this kind of study. Owing to its higher absolute Raman scattering cross section, the nu(1) region of Py was chosen for the quantitative analysis, and a 1:1 FA:Py stoichiometry is reported. The experimental data are very well supported by density functional theory (DFT) calculations, which were employed for the first time for the present system. Furthermore, the present investigation shows excellent agreement with results reported from computational calculations for similar systems. A comparison with our previous studies confirms that the solvent dielectric constant determines the stoichiometry of a given Lewis acid-base adduct in the infinite dilution limit. Copyright (C) 2009 John Wiley & Sons, Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Two-dimensional (2D) and three-dimensional (3D) quantitative structure-activity relationship studies were performed on a series of diarylpyridines that act as cannabinoid receptor ligands, by means of hologram quantitative structure-activity relationship (HQSAR) and comparative molecular field analysis (CoMFA) methods. The quantitative structure-activity relationship models were built using a data set of 52 CB1 ligands that can be used as anti-obesity agents. Significant correlation coefficients (HQSAR: r² = 0.91, q² = 0.78; CoMFA: r² = 0.98, q² = 0.77) were obtained, indicating the predictive potential of these 2D and 3D models for untested compounds. The models were then used to predict the potency of an external test set, and the predicted (calculated) values are in good agreement with the experimental results. The final quantitative structure-activity relationship models, along with the information obtained from the 2D contribution maps and 3D contour maps obtained in this study, are useful tools for the design of novel CB1 ligands with improved anti-obesity potency.
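
The q² values quoted above are cross-validated correlation coefficients. The sketch below shows, for a simple linear model on synthetic descriptors (placeholders, not the HQSAR or CoMFA fields used in the study), how a leave-one-out q² can be computed alongside the fitted r², which is the kind of internal validation these models report.

```python
# Sketch: fitted r^2 versus leave-one-out cross-validated q^2 for a QSAR-style
# regression. Descriptors and activities are synthetic placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n_compounds, n_descriptors = 52, 20
X = rng.normal(size=(n_compounds, n_descriptors))
y = X[:, :3].sum(axis=1) + 0.3 * rng.normal(size=n_compounds)  # pseudo activities

model = Ridge(alpha=1.0)
r2 = r2_score(y, model.fit(X, y).predict(X))                        # fitted r^2
q2 = r2_score(y, cross_val_predict(model, X, y, cv=LeaveOneOut()))  # LOO q^2
print(f"r2 = {r2:.2f}, q2 = {q2:.2f}")
```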

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study is to evaluate the variation of solar radiation data between different data sources that will be freely available at the Solar Energy Research Center (SERC). The comparison between data sources is carried out for two locations: Stockholm, Sweden and Athens, Greece. For each location, data are gathered for different tilt angles: 0°, 30°, 45° and 60°, facing south. The full dataset is available in two Excel files: “Stockholm annual irradiation” and “Athens annual irradiation”. The World Radiation Data Center (WRDC) is used as the reference for the comparison with the other datasets, because it has the longest recorded time span for Stockholm (1964–2010) and Athens (1964–1986), in the form of average monthly irradiation, expressed in kWh/m². The indicator defined for the data comparison is the estimated standard deviation. The mean biased error (MBE) and the root mean square error (RMSE) were also used as statistical indicators for the horizontal solar irradiation data. The variation in solar irradiation data arises from natural or inter-annual variability, from differences between data sources, and from differences between calculation models. The inter-annual variation is 140.4 kWh/m² (14.4%) for Stockholm and 124.3 kWh/m² (8.0%) for Athens. The estimated deviation for horizontal solar irradiation is 3.7% for Stockholm and 4.4% for Athens. This estimated deviation is respectively 4.5% and 3.6% for Stockholm and Athens at 30° tilt, 5.2% and 4.5% at 45° tilt, and 5.9% and 7.0% at 60° tilt. NASA's SSE, SAM and RETScreen exhibited the highest deviation from the WRDC data for Stockholm, and Satel-light for Athens. The essential source of variation is notably the difference in horizontal solar irradiation. The variation increases by 1-2% per degree of tilt when different calculation models are used, as in PVSYST and Meteonorm. The location and altitude of the data source did not directly influence the variation with respect to the WRDC data. Further examination is suggested of: the methodology for selecting the location; the functional dependence of ground-reflected radiation on ambient temperature; the variation of ambient temperature and its impact on different solar energy systems; and the impact of variation in solar irradiation and ambient temperature on system output.
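
A short sketch of the statistical indicators named above (MBE, RMSE and an annual percentage deviation), computed for a candidate dataset against a reference such as WRDC. The monthly values and the exact deviation definition are placeholders for illustration, not the actual Stockholm or Athens data.

```python
# Sketch: compare a candidate monthly irradiation series against a reference
# (e.g., WRDC) using MBE, RMSE and the deviation of the annual total.
# All numbers below are illustrative placeholders.
import numpy as np

reference = np.array([12, 27, 65, 105, 155, 165, 160, 120, 72, 35, 14, 8.0])   # kWh/m2
candidate = np.array([13, 25, 69, 101, 160, 158, 166, 116, 75, 33, 15, 9.0])   # kWh/m2

mbe = np.mean(candidate - reference)                       # mean biased error
rmse = np.sqrt(np.mean((candidate - reference) ** 2))      # root mean square error
annual_dev = (candidate.sum() - reference.sum()) / reference.sum() * 100  # % of annual total

print(f"MBE = {mbe:+.1f} kWh/m2, RMSE = {rmse:.1f} kWh/m2, "
      f"annual deviation = {annual_dev:+.2f}%")
```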

Relevance:

30.00%

Publisher:

Abstract:

The practitioners of bioinformatics require increasing sophistication from their software tools to take into account the particular characteristics that make their domain complex. For example, there is great variation in the experience of researchers, from novices, who would like guidance from experts on the best resources to use, to experts, who wish to take greater management control of the tools used in their experiments. Also, the range of available, and conflicting, data formats is growing, and there is a desire to automate the many trivial manual stages of in-silico experiments. Agent-oriented software development is one approach to tackling the design of complex applications. In this paper, we argue that agent-oriented development is, in fact, a particularly well-suited approach to developing bioinformatics tools that take into account the wider domain characteristics. To illustrate this, we design a data curation tool, which manages the format of experimental data, extend it to better account for the extra requirements placed by the domain characteristics, and show how these characteristics lead to a system well suited to an agent-oriented view.
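
As a loose illustration of the data-curation idea (an agent that watches over the format of experimental data), the sketch below defines a hypothetical agent that detects a file's format and normalizes it to a preferred one. The class, method and format choices are invented for illustration and do not come from the paper.

```python
# Hypothetical sketch of a format-curation "agent": inspect an incoming
# experimental data file and normalize it to a preferred format.
# Names and formats are illustrative only.
import csv
import json
from pathlib import Path

class CurationAgent:
    """Keeps experimental data in a single preferred format (here: JSON)."""

    def detect_format(self, path: Path) -> str:
        return "json" if path.suffix == ".json" else "csv"

    def act(self, path: Path) -> Path:
        if self.detect_format(path) == "json":
            return path                       # already in the preferred format
        with path.open(newline="") as handle:
            records = list(csv.DictReader(handle))
        target = path.with_suffix(".json")
        target.write_text(json.dumps(records, indent=2))
        return target

if __name__ == "__main__":
    sample = Path("example_experiment.csv")
    sample.write_text("sample,value\nA,1.0\nB,2.5\n")   # create a toy input file
    print(CurationAgent().act(sample))
```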

Relevance:

30.00%

Publisher:

Abstract:

In e-Science experiments, it is vital to record the experimental process for later use, such as interpreting results, verifying that the correct process took place, or tracing where data came from. The process that led to some data is called the provenance of that data, and a provenance architecture is the software architecture for a system that provides the functionality needed to record, store and use process documentation. However, there has been little principled analysis of what is actually required of a provenance architecture, so it is impossible to determine the functionality such an architecture would ideally support. In this paper, we present use cases for a provenance architecture from current experiments in biology, chemistry, physics and computer science, and analyse the use cases to determine the technical requirements of a generic, technology- and application-independent architecture. We propose an architecture that meets these requirements and evaluate a preliminary implementation by attempting to realise two of the use cases.
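
To make "process documentation" concrete, here is a hypothetical, minimal record structure of the kind a provenance store might keep for each step of an experiment, together with a toy query that traces where a data item came from. The field names are assumptions for illustration, not the schema proposed in the paper.

```python
# Hypothetical minimal record of one step of an experimental process,
# of the kind a provenance store would record, store and later query.
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class ProcessRecord:
    actor: str                      # service or person that performed the step
    activity: str                   # what was done (e.g., "sequence_alignment")
    inputs: list[str]               # identifiers of data consumed
    outputs: list[str]              # identifiers of data produced
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

store: list[ProcessRecord] = []

def record(rec: ProcessRecord) -> None:
    store.append(rec)

def provenance_of(data_id: str) -> list[dict]:
    """Return the recorded steps that produced a given data item."""
    return [asdict(r) for r in store if data_id in r.outputs]

record(ProcessRecord("aligner-service", "sequence_alignment",
                     inputs=["seq:42"], outputs=["aln:7"]))
print(provenance_of("aln:7"))
```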

Relevance:

30.00%

Publisher:

Abstract:

From where did this tweet originate? Was this quote from the New York Times modified? Daily, we rely on data from the Web, but it is often difficult or impossible to determine where it came from or how it was produced. This lack of provenance is particularly evident when people and systems deal with Web information, or with any environment where information comes from sources of varying quality. Provenance is not captured pervasively in information systems. There are major technical, social, and economic impediments that stand in the way of using provenance effectively. This paper synthesizes requirements for provenance on the Web across a number of dimensions, focusing on three key aspects of provenance: the content of provenance, the management of provenance records, and the uses of provenance information. To illustrate these requirements, we use three synthesized scenarios that encompass provenance problems faced by Web users today.

Relevance:

30.00%

Publisher:

Abstract:

The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modelled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design where the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats when that is the preferred access arrangement for the researcher. By decoupling the data model from data persistence, it is much easier to use, for instance, relational databases interchangeably, to provide stricter provenance and audit-trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text-based formats are usually inadequate. A schema derived from the CF conventions has been designed to efficiently handle time series for SWIFT.
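
A minimal sketch of the decoupling described above: the in-memory model configuration is one object, and persistence backends (JSON here; a database-backed store in an operational setting) sit behind a small, swappable interface. Class and field names are illustrative assumptions, not the SWIFT API.

```python
# Sketch: model configuration decoupled from its on-disk persistence.
# A JSON backend suits research use; an operational deployment could swap in
# a database-backed store implementing the same interface. Names are illustrative.
import json
from dataclasses import dataclass, asdict
from pathlib import Path
from typing import Protocol

@dataclass
class CatchmentConfig:
    name: str
    subareas: list[str]
    timestep_seconds: int

class ConfigStore(Protocol):
    def save(self, cfg: CatchmentConfig) -> None: ...
    def load(self, name: str) -> CatchmentConfig: ...

class JsonConfigStore:
    def __init__(self, directory: Path):
        self.directory = directory

    def save(self, cfg: CatchmentConfig) -> None:
        (self.directory / f"{cfg.name}.json").write_text(json.dumps(asdict(cfg)))

    def load(self, name: str) -> CatchmentConfig:
        raw = (self.directory / f"{name}.json").read_text()
        return CatchmentConfig(**json.loads(raw))

if __name__ == "__main__":
    store = JsonConfigStore(Path("."))
    store.save(CatchmentConfig("upper_basin", ["sa1", "sa2"], 3600))
    print(store.load("upper_basin"))
```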

Relevance:

30.00%

Publisher:

Abstract:

This paper takes a first step toward a methodology to quantify the influence of regulation on short-run earnings dynamics. It also provides evidence on the patterns of wage adjustment adopted during the recent high-inflation experience in Brazil. The large variety of official wage indexation rules adopted in Brazil during recent years, combined with the availability of monthly labor market surveys, makes the Brazilian case a good laboratory for testing how regulation affects earnings dynamics. In particular, the combination of large sample sizes with the possibility of following the same worker over short periods of time makes it possible to estimate the cross-sectional distribution of longitudinal statistics based on observed earnings (e.g., monthly and annual rates of change). The empirical strategy adopted here is to compare the distributions of longitudinal statistics extracted from actual earnings data with simulations generated from the minimum adjustment requirements imposed by the Brazilian Wage Law. The analysis provides statistics on how binding the wage regulation schemes were. The visual analysis of the distribution of wage adjustments proves useful in highlighting stylized facts that may guide future empirical work.
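
A rough sketch of the empirical strategy described above: compute the distribution of observed monthly wage changes from panel data and set it against changes simulated from a minimum-adjustment (indexation) rule. The wage data and the indexation threshold below are stand-ins, not the actual Brazilian Wage Law parameters or survey data.

```python
# Sketch: distribution of observed monthly earnings changes versus changes
# simulated from a stylized minimum-adjustment (indexation) rule.
# Wages and the indexation threshold are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_workers = 10_000
monthly_indexation = 0.10              # stylized 10% minimum monthly adjustment

wage_t0 = rng.lognormal(mean=6.0, sigma=0.5, size=n_workers)
observed_change = rng.normal(loc=monthly_indexation, scale=0.05, size=n_workers)
wage_t1_observed = wage_t0 * (1 + observed_change)

obs_rates = wage_t1_observed / wage_t0 - 1                  # observed rates of change
sim_rates = np.full(n_workers, monthly_indexation)          # rule-implied minimum rates

for label, rates in (("observed", obs_rates), ("simulated minimum", sim_rates)):
    q25, q50, q75 = np.percentile(rates, [25, 50, 75])
    print(f"{label:>18}: median {q50:.3f}, IQR [{q25:.3f}, {q75:.3f}]")
print("share of observed adjustments below the minimum rule:",
      float(np.mean(obs_rates < monthly_indexation)))
```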