15 results for Information Ethics 2012

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 30.00%

Abstract:

The aim of this paper is to give an overview of the issues and actions concerning Brazilian cultural heritage and then to discuss contributions, as well as relationships, that may be established from the principles of Information Science. The first section is concerned with the relationship between heritage and the concept of document; the second relates documentary processes to the information scientist; finally, an approach to the mediation and appropriation of cultural heritage is presented.

Relevance: 30.00%

Abstract:

A favorable prognosis after tooth avulsion depends on several variables, such as the extra-alveolar period and the storage medium. Vitality of the periodontal ligament cells is considered a critical factor for a successful outcome without root resorption. The dental surgeon is provided with clinical information and radiographic findings to establish a diagnosis and may rely on currently available guidelines. Once trauma has occurred, treatment must be quick and effective, and periodic follow-up must be performed. Clinical, radiographic, and histologic characteristics of each type of root resorption due to tooth replantation are presented, with the aim of providing information for the diagnosis and treatment of healing complications.

Relevance: 30.00%

Abstract:

Objective. To compare the nutritional value of meals provided by companies participating in the Workers' Meal Program in the city of Sao Paulo, Brazil, with the nutritional recommendations and guidelines established by the Ministry of Health for the Brazilian population. Methods. The 72 companies studied were grouped according to economic sector (industrial, services, or commerce), size (micro, small, medium, or large), meal preparation modality (prepared on-site by the company itself, on-site by a hired caterer, or off-site by a hired caterer), and supervision by a dietitian (yes or no). The per capita amount of food was determined based on the lunch, dinner, and supper menus for three days. The nutritional value of the meals was defined by the amount of calories, carbohydrates, protein, total fat, polyunsaturated fat, saturated fat, trans fat, sugars, cholesterol, and fruits and vegetables. Results. Most of the menus were deficient in the number of fruits and vegetables (63.9%) and amount of polyunsaturated fat (83.3%), but high in total fat (47.2%) and cholesterol (62.5%). Group 2, composed mostly of medium and large companies, supervised by a dietitian, belonging to the industrial and/or service sectors, and using a hired caterer, on average served meals with higher calorie content (P < 0.001), a higher percentage of polyunsaturated fat (P < 0.001), more cholesterol (P = 0.015), and more fruits and vegetables (P < 0.001) than Group 1, which was composed of micro and small companies from the commercial sector that prepare the meals themselves on-site and are not supervised by a dietitian. Regarding the nutrition guidelines set for the Brazilian population, Group 2 meals were better in terms of fruit and vegetable servings (P < 0.001), while Group 1 meals were better in terms of cholesterol content (P = 0.05). Conclusions. More specific action is required targeting company officers and managers in charge of food and nutrition services, especially in companies without dietitian supervision.

Relevance: 30.00%

Abstract:

In Brazil, recent regulations require changes in private and public health systems to make special services available to deaf patients. In the present article, the researchers analyze the perceptions of 25 sign-language-using patients regarding this assistance. The researchers found communication difficulties between these patients and health services staff, as well as a culture clash and a harmful inability among the service providers to distinguish among the roles of companions, caretakers, and professional translator/interpreters. Thus, it became common for the patients to experience prejudice in the course of treatment and information exchange, damage to their autonomy, limits on their access to services, and reduced efficacy of therapy. The researchers conclude that many issues must be dealt with if such barriers to health access are to be overcome, in particular the worrying degree of exclusion of deaf patients from health care systems.

Relevance: 30.00%

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes in which both techniques are combined simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data onto these coordinates produce images we will call tomograms. The association of the tomograms (images) with eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this information is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
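The decomposition described above (reshape the cube into a pixels-by-wavelengths matrix, diagonalize, read off eigenspectra and tomograms) can be sketched on a synthetic cube. The shapes, the toy planted signal, and all variable names below are illustrative assumptions, not data or code from the paper:

```python
import numpy as np

# Synthetic data cube: two spatial axes (ny, nx) and one spectral
# axis (nl). All values here are toy assumptions for illustration.
rng = np.random.default_rng(0)
ny, nx, nl = 8, 8, 50

# One spectral component whose amplitude varies across the field,
# plus a small amount of noise.
spectrum = np.sin(np.linspace(0, np.pi, nl))
amplitude = rng.random((ny, nx))
cube = amplitude[:, :, None] * spectrum[None, None, :]
cube += 0.01 * rng.standard_normal((ny, nx, nl))

# Reshape to a (pixels x wavelengths) matrix, remove the mean
# spectrum, and run PCA via the SVD: the rows of Vt are the
# eigenvectors (eigenspectra), ordered by decreasing variance.
X = cube.reshape(ny * nx, nl)
X = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Tomograms: projections of the data onto each eigenvector,
# reshaped back into images.
tomograms = (X @ Vt.T).T.reshape(nl, ny, nx)

# Fraction of the total variance carried by each component; the
# single planted component should dominate.
variance_fraction = s**2 / np.sum(s**2)
```

Each row of `Vt` is an eigenspectrum, and the corresponding tomogram is its projection image; here the first tomogram essentially recovers the planted amplitude map.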

Relevance: 30.00%

Abstract:

The integration of nanostructured films containing biomolecules and silicon-based technologies is a promising direction for reaching miniaturized biosensors that exhibit high sensitivity and selectivity. A challenge, however, is to avoid cross talk among sensing units in an array with multiple sensors located on a small area. In this letter, we describe an array of 16 sensing units of a light-addressable potentiometric sensor (LAPS), made with layer-by-layer (LbL) films of a poly(amidoamine) dendrimer (PAMAM) and single-walled carbon nanotubes (SWNTs), coated with a layer of the enzyme penicillinase. A visual inspection of the data from constant-current measurements with liquid samples containing distinct concentrations of penicillin, glucose, or a buffer indicated possible cross talk between units that contained penicillinase and those that did not. With the use of multidimensional data projection techniques, normally employed in information visualization methods, we managed to distinguish the results from the modified LAPS, even in cases where the units were adjacent to each other. Furthermore, the plots generated with the interactive document map (IDMAP) projection technique enabled the distinction of the different concentrations of penicillin, from 5 mmol L^-1 down to 0.5 mmol L^-1. Data visualization also confirmed the enhanced performance of the sensing units containing carbon nanotubes, consistent with the analysis of results for LAPS sensors. The use of visual analytics, as with projection methods, may be essential for handling the large amount of data generated in multiple sensor arrays to achieve high performance in miniaturized systems.

Relevance: 30.00%

Abstract:

For many learning tasks, the duration of the data collection can be greater than the time scale for changes of the underlying data distribution. The question we ask is how to include the information that data are aging. Ad hoc methods to achieve this include the use of validity windows that prevent the learning machine from making inferences based on old data. This introduces the problem of how to define the size of validity windows. In this brief, a new adaptive Bayesian-inspired algorithm is presented for learning drifting concepts. It uses the analogy of validity windows in an adaptive Bayesian way to incorporate changes in the data distribution over time. We apply a theoretical approach based on information geometry to the classification problem and measure its performance in simulations. The uncertainty about the appropriate size of the memory windows is dealt with in a Bayesian manner by integrating over the distribution of the adaptive window size. Thus, the posterior distribution of the weights may develop algebraic tails. The learning algorithm results from tracking the mean and variance of the posterior distribution of the weights. It was found that the algebraic tails of this posterior distribution give the learning algorithm the ability to cope with an evolving environment by permitting escape from local traps.
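The ad hoc validity-window idea mentioned above can be illustrated with a toy drifting-mean problem. The window size, the drift model, and all names below are arbitrary assumptions for illustration, not the paper's adaptive Bayesian algorithm:

```python
from collections import deque
import random

# Toy illustration of a fixed validity window: estimate a drifting
# mean either from the full history or from a window of recent
# observations only. The window size (50) is an arbitrary choice,
# which is exactly the difficulty the adaptive approach addresses.
random.seed(0)

def stream(n):
    """Noisy observations of a mean that jumps halfway through."""
    for t in range(n):
        target = 0.0 if t < n // 2 else 5.0
        yield target + random.gauss(0, 0.5)

window = deque(maxlen=50)      # validity window: old data expire
full_history = []
for obs in stream(1000):
    window.append(obs)
    full_history.append(obs)

windowed_mean = sum(window) / len(window)
global_mean = sum(full_history) / len(full_history)
# After the jump to 5.0, the windowed estimate tracks the new mean,
# while the all-data estimate is dragged toward the old one.
print(windowed_mean, global_mean)
```

A window that is too small wastes data before the drift; one that is too large reacts slowly after it, which motivates integrating over window sizes as the abstract describes.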

Relevance: 30.00%

Abstract:

We consider bipartitions of one-dimensional extended systems whose probability distribution functions describe stationary states of stochastic models. We define estimators of the information shared between the two subsystems. If the correlation length is finite, the estimators stay finite for large system sizes. If the correlation length diverges, so do the estimators. The definition of the estimators is inspired by information theory. We look at several models and compare the behaviors of the estimators in the finite-size scaling limit. Analytical and numerical methods as well as Monte Carlo simulations are used. We show how the finite-size scaling functions change for various phase transitions, including the case where one has conformal invariance.
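As a minimal illustration of the quantity behind such estimators (not the paper's estimators themselves), the mutual information shared between two subsystems can be computed directly from a joint distribution; the 2x2 tables below are arbitrary examples:

```python
import numpy as np

# Mutual information between two subsystems X and Y from an
# explicit joint probability table. The tables are toy examples,
# not stationary states of the stochastic models in the paper.
def mutual_information(p_xy):
    """I(X;Y) in nats for a joint probability table p_xy."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float(np.sum(p_xy[mask] * np.log(p_xy[mask] / (p_x @ p_y)[mask])))

independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])   # subsystems share nothing
correlated = np.array([[0.5, 0.0],
                       [0.0, 0.5]])      # subsystems fully correlated

print(mutual_information(independent))   # 0.0
print(mutual_information(correlated))    # ln 2
```

For the extended systems in the abstract, the interesting behavior is how such a quantity scales with system size: it saturates when the correlation length is finite and diverges when it is not.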

Relevance: 30.00%

Abstract:

Hypercycles are information integration systems which are thought to overcome the information crisis of prebiotic evolution by ensuring the coexistence of several short templates. For imperfect template replication, we derive a simple expression for the maximum number of distinct templates n(m) that can coexist in a hypercycle and show that it is a decreasing function of the length L of the templates. In the case of high replication accuracy, we find that the product n(m)L tends to a constant value, thus limiting the information content of the hypercycle. Template coexistence is achieved either as a stationary equilibrium (stable fixed point) or as a stable periodic orbit in which the total concentration of functional templates is nonzero. For the hypercycle system studied here, we find numerical evidence that the existence of an unstable fixed point is a necessary condition for the presence of periodic orbits. (C) 2008 Elsevier Ltd. All rights reserved.
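A small numeric illustration of the error-threshold arithmetic underlying results of this kind, under the standard quasispecies assumption that an L-nucleotide template is copied without error with probability q**L (this is generic reasoning, not the paper's derivation; the viability threshold 0.5 is an arbitrary choice):

```python
import math

# With per-nucleotide replication accuracy q, the chance of an
# error-free copy of a length-L template is q**L. The longest
# template that keeps this above a viability threshold q_min
# scales like 1/(1 - q): higher accuracy buys longer templates.
def max_length(q, q_min=0.5):
    """Longest L with q**L >= q_min."""
    return math.floor(math.log(q_min) / math.log(q))

for q in (0.99, 0.999, 0.9999):
    L = max_length(q)
    assert q**L >= 0.5 > q**(L + 1)
    print(q, L)
```

Since every member of the hypercycle must stay viable, a fixed fidelity budget of this kind is one intuitive way to see why the product n(m)L ends up bounded, as the abstract states.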

Relevance: 30.00%

Abstract:

The coexistence between different types of templates has been the choice solution to the information crisis of prebiotic evolution, triggered by the finding that a single RNA-like template cannot carry enough information to code for any useful replicase. In principle, confining d distinct templates of length L in a package or protocell, whose survival depends on the coexistence of the templates it holds, could resolve this crisis provided that d is made sufficiently large. Here we review the prototypical package model of Niesert et al. [1981. Origin of life between Scylla and Charybdis. J. Mol. Evol. 17, 348-353], which guarantees the greatest possible region of viability of the protocell population, and show that this model, and hence the entire package approach, does not resolve the information crisis. In particular, we show that the total information stored in a viable protocell (Ld) tends to a constant value that depends only on the spontaneous error rate per nucleotide of the template replication mechanism. As a result, an increase of d must be followed by a decrease of L, so that the net information gain is null. (C) 2008 Elsevier Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

Burst firing is ubiquitous in nervous systems and has been intensively studied in central pattern generators (CPGs). Previous works have described subtle intraburst spike patterns (IBSPs) that, despite being traditionally neglected for their lack of relation to CPG motor function, were shown to be cell-type specific and sensitive to CPG connectivity. Here we address this matter by investigating how a bursting motor neuron expresses information about other neurons in the network. We performed experiments on the crustacean stomatogastric pyloric CPG, both in control conditions and interacting in real time with computer model neurons. The sensitivity of postsynaptic to presynaptic IBSPs was inferred by computing their average mutual information along each neuron's burst. We found that details of input patterns are nonlinearly and inhomogeneously coded through a single synapse into the fine IBSP structure of the postsynaptic neuron's following burst. In this way, motor neurons are able to use different time scales to convey two types of information simultaneously: muscle contraction (related to bursting rhythm) and the behavior of other CPG neurons (at a much shorter timescale, using IBSPs as information carriers). Moreover, the analysis revealed that this coding mechanism takes part in a previously unsuspected information pathway from a CPG motor neuron to a nerve that projects to sensory brain areas, thus providing evidence of a general physiological role of information coding through IBSPs in the regulation of neuronal firing patterns in remote circuits by the CNS.

Relevance: 30.00%

Abstract:

NMR quantum information processing studies rely on the reconstruction of the density matrix representing so-called pseudo-pure states (PPS). The initially pure part of a PPS undergoes unitary and non-unitary (relaxation) transformations during a computation, causing a "loss of purity" until equilibrium is reached. In addition, upon relaxation the nuclear polarization varies in time, a fact which must be taken into account when comparing density matrices at different instants. Attempting to use time-fixed normalization procedures when relaxation is present leads to various anomalies in the matrix populations. In this paper we propose a method that takes into account the time dependence of the normalization factor. From a generic form for the deviation density matrix, an expression for the relaxing initial pure state is deduced. The method is exemplified with an experiment on the relaxation of the concurrence of a pseudo-entangled state, which exhibits the phenomenon of sudden death, and the relaxation of the Wigner function of a pseudo-cat state.

Relevance: 30.00%

Abstract:

Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive, validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed with BACE-1, as well as five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial charge calculation method was also analyzed. Several consistent and predictive models are reported, including one with r^2 = 0.88, q^2 = 0.69 and r^2(pred) = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. Visual analysis of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.

Relevance: 30.00%

Abstract:

The design of translation-invariant and locally defined binary image operators over large windows is made difficult by decreased statistical precision and increased training time. We present a complete framework for the application of stacked design, a recently proposed technique to create two-stage operators that circumvents this difficulty. We propose a novel algorithm, based on information theory, to find groups of pixels that should be used together to predict the output value. We employ this algorithm to automate the process of creating a set of first-level operators that are later combined in a global operator. We also propose a principled way to guide this combination, using feature selection and model comparison. Experimental results show that the proposed framework leads to better results than single-stage design. (C) 2009 Elsevier B.V. All rights reserved.
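In the spirit of the information-theoretic grouping described above (though much simpler than the paper's algorithm), window pixels can be scored by their mutual information with the output value; the 3-pixel windows and training pairs below are synthetic assumptions:

```python
import math
from collections import Counter

# Score each window pixel by its mutual information with the
# desired output, estimated from training samples. This is an
# illustrative sketch, not the grouping algorithm of the paper.
def mutual_information(pairs):
    """I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Synthetic training data: (window, output). Pixel 0 copies the
# output, pixel 1 is its negation, pixel 2 is unrelated noise.
samples = [((1, 0, 0), 1), ((1, 0, 1), 1), ((0, 1, 0), 0), ((0, 1, 1), 0)]

for i in range(3):
    mi = mutual_information([(w[i], y) for w, y in samples])
    print(f"pixel {i}: {mi:.2f} bits")
```

Pixels 0 and 1 each carry a full bit about the output while pixel 2 carries none, so an informative first-level operator would be built on the former and ignore the latter.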

Relevance: 30.00%

Abstract:

We discuss the connection between information and copula theories by showing that a copula can be employed to decompose the information content of a multivariate distribution into marginal and dependence components, with the latter quantified by the mutual information. We define the information excess as a measure of deviation from a maximum-entropy distribution. The idea of marginal-invariant dependence measures is also discussed and used to show that the empirical linear correlation underestimates the amplitude of the actual correlation in the case of non-Gaussian marginals. The mutual information is shown to provide an upper bound for the asymptotic empirical log-likelihood of a copula. An analytical expression for the information excess of T-copulas is provided, allowing for simple model identification within this family. We illustrate the framework with a financial data set. Copyright (C) EPLA, 2009
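The claim about the empirical linear correlation can be checked numerically: a monotone transform of the marginals leaves the copula (and any rank-based, marginal-invariant measure) unchanged but depresses the Pearson coefficient. The sample size, seed, and lognormal transform below are arbitrary assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n, rho = 100_000, 0.9

# Bivariate Gaussian with correlation rho: a Gaussian copula with
# Gaussian marginals.
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Monotone transform to heavy-tailed (lognormal) marginals; the
# copula, and hence the dependence structure, is unchanged.
x, y = np.exp(3 * z1), np.exp(3 * z2)

def pearson(a, b):
    return float(np.corrcoef(a, b)[0, 1])

def spearman(a, b):
    ranks = lambda v: np.argsort(np.argsort(v))
    return pearson(ranks(a), ranks(b))

print(pearson(z1, z2))  # close to rho with Gaussian marginals
print(pearson(x, y))    # markedly smaller: Pearson understates the dependence
print(spearman(x, y))   # rank correlation is unaffected by the transform
```

The rank correlation is marginal-invariant in exactly the sense the abstract discusses, which is why it survives the transform while the linear coefficient does not.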