204 results for Digital systems
Abstract:
Background: High-density tiling arrays and new sequencing technologies are generating rapidly increasing volumes of transcriptome and protein-DNA interaction data. Visualization and exploration of this data is critical to understanding the regulatory logic encoded in the genome by which the cell dynamically affects its physiology and interacts with its environment. Results: The Gaggle Genome Browser is a cross-platform desktop program for interactively visualizing high-throughput data in the context of the genome. Important features include dynamic panning and zooming, keyword search and open interoperability through the Gaggle framework. Users may bookmark locations on the genome with descriptive annotations and share these bookmarks with other users. The program handles large sets of user-generated data using an in-process database and leverages the facilities of SQL and the R environment for importing and manipulating data. A key aspect of the Gaggle Genome Browser is interoperability. By connecting to the Gaggle framework, the genome browser joins a suite of interconnected bioinformatics tools for analysis and visualization with connectivity to major public repositories of sequences, interactions and pathways. To this flexible environment for exploring and combining data, the Gaggle Genome Browser adds the ability to visualize diverse types of data in relation to its coordinates on the genome. Conclusions: Genomic coordinates function as a common key by which disparate biological data types can be related to one another. In the Gaggle Genome Browser, heterogeneous data are joined by their location on the genome to create information-rich visualizations yielding insight into genome organization, transcription and its regulation and, ultimately, a better understanding of the mechanisms that enable the cell to dynamically respond to its environment.
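The central idea of the abstract, using genomic coordinates as a common key to join heterogeneous data, can be illustrated with a minimal sketch. The table and column names below are hypothetical, not the Gaggle Genome Browser's actual schema; the example only shows how an in-process SQL database (here SQLite, standing in for the browser's embedded database) can relate a tiling-array signal to annotated genes by position.

```python
import sqlite3

# In-process database, analogous in spirit to the embedded database the
# browser uses for user-generated data (the schema below is hypothetical).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE genes(name TEXT, strand TEXT, start INTEGER, stop INTEGER);
CREATE TABLE tiling_signal(position INTEGER, intensity REAL);
""")
con.executemany("INSERT INTO genes VALUES (?,?,?,?)",
                [("geneA", "+", 100, 900), ("geneB", "-", 1200, 2100)])
con.executemany("INSERT INTO tiling_signal VALUES (?,?)",
                [(150, 3.2), (400, 5.1), (1500, 0.7), (2000, 1.1)])

# Genomic coordinates act as the join key between heterogeneous data types:
rows = con.execute("""
    SELECT g.name, AVG(t.intensity) AS mean_signal
    FROM genes g
    JOIN tiling_signal t ON t.position BETWEEN g.start AND g.stop
    GROUP BY g.name
""").fetchall()
for name, mean_signal in rows:
    print(name, round(mean_signal, 2))
```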
Abstract:
Aims. We create a catalogue of simulated fossil groups and study their properties, in particular the merging histories of their first-ranked galaxies. We compare the simulated fossil group properties with those of both simulated non-fossil and observed fossil groups. Methods. Using simulations and a mock galaxy catalogue, we searched for massive (> 5 × 10^13 h^-1 M_⊙) fossil groups in the Millennium Simulation Galaxy Catalogue. In addition, we attempted to identify observed fossil groups in the Sloan Digital Sky Survey Data Release 6 using identical selection criteria. Results. Our predictions on the basis of the simulation data are: (a) fossil groups comprise about 5.5% of the total population of groups/clusters with masses larger than 5 × 10^13 h^-1 M_⊙. This fraction is consistent with the fraction of fossil groups identified in the SDSS, after all observational biases have been taken into account; (b) about 88% of the dominant central objects in fossil groups are elliptical galaxies that have a median R-band absolute magnitude of ~ -23.5 - 5 log h, which is typical of the observed fossil groups known in the literature; (c) first-ranked galaxies of systems with M > 5 × 10^13 h^-1 M_⊙, regardless of whether they are fossil or non-fossil, are mainly formed by gas-poor mergers; (d) although fossil groups, in general, assembled most of their virial masses at higher redshifts in comparison with non-fossil groups, first-ranked galaxies in fossil groups merged later, i.e. at lower redshifts, compared with their non-fossil-group counterparts. Conclusions. We therefore expect to observe a number of luminous galaxies in the centres of fossil groups that show signs of a recent major merger.
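A minimal sketch of how the selection described above might be applied to one group of a mock catalogue. The function signature and the data structures are hypothetical; only the two quoted criteria (halo mass above 5 × 10^13 h^-1 M_⊙ and a two-magnitude gap between the two brightest galaxies within half the virial radius) come from the abstract.

```python
import numpy as np

def is_fossil(halo_mass, member_mags, member_radii, r_vir,
              mass_cut=5e13, gap_cut=2.0):
    """Apply the fossil-group criteria quoted in the abstract to one group.

    halo_mass    : group mass in h^-1 Msun (hypothetical field)
    member_mags  : R-band absolute magnitudes of member galaxies
    member_radii : projected distances of members from the group centre
    r_vir        : virial radius of the group (same units as member_radii)
    """
    if halo_mass <= mass_cut:
        return False
    inner = np.asarray(member_mags)[np.asarray(member_radii) < 0.5 * r_vir]
    if inner.size < 2:
        return False
    brightest, second = np.sort(inner)[:2]   # magnitudes: lower = brighter
    return (second - brightest) >= gap_cut

# Toy example with invented numbers:
print(is_fossil(8e13, [-23.6, -21.2, -20.8], [0.1, 0.3, 0.6], r_vir=1.0))
```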
Abstract:
Context. Fossil systems are defined to be X-ray bright galaxy groups (or clusters) with a two-magnitude difference between their two brightest galaxies within half the projected virial radius, and represent an interesting extreme of the population of galaxy agglomerations. However, the physical conditions and processes leading to their formation are still poorly constrained. Aims. We compare the outskirts of fossil systems with those of normal groups to understand whether environmental conditions play a significant role in their formation. We study the groups of galaxies in both numerical simulations and observations. Methods. We use a variety of statistical tools, including the spatial cross-correlation function and the local density parameter Δ_5, to probe differences in the density and structure of the environments of "normal" and "fossil" systems in the Millennium simulation. Results. We find that the number density of galaxies surrounding fossil systems evolves from greater than that observed around normal systems at z = 0.69, to lower than that of normal systems by z = 0. Both fossil and normal systems exhibit an increment in their otherwise radially declining local density measure (Δ_5) at distances of order 2.5 r_vir from the system centre. We show that this increment is more noticeable for fossil systems than for normal systems and demonstrate that this difference is linked to the earlier formation epoch of fossil groups. Despite the importance of the assembly time, we show that the environment is different for fossil and non-fossil systems with similar masses and formation times along their evolution. We also confirm that the physical characteristics identified in the Millennium simulation can also be detected in SDSS observations. Conclusions. Our results confirm the commonly held belief that fossil systems assembled earlier than normal systems but also show that the surroundings of fossil groups could be responsible for the formation of their large magnitude gap.
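The local density parameter Δ_5 mentioned above is commonly built from the distance to a galaxy's fifth-nearest neighbour. The sketch below computes a projected fifth-nearest-neighbour surface density of that kind; it is a generic illustration, not necessarily the exact definition or normalisation used in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def sigma_5(positions):
    """Fifth-nearest-neighbour surface density for each galaxy.

    positions : (N, 2) array of projected coordinates (e.g. in Mpc).
    Returns Sigma_5 = 5 / (pi * d5^2); a density contrast Delta_5 can then be
    formed by dividing by the mean density (normalisation is illustrative).
    """
    tree = cKDTree(positions)
    # k=6 because the first neighbour returned is the galaxy itself (d = 0)
    d, _ = tree.query(positions, k=6)
    d5 = d[:, 5]
    return 5.0 / (np.pi * d5**2)

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(200, 2))   # toy field of 200 galaxies
print(sigma_5(pts)[:5])
```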
Abstract:
We present results from the PARallaxes of Southern Extremely Cool objects (PARSEC) program, an observational program begun in 2007 April to determine parallaxes for 122 L and 28 T southern-hemisphere dwarfs using the Wide Field Imager on the ESO 2.2 m telescope. The results presented here include parallaxes of 10 targets from observations over 18 months and a first version of the proper motion catalog. The proper motions were obtained by combining PARSEC observations, astrometrically reduced with respect to the Second US Naval Observatory CCD Astrograph Catalog, with the Two Micron All Sky Survey Point Source Catalog. The resulting median proper motion precision is 5 mas yr^-1 for 195,700 sources. The 140 fields of 0.3 deg^2 each sample the southern hemisphere in an unbiased fashion, with the exception of the Galactic plane, owing to the small number of targets in that region. The proper motion distributions are shown to be statistically well behaved, and external comparisons are fully consistent. We will continue to update this catalog until the end of the program, and we plan to improve it by also including observations from the GSC2.3 database. We present preliminary parallaxes with a median precision of 4.2 mas for 10 brown dwarfs, two of which are within 10 pc. These increase the present number of L dwarfs with published parallaxes by 20%. Of the 10 targets, seven have been previously discussed in the literature: two were thought to be binary, but the PARSEC observations show them to be single; one has been confirmed as a binary companion and another has been found to be part of a binary system, both of which will make good benchmark systems. These results confirm that the foreseen precision of PARSEC can be achieved and that the large field of view will allow us to identify wide binary systems. Observations for the PARSEC program will end in early 2011, providing three to four years of coverage for all targets. The main expected outputs are: a more than 100% increase in the number of L dwarfs with parallaxes; an increase, in conjunction with published results, of the number of objects per spectral subclass up to L9 to at least 10; and sensible limits on the general binary fraction of brown dwarfs. We aim to contribute significantly to the understanding of the faint end of the H-R diagram and of the L/T transition region.
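For orientation, the statements about parallax precision and the 10 pc distance limit follow from the standard parallax-distance relation (a textbook relation, not specific to this paper):

```latex
d\,[\mathrm{pc}] = \frac{1}{\pi\,[\mathrm{arcsec}]} = \frac{1000}{\pi\,[\mathrm{mas}]},
\qquad
\frac{\sigma_d}{d} \simeq \frac{\sigma_\pi}{\pi}.
```

An object within 10 pc therefore has a parallax larger than 100 mas, so a 4.2 mas median uncertainty translates into a distance error of only a few per cent for such nearby targets.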
Abstract:
Context. It was proposed earlier that the relativistic ejections observed in microquasars could be produced by violent magnetic reconnection episodes in the inner disk coronal region (de Gouveia Dal Pino & Lazarian 2005). Aims. Here we revisit this model, which employs a standard accretion disk description and fast magnetic reconnection theory, and discuss the role of magnetic reconnection and the associated heating and particle acceleration in different jet/disk accretion systems, namely young stellar objects (YSOs), microquasars, and active galactic nuclei (AGNs). Methods. In microquasars and AGNs, violent reconnection episodes between the magnetic field lines of the inner disk region and those anchored in the black hole are able to heat the coronal/disk gas and accelerate the plasma to relativistic velocities through a diffusive first-order Fermi-like process within the reconnection site, producing intermittent relativistic ejections or plasmons. Results. The resulting power-law electron distribution is compatible with the synchrotron radio spectrum observed during the outbursts of these sources. A diagram of the magnetic energy rate released by violent reconnection as a function of the black hole (BH) mass, spanning a factor of 10^9 in mass, shows that the magnetic reconnection power is more than sufficient to explain the observed radio luminosities of the outbursts, from microquasars to low-luminosity AGNs. In addition, the magnetic reconnection events heat the coronal gas, and this heat can be conducted back to the disk to enhance its thermal soft X-ray emission, as observed during outbursts in microquasars. The decay of the hard X-ray emission right after a radio flare could also be explained in this model by the escape of relativistic electrons with the evolving jet outburst. In the case of YSOs, a similar magnetic configuration can be reached that could possibly produce the X-ray flares observed in some sources and provide the heating at the jet launching base, but only if violent magnetic reconnection events occur during episodic, very short-duration accretion events with rates ~100-1000 times larger than the typical average accretion rates expected for more evolved (T Tauri) YSOs.
Abstract:
In geophysics and seismology, raw data need to be processed to generate useful information that can be turned into knowledge by researchers. The number of sensors that are acquiring raw data is increasing rapidly. Without good data management systems, more time can be spent in querying and preparing datasets for analyses than in acquiring raw data. Also, a lot of good quality data acquired at great effort can be lost forever if they are not correctly stored. Local and international cooperation will probably be reduced, and a lot of data will never become scientific knowledge. For this reason, the Seismological Laboratory of the Institute of Astronomy, Geophysics and Atmospheric Sciences at the University of Sao Paulo (IAG-USP) has concentrated fully on its data management system. This report describes the efforts of the IAG-USP to set up a seismology data management system to facilitate local and international cooperation.
Abstract:
We report the discovery of seven new, very bright gravitational lens systems from our ongoing gravitational lens search, the Sloan Bright Arcs Survey (SBAS). Two of the systems are confirmed to have high source redshifts z = 2.19 and z = 2.94. Three other systems lie at intermediate redshift with z = 1.33, 1.82, 1.93 and two systems are at low redshift z = 0.66, 0.86. The lensed source galaxies in all of these systems are bright, with i-band magnitudes ranging from 19.73 to 22.06. We present the spectrum of each of the source galaxies in these systems along with estimates of the Einstein radius for each system. The foreground lens in most systems is identified by a red sequence based cluster finder as a galaxy group; one system is identified as a moderately rich cluster. In total, SBAS has now discovered 19 strong lens systems in the SDSS imaging data, 8 of which are among the highest surface brightness z ≃ 2-3 galaxies known.
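The Einstein radius estimates mentioned above are conventionally defined by the standard lensing expressions, quoted here for context rather than from the paper itself. For a point-mass lens of mass M, with angular-diameter distances D_l, D_s, and D_ls to the lens, to the source, and between them,

```latex
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{ls}}{D_l D_s}},
\qquad\text{and for a singular isothermal sphere}\qquad
\theta_E = 4\pi\,\frac{\sigma_v^{2}}{c^{2}}\,\frac{D_{ls}}{D_s},
```

where σ_v is the velocity dispersion of the lensing group or cluster.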
Abstract:
Several experimental studies have altered the phase relationship between photic and non-photic environmental 24 h cycles (zeitgebers) in order to assess their role in the synchronization of circadian rhythms. To assist in the interpretation of the complex activity patterns that emerge from these "conflicting zeitgeber" protocols, we present computer simulations of coupled circadian oscillators forced by two independent zeitgebers. This circadian system configuration was first employed by Pittendrigh and Bruce (1959) to model their studies of the light and temperature entrainment of the eclosion oscillator in Drosophila. Whereas most of the recent experiments have restricted conflicting zeitgeber experiments to two experimental conditions, comparing circadian oscillator phases under two distinct phase relationships between zeitgebers (usually 0 and 12 h), Pittendrigh and Bruce compared eclosion phase under 12 distinct phase relationships spanning the 24 h interval. Our simulations using non-linear differential equations replicated complex non-linear phenomena, such as "phase jumps" and sudden switches in zeitgeber preference, which had previously been difficult to interpret. Our simulations reveal that these phenomena generally arise when inter-oscillator coupling is high in relation to the zeitgeber strength. Manipulations of the structural symmetry of the model indicated that these results can be expected to apply to a wide range of system configurations. Finally, our studies recommend the use of the complete protocol employed by Pittendrigh and Bruce, because different system configurations can generate similar results when a "conflicting zeitgeber" experiment incorporates only two phase relationships between zeitgebers.
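As a schematic illustration of the class of model described, and not the authors' actual equations, two coupled phase oscillators driven by two independent 24 h zeitgebers with an adjustable phase relationship can be integrated as below; the free-running periods, coupling constant, and forcing strengths are invented for the example.

```python
import numpy as np
from scipy.integrate import solve_ivp

def model(t, phi, tau1=23.5, tau2=24.5, K=0.3, F1=0.2, F2=0.2, psi=12.0):
    """Two coupled circadian phase oscillators (radians), each forced by one
    24 h zeitgeber; psi is the phase relationship (hours) between the two
    zeitgebers. All parameter values are illustrative only."""
    p1, p2 = phi
    z1 = 2 * np.pi * t / 24.0               # zeitgeber 1 phase
    z2 = 2 * np.pi * (t - psi) / 24.0       # zeitgeber 2, shifted by psi hours
    dp1 = 2 * np.pi / tau1 + K * np.sin(p2 - p1) + F1 * np.sin(z1 - p1)
    dp2 = 2 * np.pi / tau2 + K * np.sin(p1 - p2) + F2 * np.sin(z2 - p2)
    return [dp1, dp2]

sol = solve_ivp(model, (0, 24 * 60), [0.0, 0.0], max_step=0.1)  # 60 days
# Entrained phase of oscillator 1 relative to zeitgeber 1, last few samples:
rel = (sol.y[0] - 2 * np.pi * sol.t / 24.0) % (2 * np.pi)
print(rel[-5:])
```

Scanning psi over the full 0-24 h range, and varying K relative to F1 and F2, reproduces qualitatively the regimes in which zeitgeber-preference switches and phase jumps of the kind discussed above appear.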
Abstract:
We study the existence of positive solutions of Hamiltonian-type systems of second-order elliptic PDE in the whole space. The systems depend on a small parameter and involve a potential having a global well structure. We use dual variational methods, a mountain-pass type approach and Fourier analysis to prove positive solutions exist for sufficiently small values of the parameter.
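A typical Hamiltonian-type elliptic system of the class described, written here in a generic form for orientation (the paper's precise nonlinearities f, g and hypotheses on the potential V may differ), is

```latex
\begin{cases}
-\varepsilon^{2}\Delta u + V(x)\,u = g(v), & x \in \mathbb{R}^{N},\\[2pt]
-\varepsilon^{2}\Delta v + V(x)\,v = f(u), & x \in \mathbb{R}^{N},\\[2pt]
u(x),\, v(x) \to 0 \quad \text{as } |x| \to \infty,
\end{cases}
```

where ε > 0 is the small parameter and V has a global well structure, i.e. its global minimum is attained on a bounded region.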
Abstract:
This paper studies semistability of the recursive Kalman filter in the context of linear time-varying (LTV), possibly nondetectable systems with incorrect noise information. Semistability is a key property, as it ensures that the actual estimation error does not diverge exponentially. We explore structural properties of the filter to obtain a necessary and sufficient condition for the filter to be semistable. The condition does not involve limiting gains nor the solution of Riccati equations, as they can be difficult to obtain numerically and may not exist. We also compare semistability with the notions of stability and stability w.r.t. the initial error covariance, and we show that semistability in a sense makes no distinction between persistent and nonpersistent incorrect noise models, as opposed to stability. In the linear time invariant scenario we obtain algebraic, easy to test conditions for semistability and stability, which complement results available in the context of detectable systems. Illustrative examples are included.
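To make the setting concrete, the sketch below iterates the standard recursive Kalman filter and propagates the actual estimation-error covariance when the filter is designed with incorrect noise covariances while the data are generated with the true ones. The matrices and the final check are illustrative assumptions; this is not the paper's semistability condition.

```python
import numpy as np

def filter_gain(P, A, C, Qf, Rf):
    """One step of the Kalman recursion under the *assumed* noise (Qf, Rf)."""
    S = C @ P @ C.T + Rf
    K = A @ P @ C.T @ np.linalg.inv(S)
    P_next = A @ P @ A.T - K @ C @ P @ A.T + Qf
    return K, P_next

def actual_error_cov(Sigma, K, A, C, Q, R):
    """Actual error covariance under the true noise (Q, R) when gain K is used."""
    F = A - K @ C
    return F @ Sigma @ F.T + Q + K @ R @ K.T

# Illustrative time-invariant example (stands in for an LTV sequence A_k, C_k).
A = np.array([[1.0, 0.1], [0.0, 1.0]])
C = np.array([[1.0, 0.0]])
Q_true, R_true = 0.01 * np.eye(2), np.array([[0.1]])
Q_assumed, R_assumed = 0.5 * np.eye(2), np.array([[1.0]])   # incorrect noise model

P = Sigma = np.eye(2)
for k in range(200):
    K, P = filter_gain(P, A, C, Q_assumed, R_assumed)
    Sigma = actual_error_cov(Sigma, K, A, C, Q_true, R_true)

print(np.trace(Sigma))   # semistability rules out exponential growth of this quantity
```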
Abstract:
This paper studies a nonlinear, discrete-time matrix system arising in the stability analysis of Kalman filters. These systems present an internal coupling between the state components that gives rise to complex dynamic behavior. The problem of partial stability, which requires that a specific component of the state of the system converge exponentially, is studied and solved. The convergent state component is strongly linked with the behavior of Kalman filters, since it can be used to provide bounds for the error covariance matrix under uncertainties in the noise measurements. We exploit the special features of the system, mainly the connections with linear systems, to obtain an algebraic test for partial stability. Finally, motivated by applications in which polynomial divergence of the estimates is acceptable, we study and solve a partial semistability problem.
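The matrix recursion in question is of Riccati type. For reference, the standard Riccati difference equation associated with the Kalman filter, written in a generic time-invariant form (the paper's coupled, nonlinear variant generalises this), reads

```latex
X_{k+1} = A X_k A^{\top}
- A X_k C^{\top}\left(C X_k C^{\top} + R\right)^{-1} C X_k A^{\top} + Q .
```

Partial stability then concerns the exponential convergence of a specific component of the state of such a recursion, which in turn yields bounds on the filter's error covariance.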
Abstract:
This article evaluates social implications of the "SIGA" Health Care Information System (HIS) in a public health care organization in the city of Sao Paulo. The evaluation was performed by means of an in-depth case study with patients and staff of a public health care organization, using qualitative and quantitative data. On the one hand, the system had consequences perceived as positive, such as improved convenience and democratization of specialized treatment for patients and improvements in work organization. On the other hand, negative outcomes were reported, such as difficulties faced by employees due to little familiarity with IT and an increase in the time needed to schedule appointments. Results show the ambiguity of the implications of HIS in developing countries, emphasizing the need for a more nuanced view of the evaluation of failures and successes and the importance of social contextual factors.
Abstract:
We analyze the irreversibility and the entropy production in nonequilibrium interacting particle systems described by a Fokker-Planck equation through the use of a suitable master equation representation. The irreversible character is provided either by nonconservative forces or by contact with heat baths at distinct temperatures. The expression for the entropy production is deduced from a general definition, related to the probability of a trajectory in phase space and its time reversal, that makes no a priori reference to the dissipated power. Our formalism is applied to calculate the heat conductance in a simple system consisting of two Brownian particles, each in contact with a heat reservoir. We also show the connection between the definition of the entropy production rate and the Jarzynski equality.
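In the master-equation representation mentioned above, entropy production is usually written in the standard Schnakenberg-type form; the expression below is the textbook version, shown for orientation rather than quoted from the paper:

```latex
\Pi(t) = \frac{1}{2} \sum_{i,j}
\left[ W_{ij} P_j(t) - W_{ji} P_i(t) \right]
\ln \frac{W_{ij} P_j(t)}{W_{ji} P_i(t)} \;\ge\; 0,
```

where W_ij is the transition rate from state j to state i and P_i(t) the probability of state i. The Jarzynski equality referred to at the end is, in its standard form, ⟨e^{-βW}⟩ = e^{-βΔF}.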
Abstract:
The structure of probability currents is studied for the dynamical network after consecutive contraction on two-state, nonequilibrium lattice systems. This procedure allows us to investigate the transition rates between configurations on small clusters and highlights some relevant effects of lattice symmetries on the elementary transitions that are responsible for entropy production. A method is suggested to estimate the entropy production for different levels of approximations (cluster sizes) as demonstrated in the two-dimensional contact process with mutation.
Abstract:
We show a function that fits well the probability density of return times between two consecutive visits of a chaotic trajectory to finite-size regions in phase space. It deviates from exponential statistics by a small power-law term, a term that represents the deterministic manifestation of the dynamics. We also show how one can quickly and easily estimate the Kolmogorov-Sinai entropy and the short-term correlation function from observations of highly probable returns. Our analyses are performed numerically in the Hénon map and experimentally in a Chua's circuit. Finally, we discuss how our approach can be used to treat data coming from experimental complex systems and in technological applications. © 2009 American Institute of Physics. [doi: 10.1063/1.3263943]
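A minimal numerical sketch of the kind of return-time measurement described: it records the times between consecutive visits of a Hénon-map trajectory to a small box in phase space. The box placement and size are arbitrary choices for illustration, and no claim is made about the fitting function used in the paper.

```python
import numpy as np

def henon_orbit(n, a=1.4, b=0.3, x0=0.1, y0=0.1, transient=1000):
    """Iterate the Henon map x' = 1 - a*x^2 + y, y' = b*x; drop a transient."""
    x, y = x0, y0
    pts = np.empty((n, 2))
    for i in range(transient + n):
        x, y = 1.0 - a * x * x + y, b * x
        if i >= transient:
            pts[i - transient] = (x, y)
    return pts

def return_times(orbit, centre, eps):
    """Times between consecutive visits to the box |x-cx| < eps, |y-cy| < eps."""
    inside = np.all(np.abs(orbit - centre) < eps, axis=1)
    visits = np.flatnonzero(inside)
    return np.diff(visits)

orbit = henon_orbit(200_000)
centre = orbit[1_000]            # centre the box on a point of the attractor
taus = return_times(orbit, centre, eps=0.05)
print(len(taus), taus.mean() if len(taus) else float("nan"))
```

A histogram of `taus` gives the empirical return-time density, which can then be compared against a pure exponential to expose the small deterministic correction discussed above.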