993 results for parameter-space graph


Relevance: 30.00%

Abstract:

Many efforts are currently oriented toward extracting more information from ocean color than the chlorophyll a concentration. Among the biological parameters potentially accessible from space, estimates of phytoplankton cell size and of light absorption by colored detrital matter (CDM) would lead to an indirect assessment of major components of the organic carbon pool in the ocean, which would benefit oceanic carbon budget models. We present here 2 procedures that simultaneously retrieve, from ocean color measurements in a limited number of bands, the magnitudes and spectral shapes of light absorption by both CDM and phytoplankton, along with a size parameter for phytoplankton. The performance of the 2 procedures was evaluated using different data sets that correspond to increasing uncertainties: (1) measured absorption coefficients of phytoplankton, particulate detritus, and colored dissolved organic matter (CDOM), together with measured chlorophyll a concentrations, and (2) SeaWiFS upwelling radiance measurements and chlorophyll a concentrations estimated from global algorithms. In situ data were acquired during 3 cruises over a continental shelf off Brazil, differing in their relative proportions of CDM and phytoplankton. No local information was introduced in either procedure, to keep them generally applicable. Over the study area, the absorption coefficient of CDM at 443 nm was retrieved from SeaWiFS radiances with a relative root mean square error (RMSE) of 33%, and phytoplankton light absorption coefficients in the SeaWiFS bands (from 412 to 510 nm) were retrieved with RMSEs between 28% and 33%. These results are comparable to or better than those obtained by 3 published models. In addition, a size parameter of phytoplankton and the spectral slope of CDM absorption were retrieved with RMSEs of 17% and 22%, respectively. If these methods are applied at a regional scale, their performance could be substantially improved by locally tuning some empirical relationships.
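The skill metric quoted above is a relative root-mean-square error. A minimal sketch of one common definition follows; the sample values are hypothetical and the paper's exact formulation may differ.

```python
import math

def relative_rmse_percent(retrieved, measured):
    """Relative root-mean-square error in %, a common skill metric for
    ocean-color retrievals (the paper's exact formulation may differ)."""
    errs = [(r - m) / m for r, m in zip(retrieved, measured)]
    return 100.0 * math.sqrt(sum(e * e for e in errs) / len(errs))

# hypothetical CDM absorption coefficients at 443 nm (per meter)
measured = [0.050, 0.080, 0.120, 0.030]
retrieved = [0.060, 0.070, 0.150, 0.028]
print(round(relative_rmse_percent(retrieved, measured), 1))   # → 17.5
```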

Relevance: 30.00%

Abstract:

This thesis addresses Zipf's law from both an applied and a theoretical point of view. This empirical law states that the rank-frequency (RF) distribution of the words of a text follows a power law with exponent -1. On the theoretical side, we treated two classes of models capable of producing power laws in their probability distributions. In particular, we considered generalizations of Pólya urns and Sample Space Reducing (SSR) processes. For the latter we gave a formalization in terms of Markov chains. Finally, we proposed a population-dynamics model capable of unifying and reproducing the results of the three SSR processes found in the literature. We then moved to a quantitative analysis of the RF behavior of the words of a text corpus. In this case one observes that the RF does not follow a pure power law but shows a double regime, which can be represented by a power law that changes exponent. We investigated whether the analysis of the RF behavior could be linked to the topological properties of a graph. In particular, starting from a text corpus we built an adjacency network in which each word is linked to the word that follows it. A topological analysis of the graph structure yielded results that seem to confirm the hypothesis that its structure is related to the change of slope of the RF. This result may lead to developments in the study of language and the human mind. Moreover, since the graph structure appears to contain components that group words by meaning, a deeper study could lead to developments in automatic text comprehension (text mining).
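The SSR mechanism mentioned above is easy to simulate and reproduces Zipf's exponent -1 directly: a cascade visits state k with probability 1/k. A minimal sketch (parameters are illustrative):

```python
import random
from collections import Counter

def ssr_visits(n_states=100, restarts=20000, seed=1):
    """Simulate a Sample Space Reducing (SSR) process: from state i the
    walker jumps uniformly to one of {1, ..., i-1}; upon reaching state 1
    it restarts at n_states. Visit frequencies follow Zipf's law p(k) ~ 1/k."""
    rng = random.Random(seed)
    visits = Counter()
    for _ in range(restarts):
        i = n_states
        while i > 1:
            i = rng.randint(1, i - 1)
            visits[i] += 1
    return visits

v = ssr_visits()
# state k is visited in a fraction ~1/k of the cascades: compare k = 1, 2, 10
print(v[1], v[2], v[10])
```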

Relevance: 30.00%

Abstract:

Poset associahedra are a family of convex polytopes recently introduced by Pavel Galashin in 2021. The associahedron A_n is an (n-2)-dimensional convex polytope whose facial structure encodes the ways of parenthesizing an n-letter word (among several equivalent combinatorial objects). Associahedra are extensively studied polytopes that appear naturally in many areas of mathematics: algebra, combinatorics, geometry, topology, and more. They have many presentations and generalizations. One of their incarnations is as a compactification of the configuration space of n points on a line. Similarly, the P-associahedron of a poset P is a compactification of the configuration space of order-preserving maps from P to R. Galashin presents poset associahedra as combinatorial objects and shows that they can be realized as convex polytopes. However, his proof is not constructive, in the sense that no explicit coordinates are provided. The main goal of this thesis is to provide an explicit construction of poset associahedra as sections of graph associahedra, thus solving the open problem stated in Remark 1.5 of Galashin's paper.
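As a concrete handle on the combinatorics: the vertices of the associahedron A_n are the complete parenthesizations of an n-letter word, counted by the Catalan numbers. A small sketch of the standard counting recursion:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def parenthesizations(n):
    """Number of complete parenthesizations of an n-letter word, i.e. the
    number of vertices of the associahedron A_n (the Catalan number C_{n-1}),
    computed by splitting the word at the outermost product."""
    if n <= 2:
        return 1
    return sum(parenthesizations(k) * parenthesizations(n - k)
               for k in range(1, n))

print([parenthesizations(n) for n in range(2, 8)])   # → [1, 2, 5, 14, 42, 132]
```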

Relevance: 30.00%

Abstract:

The objective of this thesis is the study of the user scheduling problem in a Low Earth Orbit (LEO) Multi-User MIMO system. With the application of cutting-edge digital beamforming algorithms, a LEO satellite carrying an antenna array with a large number of antenna elements can serve many user terminals (UTs) in full frequency reuse (FFR) schemes. Since the UTs on the ground far outnumber the transmit antennas on the satellite, user scheduling is necessary. Scheduling can be accomplished by grouping users into clusters: users within the same cluster are multiplexed and served together via Space Division Multiple Access (SDMA), i.e., digital beamforming or Multi-User MIMO techniques, while the different clusters are served in different time slots via Time Division Multiple Access (TDMA). The design of an optimal user grouping strategy is known to be an NP-complete problem, solvable only through exhaustive search. In this thesis, we provide a graph-based user scheduling and feed space beamforming architecture for the downlink, with the aim of reducing inter-beam interference among users. The main idea is to cluster users whose pairwise great-circle distance is as large as possible. First, we create a graph where the users are the vertices, and an edge between 2 users exists if their great-circle distance is above a certain threshold. In the second step, we develop a low-complexity greedy user clustering technique that iteratively searches for the maximum clique in the graph, i.e., its largest fully connected subgraph. Finally, using 3 power normalization techniques, a Minimum Mean Square Error (MMSE) beamforming matrix is deployed on a cluster basis. The suggested scheduling system is compared with a position-based scheduler, which generates a beam lattice on the ground and randomly selects one user per beam to form a cluster.
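The two-step construction (distance-threshold graph, then greedy clique extraction) can be sketched as follows. The positions, threshold, and tie-breaking rules are illustrative, not the thesis' exact algorithm:

```python
import math

def great_circle(u, v, radius_km=6371.0):
    """Great-circle (haversine) distance in km between (lat, lon) points in degrees."""
    p1, l1, p2, l2 = map(math.radians, (*u, *v))
    h = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin((l2 - l1) / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

def greedy_clusters(users, threshold_km):
    """Greedy sketch of the two-step scheduler: link users farther apart
    than the threshold, then repeatedly grow a clique (one SDMA cluster);
    the clusters are then served in successive TDMA slots."""
    ids = list(users)
    adj = {i: {j for j in ids if j != i and
               great_circle(users[i], users[j]) > threshold_km} for i in ids}
    remaining, clusters = set(ids), []
    while remaining:
        # seed with the highest-degree remaining user, then extend greedily
        clique = [max(remaining, key=lambda i: len(adj[i] & remaining))]
        for cand in sorted(remaining - set(clique)):
            if all(cand in adj[m] for m in clique):
                clique.append(cand)
        clusters.append(clique)
        remaining -= set(clique)
    return clusters

# hypothetical UT positions (lat, lon): users 0 and 1 are close together
uts = {0: (45.0, 8.0), 1: (45.1, 8.1), 2: (48.0, 11.0), 3: (51.0, 3.0)}
print(greedy_clusters(uts, threshold_km=100.0))
```

Users 0 and 1 are only a few kilometers apart, so no edge connects them and the greedy pass never schedules them in the same SDMA cluster.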

Relevance: 20.00%

Abstract:

This study evaluated the sealing ability of different lengths of remaining root canal filling and post space preparation against coronal leakage of Enterococcus faecalis. Forty-one roots of maxillary incisors were biomechanically prepared, maintaining a standardized canal diameter at the middle and coronal thirds. The roots were autoclaved and all subsequent steps were undertaken in a laminar flow chamber. The canals of 33 roots were obturated with AH Plus sealer and gutta-percha. The root canal fillings were reduced to 3 predetermined lengths (n=11): G1=6 mm, G2=4 mm and G3=2 mm. The remaining roots served as positive and negative controls. Bacterial leakage test apparatuses were fabricated with the roots attached to Eppendorf tubes, keeping 2 mm of the apex submerged in BHI broth in glass flasks. The specimens received an E. faecalis inoculum of 1 × 10^7 CFU/mL every 3 days and were observed daily for bacterial leakage during 60 days. Data were submitted to ANOVA, Tukey's test and Fisher's test. At 60 days, G1 (6 mm) and G2 (4 mm) presented statistically similar results (p>0.05) (54.4% of specimens with bacterial leakage), and both groups differed significantly (p<0.01) from G3 (2 mm), which presented E. faecalis leakage in 100% of specimens. It may be concluded that the shortest endodontic obturation remnant leaked considerably more than the other lengths, although none of the tested conditions prevented coronal leakage of E. faecalis.
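The group comparison relies on Fisher's exact test, whose 2x2 version is short enough to sketch from scratch. The table below is a hypothetical reading of the 2 mm versus 6 mm leakage counts, not the paper's exact data:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """One-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    the probability of observing a top-left count at least as large as `a`
    under the hypergeometric null of no association."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    return sum(comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
               for x in range(a, min(row1, col1) + 1))

# hypothetical 2x2 table: 11/11 roots leaked in one group vs 6/11 in another
print(round(fisher_exact_p(11, 0, 6, 5), 3))   # → 0.018
```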

Relevance: 20.00%

Abstract:

OBJECTIVE: To analyze the diagnostic accuracy of two indirect immunofluorescence protocols for canine visceral leishmaniasis. METHODS: Dogs from a seroepidemiological survey conducted in 2003 in an endemic area in the municipalities of Araçatuba and Andradina, in the northwestern region of the state of São Paulo, and in a non-endemic area of the metropolitan region of São Paulo, were used to compare two protocols of the indirect immunofluorescence reaction (RIFI) for leishmaniasis: one using the heterologous antigen Leishmania major (RIFI-BM) and the other using the homologous antigen Leishmania chagasi (RIFI-CH). Accuracy was estimated with two-graph receiver operating characteristic (TG-ROC) analysis. The TG-ROC analysis compared the readings at the 1:20 dilution of the homologous antigen (RIFI-CH), taken as the reference test, with the dilutions of RIFI-BM (heterologous antigen). RESULTS: The 1:20 dilution of the RIFI-CH test showed the best contingency coefficient (0.755) and the strongest association between the two variables studied (chi-square = 124.3), and was therefore taken as the reference dilution in the comparisons with the different dilutions of the RIFI-BM test. The best RIFI-BM results were obtained at the 1:40 dilution, with the best contingency coefficient (0.680) and the strongest association (chi-square = 80.8). With the change of cutoff suggested by this analysis for the 1:40 dilution of RIFI-BM, specificity increased from 57.5% to 97.7%, although the 1:80 dilution gave the best sensitivity estimate (80.2%) with the new cutoff. CONCLUSIONS: TG-ROC analysis can provide important information about diagnostic tests, suggest cutoff points that may improve the sensitivity and specificity estimates of a test, and support evaluating them in terms of cost-effectiveness.
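The idea behind TG-ROC is to plot sensitivity and specificity as two curves against the test cutoff and read off an operating point from where they cross or meet a target. A minimal sketch with hypothetical titers (not the survey's data):

```python
def sens_spec(scores_pos, scores_neg, cutoff):
    """Sensitivity and specificity of a serological test at a given cutoff
    (a sample is called reactive when its score is >= cutoff). TG-ROC plots
    both quantities against the cutoff to choose an operating point."""
    sens = sum(s >= cutoff for s in scores_pos) / len(scores_pos)
    spec = sum(s < cutoff for s in scores_neg) / len(scores_neg)
    return sens, spec

# hypothetical reciprocal titers for truly infected and uninfected dogs
pos = [40, 80, 80, 160, 320, 20, 160]
neg = [10, 20, 20, 40, 10, 20, 10]
for cutoff in (20, 40, 80):
    print(cutoff, sens_spec(pos, neg, cutoff))
```

Raising the cutoff trades sensitivity for specificity, which is exactly the trade-off the abstract reports when moving the RIFI-BM cutoff.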

Relevance: 20.00%

Abstract:

Using series solutions and time-domain evolutions, we probe the eikonal limit of the gravitational and scalar-field quasinormal modes of large black holes and black branes in anti-de Sitter backgrounds. These results are particularly relevant for the AdS/CFT correspondence, since the eikonal regime is characterized by the existence of long-lived modes which (presumably) dominate the decay time scale of the perturbations. We confirm all the main qualitative features of these slowly damped modes as predicted by Festuccia and Liu [G. Festuccia and H. Liu, arXiv:0811.1033] for the scalar-field (tensor-type gravitational) fluctuations. However, quantitatively we find dimension-dependent correction factors. We also investigate the dependence of the quasinormal mode frequencies on the horizon radius of the black hole (brane) and on the angular momentum (wave number) of vector- and scalar-type gravitational perturbations.

Relevance: 20.00%

Abstract:

The synchronizing properties of two diffusively coupled hyperchaotic Lorenz 4D systems are investigated by calculating the transverse Lyapunov exponents and by observing the phase space trajectories near the synchronization hyperplane. The effect of parameter mismatch is also observed. A simple electrical circuit described by the Lorenz 4D equations is proposed. Some results from laboratory experiments with two coupled circuits are presented. Copyright (C) 2009 Ruy Barboza.
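The diffusive-coupling setup can be illustrated with the classic 3D Lorenz system. The paper studies a hyperchaotic 4D variant realized as an electrical circuit; this all-variable coupling with an illustrative gain k is only a sketch of the synchronization mechanism, in which the transverse error between the two trajectories decays once the coupling outweighs the largest Lyapunov exponent:

```python
def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the classic 3D Lorenz system."""
    x, y, z = s
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def coupled_step(a, b, k, dt):
    """One Euler step of two diffusively coupled copies: the term
    k*(b - a) pulls each trajectory toward the other."""
    da, db = lorenz(a), lorenz(b)
    a2 = [a[i] + dt * (da[i] + k * (b[i] - a[i])) for i in range(3)]
    b2 = [b[i] + dt * (db[i] + k * (a[i] - b[i])) for i in range(3)]
    return a2, b2

a, b = [1.0, 1.0, 1.0], [1.5, 0.5, 1.2]
for _ in range(50000):                    # 50 time units at dt = 0.001
    a, b = coupled_step(a, b, k=3.0, dt=0.001)
sync_error = max(abs(a[i] - b[i]) for i in range(3))
print(sync_error < 1e-6)                  # the transverse error has collapsed
```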

Relevance: 20.00%

Abstract:

Consider N sites randomly and uniformly distributed in a d-dimensional hypercube. A walker explores this disordered medium by going to the nearest site that has not been visited in the last mu (memory) steps. The walker trajectory is composed of a transient part and a periodic part (cycle). For one-dimensional systems, the walker may or may not explore all of the available space, giving rise to a crossover between localized and extended regimes at the critical memory mu_1 = log_2 N. The deterministic rule can be softened to consider more realistic situations by including a stochastic parameter T (temperature). In this case, the walker movement is driven by a probability density function parameterized by T and by a cost function. The cost function increases with the distance between two sites and favors hops to closer sites. As the temperature increases, the walker can escape from cycles that are reminiscent of the deterministic nature and extend the exploration. Here, we report an analytical model and numerical studies of the influence of the temperature and of the critical memory on the exploration of one-dimensional disordered systems.
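For concreteness, the deterministic (zero-temperature) rule can be sketched as follows. With mu = 1 the walker always falls into a 2-cycle between mutually nearest neighbors, the localized regime described above; the point set and parameters are illustrative:

```python
import random

def tourist_cycle(points, start=0, mu=1, max_steps=1000):
    """Deterministic tourist walk on 1D points: hop to the nearest site
    not visited within the last mu steps (the current site is always
    excluded). Returns the period of the cycle the walker falls into."""
    traj = [start]
    seen = {}
    while len(traj) < max_steps:
        state = tuple(traj[-max(mu, 1):])        # determines the next hop
        if state in seen:
            return len(traj) - 1 - seen[state]   # period of the cycle
        seen[state] = len(traj) - 1
        forbidden = (set(traj[-mu:]) if mu > 0 else set()) | {traj[-1]}
        cur = points[traj[-1]]
        traj.append(min((i for i in range(len(points)) if i not in forbidden),
                        key=lambda i: abs(points[i] - cur)))
    return None

rng = random.Random(0)
pts = sorted(rng.random() for _ in range(50))
print(tourist_cycle(pts, mu=1))   # → 2: a pair of mutually nearest neighbors
```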

Relevance: 20.00%

Abstract:

Several models of time estimation have been developed in psychology; a few have been applied to music. In the present study, we assess the influence of the distances travelled through pitch space on retrospective time estimation. Participants listened to an isochronous chord sequence of 20-s duration. They were unexpectedly asked to reproduce the time interval of the sequence. The harmonic structure of the stimulus was manipulated so that the sequence either remained in the same key (CC) or travelled through a closely related key (CFC) or a distant key (CGbC). Estimated times were shortened when the sequence modulated to a very distant key. This finding is discussed in light of Lerdahl's Tonal Pitch Space theory (2001), Firmino and Bueno's Expected Development Fraction Model (in press), and models of time estimation.

Relevance: 20.00%

Abstract:

Certain areas of the city of São Paulo, like many others around the world, including in Lisbon, Barcelona and Buenos Aires, have been going through a process of requalification, especially the areas commonly known as the old and/or traditional city. In São Paulo, some exceptional actions have been taken downtown, with investments in the rehabilitation/requalification of areas that concentrate the historical, urbanistic and cultural heritage, such as Praça da Sé and its cathedral, as well as revaluation/rehabilitation projects for other squares, like Praça da República, and for other areas, such as the one previously called Cracolândia (due to the high consumption/dealing of crack), known today as Nova Luz, besides propositions to re-evaluate areas already modified, such as Vale do Anhangabaú. In all propositions to modify these sites, their deterioration, litter and the presence of low-income populations (passers-by, street vendors or residents), generally stigmatized as "potential suspects", are underlined first, emphasizing the danger and lack of security of those places. This belief, which has become consensual, results in public as well as private companies promoting the rehabilitation of these areas on the grounds of the need to add value to the existing urban heritage, although, as discussed in this paper, part of this heritage might be destroyed in this very process, under the allegation that, upon completion, the action would allow the social, cultural and economic revaluation/requalification of the area. This paper is intended as a contribution to this discussion.

Relevance: 20.00%

Abstract:

Much of the social science literature about South African cities fails to represent their complex spectrum of sexual practices and associated identities. The unintended effect of such representations is that a compulsory heterosexuality is naturalised in, and reiterative with, dominant constructions of blackness in townships. In this paper, we argue that the assertion of discreet lesbian and gay identities in the black townships of a South African city such as Cape Town is influenced by the historical racial and socio-economic divides that have marked the urban landscape. In their efforts to recoup a positive sense of gendered personhood, residents have constructed a moral economy anchored in reproductive heterosexuality. We draw upon ethnographic data to show how sexual minorities live their lives vicariously in spaces they have prised open within the extant sex/gender binary. They are able to assert the identities of moffie and man-vrou (mannish woman) without threatening the dominant ideology of heterosexuality.

Relevance: 20.00%

Abstract:

The creation of cold dark matter (CCDM) can be macroscopically described by a negative pressure, and, therefore, the mechanism is capable of accelerating the Universe without the need for an additional dark energy component. In this framework, we discuss the evolution of perturbations by considering a neo-Newtonian approach where, unlike in standard Newtonian cosmology, the fluid pressure is taken into account even in the homogeneous and isotropic background equations (Lima, Zanchin, and Brandenberger, MNRAS 291, L1, 1997). The evolution of the density contrast is calculated in the linear approximation and compared to the one predicted by the Lambda CDM model. The difference between the CCDM and Lambda CDM predictions at the perturbative level is quantified by using three different statistical methods, namely: a simple chi-squared analysis in the relevant parameter space, a Bayesian statistical inference, and, finally, a Kolmogorov-Smirnov test. We find that under certain circumstances the CCDM scenario analyzed here predicts an overall dynamics (including the Hubble flow and the matter fluctuation field) which fully recovers that of the traditional cosmic concordance model. Our basic conclusion is that such a reduction of the dark sector provides a viable alternative description to the accelerating Lambda CDM cosmology.

Relevance: 20.00%

Abstract:

A new age-redshift test is proposed in order to constrain H_0 on the basis of the existence of old high-redshift galaxies (OHRGs). In the flat Lambda cold dark matter model, the value of H_0 is heavily dependent on the mass density parameter Omega_M = 1 - Omega_Lambda. Such a degeneracy can be broken through a joint analysis involving the OHRG and baryon acoustic oscillation signature. By assuming a galaxy incubation time, t_inc = 0.8 +/- 0.4 Gyr, our joint analysis yields a value of H_0 = 71 +/- 4 km s^-1 Mpc^-1 (1 sigma) with the best-fit density parameter Omega_M = 0.27 +/- 0.03. Such results are in good agreement with independent studies from the Hubble Space Telescope key project and recent estimates of the Wilkinson Microwave Anisotropy Probe, thereby suggesting that the combination of these two independent phenomena provides an interesting method to constrain the Hubble constant.
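The H_0-Omega_M degeneracy enters through the age-redshift relation, which has a closed form in flat LCDM. A minimal sketch using the best-fit values quoted above (the OHRG redshift is illustrative):

```python
import math

def age_gyr(z, h0=71.0, om=0.27):
    """Age of the Universe (in Gyr) at redshift z for a flat LCDM model,
    using the closed form for matter plus a cosmological constant.
    h0 is in km/s/Mpc; 977.8/h0 converts 1/H0 to Gyr."""
    ol = 1.0 - om                               # Omega_Lambda = 1 - Omega_M
    return (2.0 / (3.0 * math.sqrt(ol))) * (977.8 / h0) * \
        math.asinh(math.sqrt(ol / om) * (1.0 + z) ** -1.5)

# an OHRG observed at z = 3.5 (illustrative) must fit within age(z) - t_inc
print(round(age_gyr(0.0), 2), round(age_gyr(3.5), 2))   # → 13.67 1.84
```

Because a larger H_0 shrinks every age, demanding that an old galaxy plus its incubation time fit inside age(z) bounds H_0 from above, which is the essence of the test.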

Relevance: 20.00%

Abstract:

A dark energy component has recently been proposed to explain the current acceleration of the Universe. Unless some unknown symmetry in Nature prevents or suppresses it, such a field may interact with the pressureless component of dark matter, giving rise to the so-called models of coupled quintessence. In this paper we propose a new cosmological scenario where radiation and baryons are conserved, while the dark energy component decays into cold dark matter. The dilution of the cold dark matter particles, attenuated with respect to the usual a^-3 scaling due to the interaction process, is characterized by a positive parameter epsilon, whereas the dark energy satisfies the equation of state p_x = omega rho_x (omega < 0). We carry out a joint statistical analysis involving recent observations of type Ia supernovae, the baryon acoustic oscillation peak, and the cosmic microwave background shift parameter to check the observational viability of the coupled quintessence scenario proposed here.
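For constant omega, the attenuated dilution rho_dm ~ a^(-3+epsilon) fixes the dark energy density in closed form through total energy conservation. A sketch of one common realization of such coupled models (radiation omitted for brevity; all parameter values are illustrative):

```python
def E2(a, om_dm=0.23, om_b=0.04, w=-1.0, eps=0.1):
    """Dimensionless Friedmann function E^2 = (H/H0)^2 for a flat model in
    which dark energy (equation of state w) decays into cold dark matter,
    so rho_dm scales as a^(-3+eps) instead of a^-3. Energy conservation
    then forces a compensating a^(-3+eps) term in the dark energy density.
    Radiation is omitted for brevity; all parameter values are illustrative."""
    om_x = 1.0 - om_dm - om_b                 # flatness fixes Omega_x
    c = -eps * om_dm / (3.0 * w + eps)        # coupling-induced transfer term
    rho_x = (om_x - c) * a ** (-3.0 * (1.0 + w)) + c * a ** (-3.0 + eps)
    return om_b * a ** -3.0 + om_dm * a ** (-3.0 + eps) + rho_x

print(round(E2(1.0), 12))          # → 1.0 (normalization at a = 1)
print(E2(0.5) < E2(0.5, eps=0.0))  # → True: coupling slows the dilution
```

Setting eps = 0 switches the coupling off and recovers the uncoupled expansion history, which is the limit against which the joint statistical analysis constrains epsilon.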