872 results for Large-scale experiments


Relevância: 90.00%
Publicador:
Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevância: 90.00%
Publicador:
Resumo:

This document updates the data describing the current profile of primary and secondary education in Brazil, revisiting the databases of official institutions (INEP and IBGE). We found suspicious coincidences linking income concentration and educational abstention, acting as fuels that feed back into the cycle of poverty in our country. Qualitative aspects of the Brazilian school were also surveyed so that we could understand the difficulties of educational practice in Brazil and point to concrete possibilities for change. We also investigated, through case-study methodology, two vacation courses designed to implement and evaluate Problem-Based Learning (PBL). The classes were aimed at high school teachers and students. Through direct observation and inferences drawn from questionnaires administered to students and teachers before and after each of the courses (in 2004 and 2005), the impact of PBL on students and teachers was measured. The results revealed several difficulties and perplexities among teachers and students, such as difficulty relating experiments to textbook content, an almost complete absence of experimentation and of its relation to the Scientific Method, difficulty distinguishing hypothesis from fact, etc. Problem-Based Learning was accepted by all (students and teachers) as a possible way to change Science and Biology classes. However, its wide dissemination will require large-scale teacher training, management and leadership to start the change process, salary increases, and better school infrastructure for experimentation. Using methodologies similar to PBL as an alternative for teacher training, universities that carry out research activities need to be directly involved in the process through funding directed at renewal.
An educational pact needs to be built to reform educational practice, and this action must include administrators and teachers of primary and secondary schools, Secretaries of Education (state and municipal), state and municipal governments, the Ministries of Education and of Science and Technology, universities, and funding agencies, so that the effort can reach national dimensions.

Relevância: 90.00%
Publicador:
Resumo:

In many optimization problems it is difficult to reach an optimal result, or even a result close to the optimal value, within a feasible time, especially when working at large scale. For this reason, many of these problems are tackled with heuristics or metaheuristics that search for better solutions within the defined search space. Within natural computing are Cultural Algorithms and Genetic Algorithms, evolutionary metaheuristics that complement each other due to the dual culture/genetics inheritance mechanism. The purpose of the present work is to study and use such mechanisms, adding both local search heuristics and multiple populations, applied to combinatorial optimization problems (traveling salesman and knapsack), multimodal functions, and constrained problems. Experiments will be run to compare the performance of these hybrid, multipopulation mechanisms against other mechanisms from the literature for each optimization problem addressed here.
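The abstract above applies genetic-algorithm metaheuristics to, among others, the knapsack problem. As an illustration only (not the hybrid cultural/genetic mechanism the work proposes), a minimal genetic algorithm for the 0/1 knapsack problem might look like the following sketch; the population size, generation count, and mutation rate are arbitrary assumptions:

```python
import random

def ga_knapsack(values, weights, capacity, pop_size=40, generations=200, seed=0):
    """Minimal genetic algorithm for the 0/1 knapsack problem.

    Individuals are bit lists; fitness is total value, with infeasible
    (overweight) solutions scored as 0."""
    rng = random.Random(seed)
    n = len(values)

    def fitness(ind):
        w = sum(wi for wi, bit in zip(weights, ind) if bit)
        v = sum(vi for vi, bit in zip(values, ind) if bit)
        return v if w <= capacity else 0  # penalize infeasible solutions

    def pick(pop):
        # binary tournament selection
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(pop), pick(pop)
            cut = rng.randrange(1, n)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.05:            # bit-flip mutation
                i = rng.randrange(n)
                child[i] ^= 1
            nxt.append(child)
        nxt[0] = max(pop, key=fitness)         # elitism: keep the best so far
        pop = nxt
    best = max(pop, key=fitness)
    return best, fitness(best)
```

On a small instance (values 60/100/120, weights 10/20/30, capacity 50) the sketch recovers the known optimum of 220 by selecting the second and third items. A multipopulation variant, as studied in the work above, would run several such populations and migrate individuals between them.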

Relevância: 90.00%
Publicador:
Resumo:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevância: 90.00%
Publicador:
Resumo:

Climate change and its consequences seem to be increasingly evident in our daily lives. However, is it possible for students to identify a relationship between these large-scale events and the chemistry taught in the classroom? The aim of the present work is to demonstrate that chemistry can assist in elucidating important environmental issues. Simple experiments are used to demonstrate the mechanism of cloud formation, as well as the influence of anthropogenic and natural emissions on the precipitation process. The experiments presented show the way in which particles of soluble salts commonly found in the environment can absorb water in the atmosphere and influence cloud formation.

Relevância: 90.00%
Publicador:
Resumo:

Graduate program in Agronomy (Horticulture) - FCA

Relevância: 90.00%
Publicador:
Resumo:

Issues related to association mining have received attention, especially those aiming to discover and facilitate the search for interesting patterns. A promising approach in this context is the application of clustering in the pre-processing step. In this paper, eleven metrics are proposed to provide an assessment procedure to support the evaluation of this kind of approach. To propose the metrics, a subjective evaluation was carried out. The metrics are important since they provide criteria to: (a) analyze the methodologies, (b) identify their positive and negative aspects, (c) carry out comparisons among them and, therefore, (d) help users select the most suitable solution for their problems. Besides, the metrics make users think about aspects related to their problems and provide a flexible way to solve them. Some experiments were carried out to show how the metrics can be used and to demonstrate their usefulness.

Relevância: 90.00%
Publicador:
Resumo:

Understanding the geographic and environmental characteristics of islands that affect aspects of biodiversity is a major theme in ecology (Begon et al. 2006; Krebs 2001) and biogeography (Cox and Moore 2000; Drakare et al. 2006; Lomolino et al. 2006). Such understanding has become particularly relevant over the past century because human activities on continents have fragmented natural landscapes, often creating islands of isolated habitat dispersed within a sea of land uses that include agriculture, forestry, and various degrees of urban and suburban development. The increasingly fragmented or island-like structure of mainland habitats has critical ramifications for conservation biology, as it provides insights regarding the mechanisms leading to species persistence and loss. Consequently, the study of patterns and mechanisms associated with island biodiversity is of interest in its own right (Whittaker 1998; Williamson 1981), and may provide critical insights into mainland phenomena that otherwise could not be studied because of the ethical, financial, or logistical considerations involved in the execution of large-scale manipulative experiments.

Relevância: 90.00%
Publicador:
Resumo:

The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics like evolutionary algorithms (EAs) have been widely investigated for such problems. Such optimization methods simultaneously generate a large number of potential solutions to investigate the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees or, more generally, spanning forests. To efficiently explore the search space, special data structures have been developed to provide operations that manipulate a set of spanning trees (a population). For a tree with n nodes, the most efficient data structures available in the literature require time O(n) to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called the node-depth-degree representation (NDDR), and we demonstrate that, using this encoding, generating a new spanning forest requires average time O(√n). Experiments with an EA based on the NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem have shown that the implementation adds only small constants and lower-order terms to the theoretical bound.
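The NDDR itself is specific to the work above, but the underlying node-depth idea can be sketched: store a tree as a preorder list of (node, depth) pairs, so that any subtree is a contiguous slice that can be pruned and regrafted without rebuilding the whole tree. This is an illustrative simplification, not the paper's data structure (in particular, it is O(n) in the worst case, not O(√n) on average):

```python
def subtree_slice(nd, root_index):
    """Given a node-depth array `nd` (preorder list of (node, depth) pairs),
    return the slice covering the subtree rooted at position root_index.
    In a preorder listing, a subtree is the contiguous run of entries whose
    depth exceeds the root's depth."""
    root_depth = nd[root_index][1]
    j = root_index + 1
    while j < len(nd) and nd[j][1] > root_depth:
        j += 1
    return nd[root_index:j]

def transfer_subtree(nd, root_index, dest_index):
    """Prune the subtree at root_index and regraft it as a child of the node
    at dest_index (which must not lie inside that subtree), returning a new
    node-depth array. Only the moved slice has its depths shifted; this is
    the kind of local move that node-depth encodings make cheap for EAs."""
    sub = subtree_slice(nd, root_index)
    rest = nd[:root_index] + nd[root_index + len(sub):]
    dest_node = nd[dest_index][0]
    # locate the destination again after the removal shifted positions
    k = next(i for i, (node, _) in enumerate(rest) if node == dest_node)
    shift = rest[k][1] + 1 - sub[0][1]
    moved = [(node, d + shift) for node, d in sub]
    return rest[:k + 1] + moved + rest[k + 1:]
```

For the tree 0-(1-(2, 3), 4), encoded as [(0,0), (1,1), (2,2), (3,2), (4,1)], moving the subtree rooted at node 1 under node 4 yields [(0,0), (4,1), (1,2), (2,3), (3,3)], a valid preorder encoding of the new tree.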

Relevância: 90.00%
Publicador:
Resumo:

Observations of cosmic-ray arrival directions made with the Pierre Auger Observatory have previously provided evidence of anisotropy at the 99% CL, using the correlation of ultra-high-energy cosmic rays (UHECRs) with objects drawn from the Veron-Cetty Veron catalog. In this paper we report on the use of three catalog-independent methods to search for anisotropy. The 2pt-L, 2pt+ and 3pt methods, each giving a different measure of self-clustering in arrival directions, were tested on mock cosmic-ray data sets to study the impact of sample size and magnetic smearing on their results, accounting for both angular and energy resolutions. If the sources of UHECRs follow the same large-scale structure as ordinary galaxies in the local Universe, and if UHECRs are deflected no more than a few degrees, a study of mock maps suggests that these three methods can efficiently respond to the resulting anisotropy with a P-value of 1.0% or smaller with data sets of as few as 100 events. Using data taken from January 1, 2004 to July 31, 2010, we examined the 20, 30, ..., 110 highest-energy events, with a corresponding minimum energy threshold of about 49.3 EeV. The minimum P-values found were 13.5% using the 2pt-L method, 1.0% using the 2pt+ method, and 1.1% using the 3pt method, for the 100 highest-energy events. In view of the multiple (correlated) scans performed on the data set, these catalog-independent methods do not yield strong evidence of anisotropy in the highest-energy cosmic rays.
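The 2pt-L, 2pt+ and 3pt statistics used in the analysis above are more elaborate than what follows, but the basic ingredient of any self-clustering measure on arrival directions can be sketched: compute pairwise angular separations and count pairs closer than a set of thresholds, an excess of small-angle pairs relative to isotropic expectations being the signature of anisotropy. This is a hedged illustration, not the collaboration's method:

```python
import math

def angular_separation(d1, d2):
    """Great-circle angle (radians) between two sky directions given as
    (right ascension, declination) pairs in radians."""
    ra1, dec1 = d1
    ra2, dec2 = d2
    c = (math.sin(dec1) * math.sin(dec2)
         + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.acos(max(-1.0, min(1.0, c)))  # clamp against rounding

def two_point_counts(dirs, thresholds):
    """Cumulative two-point counts N(< theta): for each angular threshold,
    count the event pairs separated by less than that angle. Clustered
    (anisotropic) arrival directions give an excess of small-angle pairs."""
    seps = [angular_separation(dirs[i], dirs[j])
            for i in range(len(dirs)) for j in range(i + 1, len(dirs))]
    return [sum(1 for s in seps if s < t) for t in thresholds]
```

In a real analysis the observed counts would be compared against many isotropic mock data sets (drawn with the detector's exposure) to turn each count into a P-value, which is the role the mock maps play in the paper above.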

Relevância: 90.00%
Publicador:
Resumo:

The origin of cosmic rays at all energies is still uncertain. In this paper, we present and explore an astrophysical scenario to produce cosmic rays with energies ranging from below 10^15 to 3 × 10^20 eV. We show here that just our Galaxy and the radio galaxy Cen A, each with their own galactic cosmic-ray particles, but with those from the radio galaxy pushed up in energy by a relativistic shock in the jet emanating from the active black hole, are sufficient to describe the most recent data in the PeV to near-ZeV energy range. Data are available over this entire energy range from the KASCADE, KASCADE-Grande, and Pierre Auger Observatory experiments. The energy spectrum calculated here correctly reproduces the measured spectrum beyond the knee and, contrary to widely held expectations, no other extragalactic source population is required to explain the data, even at energies far below the general cutoff expected at 6 × 10^19 eV, the Greisen-Zatsepin-Kuz'min turnoff due to interaction with the cosmic microwave background. We present several predictions for the source population, the cosmic-ray composition, and the propagation to Earth which can be tested in the near future.

Relevância: 90.00%
Publicador:
Resumo:

The current cosmological dark sector (dark matter plus dark energy) is challenging our comprehension of the physical processes taking place in the Universe. Recently, some authors tried to falsify the basic underlying assumptions of such a dark matter-dark energy paradigm. In this Letter, we show that oversimplifications of the measurement process may produce false positives to any consistency test based on the globally homogeneous and isotropic Λ cold dark matter (ΛCDM) model and its expansion history based on distance measurements. In particular, when local inhomogeneity effects due to clumped matter or voids are taken into account, an apparent violation of the basic assumptions (Copernican Principle) seems to be present. Conversely, the amplitude of the deviations also probes the degree of reliability underlying the phenomenological Dyer-Roeder procedure by confronting its predictions with the accuracy of the weak lensing approach. Finally, a new method is devised to reconstruct the effects of the inhomogeneities in a ΛCDM model, and some suggestions of how to distinguish clumpiness (or void) effects from different cosmologies are discussed.