972 results for iterated local search


Relevance:

30.00%

Publisher:

Abstract:

This study investigates a practical experience in environmental education aimed at the construction of Local Agenda 21 in the municipality of Maxaranguape-RN. The experience brought together various individual and collective social actors from civil society organizations, among them the Centro de Educação e Assessoria Herbert de Souza - CEAHS (an NGO active in the municipality since 1999), associations of male and female farmers from settlement areas, teachers, groups of women and young people, entrepreneurs, the public authorities, the German partner entities, and IBAMA, INCRA and BNB in the Agenda 21 project. These are the members and participants that constitute the Permanent Forum of Agenda 21, the main actor considered in this research. The object of study is to identify the limits and the reach of this practice with regard to awareness raising and participation, and to the awakening of a critical consciousness in the social subjects from a collective socio-environmental perspective. The study investigates whether this experience has allowed individual and collective social subjects to understand and act in their daily lives, to change attitudes and postures, and to broaden their interest in participating in different public spaces. With this intention, it considers whether the educational activities carried out under the principles of environmental education in the construction of Agenda 21 have contributed to raising the awareness and participation of the social actors of the Permanent Forum of Agenda 21. As a methodological reference, the research draws on the Freirean theoretical framework, with emphasis on the dimensions of dialogue, critical thinking and the human dimension, understanding the educational act as a practice of freedom in the perspective of human emancipation and of the social transformation of reality, and it also draws on other authors such as Carvalho (2004), Trigueiro (2003) and Dias (2004). The investigation of this practice points to the educational work carried out by ECOCIÊNCIA to install Agenda 21 and to its unfolding in municipal demands, generating a change of attitudes and postures and, certainly, a new way of seeing and acting in the world, broadening the subjects' interest and desire to insert themselves and participate in public spheres, particularly by establishing dialogical and critical relations with the authorities and by confronting local socio-environmental demands.

Relevance:

30.00%

Publisher:

Abstract:

The question of participation has been debated in Brazil since the 1980s in the search for a better way to address the population's demands. More specifically, after the democratic opening (1985), ways began to be devised to make the population participate in decisions related to the allocation of public resources. In practice this participation does not really exist: the population is, at most, consulted, and its participation remains restricted to a few technical interests within projects, mainly in public policies for local development. It is observed that this implementation happens through a process that has its limits, which could be overcome through strategies designed for that purpose. This dissertation presents the results of research on participative practices in the city of Serrinha between 1997 and 2004, showing, through a case study of Serrinha, the process used to carry out these practices at a time and place considered a model of this kind of application. The analyses were developed using a research model elaborated by the author, based on an extensive literature on the ideal process for implementing a participative public policy. The research had a qualitative approach, being exploratory and descriptive in nature. The researcher (the author of this dissertation) carried out all phases of the research, including the transcription of interviews recorded with a digital voice recorder. From the analysis of these data it was verified that, although the public manager (the former mayor) had a real interest in implementing a local development process in the city, he was not able to devise the correct process to do so. Two major mistakes were made. The first was the intention to use a development plan as a tool: in the end, this plan became the driver of the supposedly participative practice, instead of the ideal model in which the plan would be generated by popular initiative. The second was the absence of a critical education project for the population, which should have been the first step in carrying out a policy of this kind.

Relevance:

30.00%

Publisher:

Abstract:

Domestic work is a reality for girls from poor families and one of the most common forms of work among adolescent workers. Moreover, it is a form of work that reproduces poverty and gender relations within society. The purpose of this study is to understand domestic work in the lives of adolescent workers, emphasizing the meanings these teenagers attach to the job they perform. To achieve this goal, questionnaires were applied to 332 adolescents under 18 years old from public schools (EJA-supletivo) in Natal, in order to map the occurrence of this activity among young students. Next, 14 adolescents were interviewed to identify the meaning of this work and its repercussions on adolescence, such as school education, socialization, relations with employers and the adolescents' self-image. We found that most workers among public school students are housemaids. Furthermore, this work is used as a form of social ascension and contributes to the search for better opportunities in the state capital by adolescents who leave the countryside trying to reconcile education and paid work. This work plays an important role in reproducing gender relations, since the woman works to maintain the private space as a female space while keeping the man out of this relation. It also reproduces class, ethnic and generational relations, in which the employer takes over the control that the parents exert over the adolescent's life. In summary, domestic work has negative aspects, related to exploitation, humiliation and mistreatment, as well as positive ones, since it allows the adolescent to improve her living conditions. The most important thing is to seek a form of work in which human and labor rights are respected.

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. Then, we add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and the stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of growth and Gaussians at long times. We interpret these outcomes in the light of cellular division events: in the early stages, the phenomena depend on each other in a multiplicative, geometric-based process, while at long times they are independent. We conclude that the main ingredients for a good minimalist model of tumor growth, at the mesoscopic level, are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
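As a rough illustration of the stochastic ingredient described above, the sketch below grows an Eden-like aggregate with a kinetic (dynamical) Monte Carlo rule and records Gillespie-style waiting times. The lattice, the uniform per-site division rate and the stopping criterion are illustrative assumptions, not the calibrated HT-29 model of the paper.

```python
import math
import random

def eden_growth(steps=5000, seed=0):
    """Minimal Eden-like aggregate grown with a dynamical (kinetic) Monte Carlo
    scheme: at each step an empty site adjacent to the colony is occupied and
    the elapsed (waiting) time is drawn from the total growth rate."""
    random.seed(seed)
    occupied = {(0, 0)}                                  # colony starts from a single cell
    frontier = {(1, 0), (-1, 0), (0, 1), (0, -1)}        # empty sites touching the colony
    rate_per_site = 1.0                                  # division attempt rate per frontier site
    t = 0.0
    sizes, times = [1], [0.0]

    for _ in range(steps):
        total_rate = rate_per_site * len(frontier)
        # Gillespie-style waiting time until the next division event
        t += -math.log(1.0 - random.random()) / total_rate
        new_cell = random.choice(tuple(frontier))
        occupied.add(new_cell)
        frontier.discard(new_cell)
        x, y = new_cell
        for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nb not in occupied:
                frontier.add(nb)
        sizes.append(len(occupied))
        times.append(t)
    return times, sizes

times, sizes = eden_growth()
print(f"final colony size: {sizes[-1]} cells at t = {times[-1]:.2f}")
```

The recorded `times` list is what a waiting-time analysis of mesoscopic division events would operate on in this toy setting.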

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

The Common Reflection Surface (CRS) stacking method simulates zero-offset (ZO) sections from multi-coverage data. For 2D media, the CRS stacking operator depends on three parameters: the emergence angle of the zero-offset central ray (β0), the radius of curvature of the normal-incidence-point wave (RNIP) and the radius of curvature of the normal wave (RN). The crucial problem in implementing the CRS stacking method is determining, from the seismic data, the three optimal parameters associated with each sampling point of the ZO section to be simulated. In this work, a new processing sequence was developed for the simulation of ZO sections by means of the CRS stacking method. In this new algorithm, the determination of the three optimal parameters that define the CRS stacking operator is carried out in three steps. In the first step, two parameters (β0° and RNIP°) are estimated through a two-dimensional global search in the multi-coverage data. In the second step, the estimated value of β0° is used to determine the third parameter (RN°) through a one-dimensional global search in the ZO section resulting from the first step. In both steps the global searches are carried out using the Simulated Annealing (SA) optimization method. In the third step, the three final parameters (β0, RNIP and RN) are determined through a three-dimensional local search in the multi-coverage data using the Variable Metric (VM) optimization method, taking the parameter triplet (β0°, RNIP°, RN°) estimated in the two previous steps as the initial approximation. In order to correctly simulate events with conflicting dips, the new algorithm provides for the determination of two parameter triplets at sampling points of the ZO section where events intersect; in other words, at points of the ZO section where two seismic events cross, two CRS parameter triplets are determined and used jointly in the simulation of the conflicting-dip events. To evaluate the accuracy and efficiency of the new algorithm, it was applied to synthetic data from two models: one with continuous interfaces and another with a discontinuous interface. The simulated ZO sections have a high signal-to-noise ratio and show a clear definition of the reflected and diffracted events. A comparison of the simulated ZO sections with their counterparts obtained by forward modeling shows a correct simulation of reflections and diffractions. Moreover, a comparison of the three optimized parameters with their exact values computed by forward modeling also reveals a high degree of accuracy.

Using the hyperbolic traveltime approximation, but under the condition RNIP = RN, a new algorithm was developed for the simulation of ZO sections containing predominantly diffracted wavefields. Similarly to the CRS stacking algorithm, this algorithm, called Common Diffraction Surface (CDS) stacking, also uses the SA and VM optimization methods to determine the optimal parameter pair (β0, RNIP) that defines the best CDS stacking operator. In the first step, the SA optimization method is used to determine the initial parameters β0° and RNIP° using the stacking operator with a large aperture. In the second step, using the estimated values of β0° and RNIP°, the estimate of RNIP is improved by applying the VM algorithm to the ZO section resulting from the first step. In the third step, the best values of β0° and RNIP° are determined by applying the VM algorithm to the multi-coverage data. It is worth noting that this apparent repetition of processes has the effect of progressively attenuating the reflected events. Applying the CDS stacking algorithm to synthetic data containing reflected and diffracted wavefields yields, as its main result, a simulated ZO section with clearly defined diffracted events. As a direct application of this result to seismic data interpretation, post-stack depth migration of the simulated ZO section produces a section with the correct location of the diffraction points associated with the discontinuities of the model.
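To make the three-step parameter search concrete, the hedged Python sketch below mirrors its structure with SciPy: dual_annealing stands in for the Simulated Annealing global searches and BFGS (a variable-metric quasi-Newton method) stands in for the VM local refinement. The coherence function, bounds and target values are placeholder assumptions; a real implementation would evaluate the semblance of the CRS traveltime operator on the multi-coverage data.

```python
from scipy.optimize import dual_annealing, minimize

# Placeholder objective: in a real implementation this would be the negative
# semblance of the CRS traveltime surface evaluated on the multi-coverage data.
def negative_coherence(params):
    beta0, r_nip, r_n = params
    # hypothetical smooth surrogate with a single optimum, for illustration only
    return ((beta0 - 0.2) ** 2 + (r_nip - 800.0) ** 2 / 1e6
            + (r_n - 2000.0) ** 2 / 1e7)

# Step 1: 2D global search (SA) over (beta0, RNIP); RN is tied to RNIP here
# as a simple starting assumption for the initial search.
step1 = dual_annealing(lambda p: negative_coherence((p[0], p[1], p[1])),
                       bounds=[(-1.0, 1.0), (10.0, 5000.0)], seed=1)
beta0_0, rnip_0 = step1.x

# Step 2: 1D global search (SA) for RN, keeping the step-1 estimates fixed.
step2 = dual_annealing(lambda p: negative_coherence((beta0_0, rnip_0, p[0])),
                       bounds=[(10.0, 10000.0)], seed=1)
rn_0 = step2.x[0]

# Step 3: 3D local refinement with a variable-metric (quasi-Newton, BFGS)
# method, started from the triplet estimated in steps 1 and 2.
step3 = minimize(negative_coherence, x0=[beta0_0, rnip_0, rn_0], method="BFGS")
beta0, r_nip, r_n = step3.x
print(f"beta0={beta0:.3f}, RNIP={r_nip:.1f}, RN={r_n:.1f}")
```

The point of the staged design, reflected in the sketch, is that the cheap low-dimensional global searches supply a starting point close enough for the local variable-metric search to converge to the full three-parameter optimum.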

Relevance:

30.00%

Publisher:

Abstract:

Systems Biology aims to understand life through integrative models that emphasize the interactions between the different biological agents. The goal is to search for universal laws, not in the component parts of the systems but in the interaction patterns of their constituent elements. Complex biological networks are a powerful mathematical abstraction that allows the representation of large volumes of data and the subsequent formulation of biological hypotheses. In this thesis we present integrated biological networks that include interactions arising from metabolism, physical protein interactions and regulation. We discuss their construction and the tools for their global and local analysis. We also present results from the use of machine learning tools that allow us to understand the relationship between topological properties and gene essentiality, and to predict morbid genes and drug targets in humans.
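A minimal sketch of the kind of pipeline mentioned above, assuming networkx and scikit-learn: node-level topological features are extracted from a toy network and fed to a classifier. The network, the features and especially the "essentiality" labels are synthetic placeholders used only to exercise the pipeline, not the integrated networks or annotations of the thesis.

```python
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy interaction network; a real study would load the integrated
# metabolic / protein-protein / regulatory network instead.
G = nx.barabasi_albert_graph(300, 3, seed=42)

# Node-level topological features commonly related to essentiality.
degree = dict(G.degree())
betweenness = nx.betweenness_centrality(G)
clustering = nx.clustering(G)
X = np.array([[degree[n], betweenness[n], clustering[n]] for n in G.nodes()])

# Hypothetical labels: hubs are marked "essential" purely to exercise the
# pipeline; real labels would come from gene-essentiality annotations.
threshold = np.percentile(X[:, 0], 80)
y = (X[:, 0] >= threshold).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```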

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Pós-graduação em Engenharia Elétrica - FEIS

Relevance:

30.00%

Publisher:

Abstract:

Local productive arrangements (LPAs) are organizational mechanisms that enable the continuous performance of small businesses in production processes and business management models. To promote continuous improvement, it is essential that companies make decisions based on data that reflect business performance (performance measurement) and that promote the performance of the cluster. Thus, the aim of this article is to describe the performance measurement process that supports the corporate management of micro and small enterprises (SMEs) in a local productive arrangement in Maringá, State of Paraná, Brazil. The paper was developed using bibliographic research and action research methods. The field research was developed within the cooperation project (PROJVEST) conducted at the LPA, whose goal is to deploy improvement actions in the project's participating companies. Metrics and performance indicators constructed from the diagnosis of the areas of Production, Quality and Ergonomics in the companies are presented. Among the main results, it can be pointed out that the performance management of the LPA is promoting the introduction of corporate management practices in SMEs, stimulating business cooperation and continuous innovation in manufacturing processes, product quality and business processes.

Relevance:

30.00%

Publisher:

Abstract:

Observations of cosmic-ray arrival directions made with the Pierre Auger Observatory have previously provided evidence of anisotropy at the 99% CL using the correlation of ultra-high-energy cosmic rays (UHECRs) with objects drawn from the Veron-Cetty Veron catalog. In this paper we report on the use of three catalog-independent methods to search for anisotropy. The 2pt-L, 2pt+ and 3pt methods, each giving a different measure of self-clustering in arrival directions, were tested on mock cosmic-ray data sets to study the impact of sample size and magnetic smearing on their results, accounting for both angular and energy resolutions. If the sources of UHECRs follow the same large-scale structure as ordinary galaxies in the local Universe, and if UHECRs are deflected by no more than a few degrees, a study of mock maps suggests that these three methods can efficiently respond to the resulting anisotropy with a P-value of 1.0% or smaller with data sets of as few as 100 events. Using data taken from January 1, 2004 to July 31, 2010, we examined the 20, 30, ..., 110 highest-energy events, with a corresponding minimum energy threshold of about 49.3 EeV. The minimum P-values found were 13.5% using the 2pt-L method, 1.0% using the 2pt+ method and 1.1% using the 3pt method, for the 100 highest-energy events. In view of the multiple (correlated) scans performed on the data set, these catalog-independent methods do not yield strong evidence of anisotropy in the highest-energy cosmic rays.
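For illustration, the sketch below implements a simplified cumulative two-point statistic on unit vectors and compares an "observed" sky against isotropic mock skies. It is not the exact 2pt-L, 2pt+ or 3pt estimator of the paper, and it ignores the detector exposure; the angular cut and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_isotropic_directions(n):
    """Unit vectors drawn uniformly on the sphere (an isotropic sky)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def two_point_counts(dirs, max_angle_deg=30.0):
    """Number of event pairs separated by less than max_angle_deg:
    a simple cumulative 2-point measure of self-clustering."""
    cosines = np.clip(dirs @ dirs.T, -1.0, 1.0)
    sep = np.degrees(np.arccos(cosines))
    iu = np.triu_indices(len(dirs), k=1)          # each pair counted once
    return int(np.sum(sep[iu] < max_angle_deg))

# Pseudo-experiment: compare an "observed" set against isotropic mock skies.
observed = random_isotropic_directions(100)       # stand-in for real data
obs_stat = two_point_counts(observed)
mock_stats = [two_point_counts(random_isotropic_directions(100)) for _ in range(500)]
p_value = np.mean([m >= obs_stat for m in mock_stats])
print(f"pairs within 30 deg: {obs_stat}, P-value vs isotropy: {p_value:.3f}")
```

The P-value here is simply the fraction of isotropic mock skies that are at least as clustered as the observed one, which mirrors how such scan statistics are calibrated against isotropy.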

Relevance:

30.00%

Publisher:

Abstract:

Decision-tree induction algorithms represent one of the most popular techniques for dealing with classification problems. However, traditional decision-tree induction algorithms implement a greedy approach for node splitting that is inherently susceptible to convergence to local optima. Evolutionary algorithms can avoid the problems associated with a greedy search and have been successfully employed for the induction of decision trees. Previously, we proposed a lexicographic multi-objective genetic algorithm for decision-tree induction, named LEGAL-Tree. In this work, we propose extending this approach substantially, particularly with respect to two important evolutionary aspects: the initialization of the population and the fitness function. We carry out a comprehensive set of experiments to validate our extended algorithm. The experimental results suggest that it is able to outperform both traditional algorithms for decision-tree induction and another evolutionary algorithm in a variety of application domains.
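As a sketch of the lexicographic idea, with hypothetical accuracy/size objectives and tolerance (the paper's actual objectives and thresholds are not detailed here): a candidate tree wins if it is clearly better on the higher-priority objective, and lower-priority objectives only break ties.

```python
import random

# Lexicographic comparison of two candidate trees described by their
# (accuracy, size) objectives: accuracy has higher priority, and size only
# decides when the accuracies differ by less than a tolerance.
def lexicographic_better(cand_a, cand_b, accuracy_tolerance=0.01):
    (acc_a, size_a), (acc_b, size_b) = cand_a, cand_b
    if abs(acc_a - acc_b) > accuracy_tolerance:
        return acc_a > acc_b          # clear winner on the primary objective
    return size_a < size_b            # tie on accuracy: prefer the smaller tree

# Toy selection over a random population of (accuracy, size) pairs.
population = [(random.uniform(0.6, 0.95), random.randint(5, 60)) for _ in range(20)]
best = population[0]
for cand in population[1:]:
    if lexicographic_better(cand, best):
        best = cand
print(f"selected candidate: accuracy={best[0]:.3f}, size={best[1]}")
```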

Relevance:

30.00%

Publisher:

Abstract:

The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an information need. The information need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept which has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and their social context; from this need an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus toward the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model; the latter explains how search terms change thanks to the informational feedback received from the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an information need is expressed, because it is now expressed through the generation of the query together with its own context. In fact, the method was conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing any IR system led to developing it as a middleware of interaction between the user and the IR system. The system therefore has just two possible actions: rewriting the query and reordering the results. These actions are equivalent to those described for PS, which generally exploits information derived from the analysis of user behavior, whereas the proposed approach exploits knowledge provided by the user. The thesis goes further and proposes a novel assessment procedure, in line with the "Cranfield paradigm", to evaluate this type of IR system. The results achieved are interesting, considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
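A minimal sketch of the two middleware actions named above, assuming a toy local knowledge base and a trivially simple overlap-based reordering rule; the thesis' actual rewriting and ranking logic is not reproduced here, and all names and data are illustrative.

```python
# Two actions of the hypothetical middleware: rewrite the query from a
# local knowledge base, then reorder the results returned by the IR system.

local_kb = {
    "jaguar": ["felidae", "wildlife", "panthera"],   # user-provided context terms
    "range":  ["habitat", "territory"],
}

def rewrite_query(query: str) -> str:
    terms = query.lower().split()
    expansion = [ctx for t in terms for ctx in local_kb.get(t, [])]
    return " ".join(terms + expansion)

def reorder(results: list, rewritten: str) -> list:
    wanted = set(rewritten.split())
    def overlap(doc):  # count context terms appearing in the document snippet
        return len(wanted & set(doc["snippet"].lower().split()))
    return sorted(results, key=overlap, reverse=True)

results = [
    {"title": "Jaguar XF review", "snippet": "the car offers a wide price range"},
    {"title": "Jaguar ecology",   "snippet": "panthera onca habitat and territory"},
]
q = rewrite_query("jaguar range")
for doc in reorder(results, q):
    print(doc["title"])
```

With the toy knowledge base above, the ecology document is promoted over the car review, which is the intended effect of injecting the user's own context into an otherwise ambiguous query.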

Relevance:

30.00%

Publisher:

Abstract:

The promise of search-driven development is that developers will save time and resources by reusing external code in their local projects. To efficiently integrate this code, users must be able to trust it; thus the trustability of code search results is just as important as their relevance. In this paper, we introduce a trustability metric to help users assess the quality of code search results and thereby ease the cost-benefit analysis they undertake when trying to find suitable integration candidates. The proposed trustability metric incorporates both user votes and the cross-project activity of developers to calculate a "karma" value for each developer. Through the karma values of all its developers, a project is ranked on a trustability scale. We present JBENDER, a proof-of-concept code search engine which implements our trustability metric, and we discuss preliminary results from an evaluation of the prototype.
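A hedged sketch of how such a karma-based trustability score could be computed: the data model, the weights and the aggregation below are illustrative assumptions, not JBENDER's actual formula.

```python
from dataclasses import dataclass, field

@dataclass
class Developer:
    name: str
    upvotes: int = 0
    downvotes: int = 0
    projects: set = field(default_factory=set)   # cross-project activity

    def karma(self, vote_weight=1.0, activity_weight=0.5) -> float:
        # Hypothetical combination of user votes and cross-project activity.
        return vote_weight * (self.upvotes - self.downvotes) \
             + activity_weight * len(self.projects)

def project_trustability(devs: list) -> float:
    """Rank a project by the mean karma of its contributors (toy aggregation)."""
    return sum(d.karma() for d in devs) / len(devs) if devs else 0.0

alice = Developer("alice", upvotes=12, downvotes=1, projects={"libA", "libB", "appC"})
bob = Developer("bob", upvotes=3, downvotes=4, projects={"libA"})
print(f"project trustability: {project_trustability([alice, bob]):.2f}")
```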