935 results for Local Search


Relevance:

30.00%

Publisher:

Abstract:

Based on literature data from HT-29 cell monolayers, we develop a model for their growth, analogous to an epidemic model, mixing local and global interactions. First, we propose and solve a deterministic equation for the progress of these colonies. We then add a stochastic (local) interaction and simulate the evolution of an Eden-like aggregate using dynamical Monte Carlo methods. The growth curves of both the deterministic and stochastic models are in excellent agreement with the experimental observations. The waiting-time distributions generated by our stochastic model allowed us to analyze the role of mesoscopic events. We obtain log-normal distributions in the initial stages of growth and Gaussian distributions at long times. We interpret these outcomes in light of cellular division events: in the early stages, the phenomena depend on each other in a multiplicative, geometric-based process, whereas at long times they are independent. We conclude that the main ingredients for a good minimalist model of tumor growth at the mesoscopic level are intrinsic cooperative mechanisms and competitive search for space. © 2013 Elsevier Ltd.
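The abstract gives no implementation details; the following is a minimal, hypothetical sketch of an Eden-like aggregate grown with a dynamical (kinetic) Monte Carlo loop on a square lattice, recording the waiting times between division events. The lattice size, rates, and division rule are illustrative assumptions, not the authors' model.

import random
import math

# Eden-like growth sketch (illustrative assumptions, not the authors' model):
# occupied sites "divide" into a randomly chosen empty neighbor; time advances by an
# exponential waiting time whose rate is the number of growth-capable (frontier) cells.
L = 201                        # lattice size (assumption)
occupied = {(L // 2, L // 2)}  # start from a single seed cell

def empty_neighbors(site):
    x, y = site
    return [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if (x + dx, y + dy) not in occupied and 0 <= x + dx < L and 0 <= y + dy < L]

t, waiting_times, sizes = 0.0, [], []
for _ in range(2000):
    frontier = [s for s in occupied if empty_neighbors(s)]
    if not frontier:
        break
    # kinetic Monte Carlo step: total rate proportional to the number of frontier cells
    dt = -math.log(1.0 - random.random()) / len(frontier)
    t += dt
    parent = random.choice(frontier)
    occupied.add(random.choice(empty_neighbors(parent)))  # one division event
    waiting_times.append(dt)
    sizes.append(len(occupied))
# `sizes` versus `t` gives a growth curve; histograms of `waiting_times` in early and late
# windows can be compared against log-normal and Gaussian fits, as the abstract describes.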

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

The Common Reflection Surface (CRS) stacking method produces simulated zero-offset (ZO) sections from multi-coverage data. For 2D media, the CRS stacking operator depends on three parameters: the emergence angle of the central ray with zero source-receiver offset (β0), the radius of curvature of the normal-incidence-point wave (RNIP), and the radius of curvature of the normal wave (RN). The crucial problem in implementing the CRS stacking method is determining, from the seismic data, the three optimal parameters associated with each sampling point of the ZO section to be simulated. In this work, a new processing sequence was developed for simulating ZO sections with the CRS stacking method. In this new algorithm, the three optimal parameters defining the CRS stacking operator are determined in three steps: in the first step, two parameters (β0° and RNIP°) are estimated by a two-dimensional global search in the multi-coverage data. In the second step, the estimated β0° is used to determine the third parameter (RN°) by a one-dimensional global search in the ZO section resulting from the first step. In both steps the global searches are carried out with the Simulated Annealing (SA) optimization method. In the third step, the three final parameters (β0, RNIP, and RN) are determined by a three-dimensional local search in the multi-coverage data with the Variable Metric (VM) optimization method, using the parameter triple (β0°, RNIP°, RN°) estimated in the two previous steps as the initial approximation. In order to correctly simulate events with conflicting dips, the new algorithm provides for the determination of two parameter triples at sampling points of the ZO section where events intersect. In other words, at points of the ZO section where two seismic events cross, two CRS parameter triples are determined and used jointly to simulate the conflicting-dip events. To evaluate the accuracy and efficiency of the new algorithm, it was applied to synthetic data from two models: one with continuous interfaces and the other with a discontinuous interface. The simulated ZO sections have a high signal-to-noise ratio and show a clear definition of the reflected and diffracted events. Comparison of the simulated ZO sections with their counterparts obtained by forward modeling shows a correct simulation of reflections and diffractions. Moreover, comparison of the three optimized parameter values with their corresponding exact values computed by forward modeling also reveals a high degree of accuracy.
Using the hyperbolic traveltime approximation, but under the condition RNIP = RN, a new algorithm was developed for simulating ZO sections containing predominantly diffracted wavefields. Similarly to the CRS stacking algorithm, this algorithm, called Common Diffraction Surface (CDS) stacking, also uses the SA and VM optimization methods to determine the optimal parameter pair (β0, RNIP) that defines the best CDS stacking operator. In the first step, the SA optimization method is used to determine the initial parameters β0° and RNIP° using the stacking operator with a large aperture. In the second step, using the estimated values of β0° and RNIP°, the estimate of RNIP is improved by applying the VM algorithm to the ZO section resulting from the first step. In the third step, the best values of β0 and RNIP are determined by applying the VM algorithm to the multi-coverage data. It is worth noting that this apparent repetition of steps has the effect of progressively attenuating the reflected events. Applying the CDS stacking algorithm to synthetic data containing reflected and diffracted wavefields yields, as its main result, a simulated ZO section with clearly defined diffracted events. As a direct application of this result to seismic data interpretation, post-stack depth migration of the simulated ZO section produces a section with the correct location of the diffraction points associated with the model discontinuities.
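The parameter search described above (a global Simulated Annealing stage followed by a local quasi-Newton refinement) can be illustrated with a short, hypothetical sketch; the coherence-like objective, parameter ranges, and the use of SciPy's dual_annealing and BFGS routines are assumptions for illustration, not the thesis implementation.

import numpy as np
from scipy.optimize import dual_annealing, minimize

# Hypothetical semblance-like objective: higher coherence -> lower cost.
# In the thesis this would be the CRS traveltime coherence measured on multi-coverage data.
def cost(params):
    beta0, r_nip, r_n = params
    # placeholder objective with a known minimum at (0.2 rad, 500 m, 1500 m)
    return (beta0 - 0.2) ** 2 + ((r_nip - 500.0) / 1000.0) ** 2 + ((r_n - 1500.0) / 1000.0) ** 2

# Analogue of steps 1-2: global search (here over all three parameters at once, for brevity)
bounds = [(-0.5, 0.5), (100.0, 3000.0), (100.0, 5000.0)]
global_res = dual_annealing(cost, bounds=bounds, seed=0, maxiter=200)

# Analogue of step 3: local (variable-metric / quasi-Newton) refinement from the global estimate
local_res = minimize(cost, x0=global_res.x, method="BFGS")
print("global estimate:", global_res.x)
print("refined estimate:", local_res.x)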

Relevance:

30.00%

Publisher:

Abstract:

Systems Biology aims to understand life through integrative models that emphasize the interactions among different biological agents. The goal is to search for universal laws, not in the component parts of the systems but in the interaction patterns of their constituent elements. Biological complex networks are a powerful mathematical abstraction that allows the representation of large volumes of data and the subsequent formulation of biological hypotheses. In this thesis we present integrated biological networks that include interactions arising from metabolism, physical protein interactions, and regulation. We discuss their construction and the tools for their global and local analysis. We also present results from the use of machine learning tools that allow us to understand the relationship between topological properties and gene essentiality, and to predict morbid genes and drug targets in humans.
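As a purely illustrative sketch of the kind of analysis mentioned (relating network topology to gene essentiality via machine learning), the snippet below computes a few standard topological features with NetworkX and trains a random-forest classifier; the toy network, the fabricated labels, and the feature choice are assumptions, not the thesis pipeline.

import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy integrated network: in the thesis this would combine metabolic, protein-protein
# interaction, and regulatory edges; here a random graph stands in for it.
g = nx.erdos_renyi_graph(n=300, p=0.03, seed=1)

degree = dict(g.degree())
betweenness = nx.betweenness_centrality(g)
clustering = nx.clustering(g)

nodes = list(g.nodes())
X = np.array([[degree[n], betweenness[n], clustering[n]] for n in nodes])

# Hypothetical essentiality labels (in practice these come from experimental annotations);
# here they are fabricated from degree just so the example runs end to end.
y = np.array([1 if degree[n] > np.median(list(degree.values())) else 0 for n in nodes])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())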

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Graduate Program in Electrical Engineering - FEIS

Relevance:

30.00%

Publisher:

Abstract:

Local productive arrangements (LPA) are organizational mechanisms that enable the continuous performance of small businesses in production processes and business management models. To promote continuous improvement, it is essential that companies make decisions based on data that reflect business performance (performance measurement) and promote the cluster's performance. Thus, the aim of this article is to describe the performance measurement process that supports the corporate management of micro and small enterprises (SME) of a Local Productive Arrangement in Maringá, State of Paraná, Brazil. To develop this paper, bibliographic research and action research methods were used. The field research was developed from the cooperation project (PROJVEST) conducted at the LPA, whose goal is to deploy improvement actions in the project's participating companies. Metrics and performance indicators constructed from diagnoses in the areas of Production, Quality, and Ergonomics in the companies are presented. Among the main results, it can be pointed out that the performance management of the LPA is promoting the introduction of corporate management practices in SMEs, stimulating business cooperation, continuous innovation in manufacturing processes, and the quality of products and business processes.

Relevance:

30.00%

Publisher:

Abstract:

Observations of cosmic-ray arrival directions made with the Pierre Auger Observatory have previously provided evidence of anisotropy at the 99% CL using the correlation of ultra-high-energy cosmic rays (UHECRs) with objects drawn from the Veron-Cetty Veron catalog. In this paper we report on the use of three catalog-independent methods to search for anisotropy. The 2pt-L, 2pt+, and 3pt methods, each giving a different measure of self-clustering in arrival directions, were tested on mock cosmic-ray data sets to study the impact of sample size and magnetic smearing on their results, accounting for both angular and energy resolutions. If the sources of UHECRs follow the same large-scale structure as ordinary galaxies in the local Universe and if UHECRs are deflected by no more than a few degrees, a study of mock maps suggests that these three methods can efficiently respond to the resulting anisotropy with a P-value of 1.0% or smaller for data sets of as few as 100 events. Using data taken from January 1, 2004 to July 31, 2010, we examined the 20, 30, ..., 110 highest-energy events with a corresponding minimum energy threshold of about 49.3 EeV. The minimum P-values found were 13.5% using the 2pt-L method, 1.0% using the 2pt+ method, and 1.1% using the 3pt method for the 100 highest-energy events. In view of the multiple (correlated) scans performed on the data set, these catalog-independent methods do not yield strong evidence of anisotropy in the highest-energy cosmic rays.
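For illustration only, here is a minimal sketch of a two-point angular autocorrelation count, the basic ingredient behind self-clustering statistics such as the 2pt methods named above; the event sample, angular cut, and the scrambling procedure that would yield a P-value are assumptions and not the Auger analysis itself.

import numpy as np

def to_unit_vectors(ra_deg, dec_deg):
    # Convert right ascension / declination in degrees to unit vectors on the sphere.
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.stack([np.cos(dec) * np.cos(ra), np.cos(dec) * np.sin(ra), np.sin(dec)], axis=1)

def two_point_counts(ra_deg, dec_deg, max_angle_deg=30.0):
    # Number of event pairs separated by less than max_angle_deg (a simple 2pt statistic).
    v = to_unit_vectors(ra_deg, dec_deg)
    cosangles = np.clip(v @ v.T, -1.0, 1.0)
    sep = np.degrees(np.arccos(cosangles))
    iu = np.triu_indices(len(ra_deg), k=1)      # unique pairs only
    return int(np.sum(sep[iu] < max_angle_deg))

# Toy usage: 100 isotropic mock events; a real analysis would compare the observed count
# against many isotropic simulations to obtain a P-value.
rng = np.random.default_rng(0)
ra = rng.uniform(0.0, 360.0, 100)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, 100)))
print("pairs within 30 degrees:", two_point_counts(ra, dec))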

Relevance:

30.00%

Publisher:

Abstract:

Decision-tree induction algorithms represent one of the most popular techniques for dealing with classification problems. However, traditional decision-tree induction algorithms implement a greedy approach for node splitting that is inherently susceptible to convergence to local optima. Evolutionary algorithms can avoid the problems associated with a greedy search and have been successfully employed for the induction of decision trees. Previously, we proposed a lexicographic multi-objective genetic algorithm for decision-tree induction, named LEGAL-Tree. In this work, we propose to extend that approach substantially, particularly with respect to two important evolutionary aspects: the initialization of the population and the fitness function. We carry out a comprehensive set of experiments to validate our extended algorithm. The experimental results suggest that it is able to outperform both traditional algorithms for decision-tree induction and another evolutionary algorithm in a variety of application domains.
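As a hedged illustration of the lexicographic multi-objective idea mentioned above (compare candidate trees on objectives in a fixed priority order, breaking near-ties with lower-priority objectives), the sketch below implements a generic lexicographic comparator; the objectives, the tolerance, and the candidate representation are hypothetical, not LEGAL-Tree's actual fitness function.

from dataclasses import dataclass

@dataclass
class Candidate:
    accuracy: float   # higher is better (priority 1, hypothetical objective)
    n_nodes: int      # smaller is better (priority 2, hypothetical objective)

def lexicographic_better(a: Candidate, b: Candidate, acc_tol: float = 0.01) -> bool:
    # Accuracy is compared first; only when the two candidates are within acc_tol
    # of each other does the lower-priority objective (tree size) break the tie.
    if abs(a.accuracy - b.accuracy) > acc_tol:
        return a.accuracy > b.accuracy
    return a.n_nodes < b.n_nodes

# Toy usage inside a selection step of a genetic algorithm:
population = [Candidate(0.91, 35), Candidate(0.905, 17), Candidate(0.84, 9)]
best = population[0]
for cand in population[1:]:
    if lexicographic_better(cand, best):
        best = cand
print(best)   # Candidate(accuracy=0.905, n_nodes=17): similar accuracy, smaller tree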

Relevance:

30.00%

Publisher:

Abstract:

The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information in order to satisfy an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his or her social context; from this need an interdisciplinary field called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus toward the individual's need undermines the rigid linearity of the classical model, which has been overtaken by the "berry picking" model; the latter explains that search terms change thanks to the informational feedback received during the search activity, introducing the concept of the evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query together with its own context. In fact, the method was conceived with the aim of improving search quality by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing each IR system led to its development as a middleware between the user and the IR system. The system thus has just two possible actions: rewriting the query and reordering the results. Equivalent actions are described in the PS literature, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided by the user. The thesis went further and produced a novel assessment procedure, following the "Cranfield paradigm", to evaluate this type of IR system. The results are interesting considering both the effectiveness achieved and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
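The middleware described above performs only two actions: query rewriting and result reordering. A minimal, hypothetical sketch of that pattern is shown below; the local knowledge base, the expansion rule, and the reranking heuristic are illustrative assumptions, not the thesis implementation.

from typing import Callable, Dict, List

# Hypothetical local knowledge base: maps a user's terms to context terms.
LOCAL_KB: Dict[str, List[str]] = {
    "jaguar": ["animal", "felidae"],          # example user context: wildlife, not cars
    "python": ["programming", "language"],
}

def rewrite_query(query: str, kb: Dict[str, List[str]]) -> str:
    # Expand the query with context terms drawn from the local knowledge base.
    terms = query.lower().split()
    expansion = [c for t in terms for c in kb.get(t, [])]
    return " ".join(terms + expansion)

def rerank(results: List[str], context_terms: List[str]) -> List[str]:
    # Reorder results by how many context terms they mention (a simple heuristic).
    return sorted(results, key=lambda r: -sum(t in r.lower() for t in context_terms))

def middleware_search(query: str, engine: Callable[[str], List[str]]) -> List[str]:
    rewritten = rewrite_query(query, LOCAL_KB)
    context = rewritten.split()[len(query.split()):]   # the added context terms
    return rerank(engine(rewritten), context)

# Toy usage with a stand-in search engine:
fake_engine = lambda q: ["Jaguar cars official site", "Jaguar (animal) - Felidae family"]
print(middleware_search("jaguar", fake_engine))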

Relevance:

30.00%

Publisher:

Abstract:

The promise of search-driven development is that developers will save time and resources by reusing external code in their local projects. To integrate this code efficiently, users must be able to trust it; the trustability of code search results is therefore just as important as their relevance. In this paper, we introduce a trustability metric to help users assess the quality of code search results and thus ease the cost-benefit analysis they undertake when trying to find suitable integration candidates. The proposed trustability metric incorporates both user votes and the cross-project activity of developers to calculate a "karma" value for each developer. Through the karma values of all its developers, a project is ranked on a trustability scale. We present JBENDER, a proof-of-concept code search engine that implements our trustability metric, and we discuss preliminary results from an evaluation of the prototype.
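The metric above aggregates per-developer "karma" (from user votes and cross-project activity) into a project-level trustability score. Below is a small, hypothetical sketch of such an aggregation; the weighting of votes versus activity and the averaging scheme are assumptions, not JBENDER's actual formula.

from dataclasses import dataclass
from typing import List

@dataclass
class Developer:
    votes: int           # net user votes received (assumption)
    projects: int        # number of projects the developer contributes to (assumption)

def karma(dev: Developer, vote_weight: float = 1.0, activity_weight: float = 0.5) -> float:
    # Combine user votes and cross-project activity into a single karma value.
    return vote_weight * dev.votes + activity_weight * dev.projects

def project_trustability(devs: List[Developer]) -> float:
    # Rank a project by the average karma of its developers (illustrative choice).
    return sum(karma(d) for d in devs) / len(devs) if devs else 0.0

# Toy usage: two projects compared on the trustability scale.
project_a = [Developer(votes=12, projects=3), Developer(votes=4, projects=1)]
project_b = [Developer(votes=1, projects=1)]
print(project_trustability(project_a), project_trustability(project_b))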

Relevance:

30.00%

Publisher:

Abstract:

A search for new particles that decay into top quark pairs (tt̄) is performed with the ATLAS experiment at the LHC using an integrated luminosity of 4.7 fb⁻¹ of proton-proton (pp) collision data collected at a center-of-mass energy of √s = 7 TeV. In the tt̄ → WbWb decay, the lepton plus jets final state is used, where one W boson decays leptonically and the other hadronically. The tt̄ system is reconstructed using both small-radius and large-radius jets, the latter being supplemented by a jet substructure analysis. A search for local excesses in the number of data events compared to the Standard Model expectation in the tt̄ invariant mass spectrum is performed. No evidence for a tt̄ resonance is found, and 95% credibility-level limits on the production rate are determined for massive states predicted in two benchmark models. The upper limits on the cross section times branching ratio of a narrow Z' resonance range from 5.1 pb for a boson mass of 0.5 TeV to 0.03 pb for a mass of 3 TeV. A narrow leptophobic topcolor Z' resonance with a mass below 1.74 TeV is excluded. Limits are also derived for a broad color-octet resonance with Γ/m = 15.3%. A Kaluza-Klein excitation of the gluon in a Randall-Sundrum model is excluded for masses below 2.07 TeV.

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION The first ophthalmologic complication in conjunction with dental anesthesia was reported in 1936. The objective of the present study was a detailed analysis of case reports on that topic. MATERIAL AND METHODS After conducting a literature search in PubMed, this study analyzed 108 ophthalmologic complications following intraoral local anesthesia in 65 case reports with respect to patient-, anesthesia-, and complication-related factors. RESULTS The mean age of the patients was 33.8 years, and females predominated (72.3%). The most commonly reported complication was diplopia (39.8%), mostly resulting from paralysis of the lateral rectus muscle. Other relatively frequent complications included ptosis (16.7%), mydriasis (14.8%), and amaurosis (13%). Ophthalmologic complications were mainly associated with block anesthesia of the inferior alveolar nerve (45.8%) or the posterior superior alveolar nerve (40.3%). Typically, the ophthalmologic complications in conjunction with intraoral local anesthesia had an immediate to short onset and disappeared as the anesthesia subsided. DISCUSSION AND CONCLUSION The increased number of ophthalmologic complications after intraoral local anesthesia in females may suggest a gender effect. Double vision (diplopia) is the most frequently described complication, which, like the other reported ophthalmologic complications, is usually completely reversible.

Relevance:

30.00%

Publisher:

Abstract:

We have searched for periodic variations of the electronic recoil event rate in the (2-6) keV energy range recorded between February 2011 and March 2012 with the XENON100 detector, adding up to 224.6 live days in total. Following a detailed study to establish the stability of the detector and its background contributions during this run, we performed an unbinned profile likelihood analysis to identify any periodicity up to 500 days. We find a global significance of less than 1 sigma for all periods, suggesting no statistically significant modulation in the data. While the local significance for an annual modulation is 2.8 sigma, the analysis of a multiple-scatter control sample and the phase of the modulation disfavor a dark matter interpretation. The DAMA/LIBRA annual modulation interpreted as a dark matter signature with axial-vector coupling of WIMPs to electrons is excluded at 4.8 sigma.
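As a purely illustrative sketch of an unbinned likelihood scan for a rate modulation (not the XENON100 analysis, whose detector model and backgrounds are far more involved), the snippet below evaluates an extended unbinned likelihood for a rate R(t) = R0 [1 + A cos(2π(t - t0)/T)] on toy event times and scans the period T; the data, the fixed amplitude and phase, and the use of SciPy are assumptions. A full profile likelihood would additionally fit the amplitude and phase at each candidate period.

import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)
T_live = 224.6                          # live days (taken from the abstract)
events = rng.uniform(0.0, T_live, 500)  # toy event times, uniform (no true modulation)

def neg_log_likelihood(period, amplitude=0.1, phase=0.0, r0=500.0 / T_live):
    # Extended unbinned likelihood: -ln L = integral of R(t) dt - sum_i ln R(t_i)
    rate = r0 * (1.0 + amplitude * np.cos(2.0 * np.pi * (events - phase) / period))
    expected, _ = quad(lambda t: r0 * (1.0 + amplitude * np.cos(2.0 * np.pi * (t - phase) / period)),
                       0.0, T_live)
    return expected - np.sum(np.log(rate))

# Scan candidate periods up to 500 days and keep the most likely one.
periods = np.linspace(5.0, 500.0, 200)
nlls = np.array([neg_log_likelihood(p) for p in periods])
best = periods[np.argmin(nlls)]
print("most likely period (toy data):", best, "days")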

Relevance:

30.00%

Publisher:

Abstract:

The effects of localized personal networks on the choice of job search methods are studied in this paper using evidence on workers displaced by establishment closure from the Thailand Labor Force Survey, 2001. At the block/village level, there is less significant evidence of local interactions between job seekers and referrals in developing labor markets. The effects of localized personal networks do not play an important role in the probability of unemployed job seekers seeking assistance from friends and relatives. Convincing evidence from the data supports the proposition that both self-selection on individual background (such as profession) and access to large markets determine the choice of job search method.