34 results for Statistical Behavior


Relevance:

20.00%

Publisher:

Abstract:

We address the problem of coordinating two non-holonomic mobile robots that move in formation while transporting a long payload. A competitive dynamics is introduced that gradually controls the activation and deactivation of individual behaviors. This process introduces (asymmetrical) hysteresis during behavioral switching. As a result, behavioral oscillations due to noisy information are eliminated. Results in indoor environments show that, if parameter values are chosen within reasonable ranges, then in spite of noise in the robots' communication and sensors the overall robotic system works quite well, even in cluttered environments. The robots' overt behavior is stable and smooth.

Relevance:

20.00%

Publisher:

Abstract:

Swarm Intelligence (SI) is the property of a system whereby the collective behaviors of (unsophisticated) agents interacting locally with their environment cause coherent functional global patterns to emerge. Particle swarm optimization (PSO) is a form of SI and a population-based search algorithm that is initialized with a population of random solutions, called particles. These particles fly through hyperspace and have two essential reasoning capabilities: memory of their own best position and knowledge of the swarm's best position. In a PSO scheme each particle flies through the search space with a velocity that is adjusted dynamically according to its historical behavior; therefore, the particles tend to fly towards the best search area as the search progresses. This work proposes a PSO-based algorithm for logic circuit synthesis. The results show the statistical characteristics of this algorithm with respect to the number of generations required to reach a solution. A comparison with two other Evolutionary Algorithms, namely Genetic and Memetic Algorithms, is also presented.
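The velocity-and-position update described above can be sketched as follows. This is a generic, minimal PSO for continuous minimization, not the circuit-synthesis algorithm proposed in the work; the objective function, search bounds, and coefficient values are illustrative assumptions.

```python
import random

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over [-5, 5]^dim with a basic particle swarm."""
    rng = random.Random(seed)
    lo, hi = -5.0, 5.0
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # each particle's own best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # the swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # velocity update: inertia + cognitive (memory) + social terms
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Usage: minimize the sphere function in 3 dimensions
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```

The two terms pulling each particle toward `pbest` and `gbest` are exactly the "memory" and "swarm knowledge" capabilities mentioned in the abstract.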

Relevance:

20.00%

Publisher:

Abstract:

Power Laws (PL), also called Pareto Laws or Zipf Laws, are statistical distributions with numerous practical applications in natural and artificial systems. Some examples are the variation of personal or corporate incomes, the occurrence of words in texts, the repetition of sounds or groups of sounds in musical compositions, the number of victims in wars and other cataclysms, the magnitude of earthquakes, the number of book or CD sales on the internet, and the number of most-visited websites, among many others. Vilfredo Pareto (1897-1906) states, in the political economy textbook "Cours d'Economie Politique", that a large part of the world economy follows a particular distribution in which 20% of the population holds 80% of the country's total wealth, with a small fraction of society thus controlling the largest share of the money. This summarizes the behavior of a variable that follows a Pareto distribution (or Power Law). This work studies in detail the application of power laws to internet phenomena, such as the number of most-visited sites, the number of links on a given site, the distribution of nodes in an internet network, the number of books sold, and sales in online auctions. The results obtained allow us to conclude that all the data studied are well approximated, on a logarithmic scale, by a straight line with negative slope, thus following a Pareto distribution. The development and growth of the Web have brought an increase in the number of users, contents, and sites. Many of the examples in this work will be the subject of new studies and new conclusions. Since the internet plays a leading role in modern societies, it is in constant evolution, and it is increasingly possible to identify internet phenomena associated with Power Laws.
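On a logarithmic scale a power law appears as a straight line with negative slope, so its exponent can be estimated by ordinary least squares on the log-log data. A minimal sketch of this fit, on synthetic Zipf-like data rather than the internet data sets studied here:

```python
import math

def fit_power_law(ranks, values):
    """Least-squares fit of log(value) = c - alpha*log(rank);
    returns (alpha, c), i.e. the line on a log-log scale."""
    xs = [math.log(r) for r in ranks]
    ys = [math.log(v) for v in values]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    c = my - slope * mx
    return -slope, c  # alpha is the magnitude of the negative slope

# Synthetic Zipf-like data: value = 1000 / rank, so alpha should be 1
ranks = list(range(1, 101))
values = [1000.0 / r for r in ranks]
alpha, c = fit_power_law(ranks, values)
```

For real, noisy data the same fit applies after ranking the observations in decreasing order of size.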

Relevance:

20.00%

Publisher:

Abstract:

Beyond the classical statistical approaches (determination of basic statistics, regression analysis, ANOVA, etc.), a new set of applications of different statistical techniques has increasingly gained relevance in the analysis, processing, and interpretation of data concerning the characteristics of forest soils, as can be seen in some recent publications in the context of Multivariate Statistics. These new methods require additional care that is not always included or referred to in some approaches. In the particular case of geostatistical applications it is necessary, besides geo-referencing all the data acquisition, to collect the samples in regular grids and in sufficient quantity so that the variograms can reflect the spatial distribution of soil properties in a representative manner. Although most Multivariate Statistics techniques (Principal Component Analysis, Correspondence Analysis, Cluster Analysis, etc.) do not in general require the assumption of a normal distribution, they nevertheless need a proper and rigorous strategy for their use. In this work we present some reflections on these methodologies, in particular on the main constraints that often occur during the data collection process and on the various ways of linking these different techniques. Finally, illustrations of some particular applications of these statistical methods are also presented.
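In the geostatistical case mentioned above, the spatial structure that a regular grid is meant to capture is summarized by the empirical (semi)variogram. A minimal sketch of the classical Matheron estimator on a small regular grid; the grid, values, lags, and tolerance are illustrative assumptions, not the forest soil data discussed in the work:

```python
import math

def empirical_variogram(points, values, lags, tol=0.5):
    """Classical (Matheron) semivariogram estimator:
    gamma(h) = average of 0.5*(z_i - z_j)^2 over pairs at distance ~h."""
    gamma = []
    for h in lags:
        acc, npairs = 0.0, 0
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                d = math.dist(points[i], points[j])
                if abs(d - h) <= tol:
                    acc += 0.5 * (values[i] - values[j]) ** 2
                    npairs += 1
        gamma.append(acc / npairs if npairs else float('nan'))
    return gamma

# Toy example: a 5x5 regular grid with a smooth trend along x,
# so the semivariance grows with the lag distance
points = [(x, y) for x in range(5) for y in range(5)]
values = [float(x) for x, y in points]
gamma = empirical_variogram(points, values, lags=[1, 2, 3])
```

A variogram estimated from too few or irregularly spaced samples is exactly the representativeness problem the abstract warns about.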

Relevance:

20.00%

Publisher:

Abstract:

Prepared for presentation at the Portuguese Finance Network International Conference 2014, Vilamoura, Portugal, June 18-20

Relevance:

20.00%

Publisher:

Abstract:

Mathematical models and statistical analysis are key instruments in soil science research, as they can describe and/or predict the current state of a soil system. These tools allow us to explore the behavior of soil-related processes and properties, as well as to generate new hypotheses for future experimentation. A good model and analysis of soil property variations, allowing us to extract suitable conclusions and to estimate spatially correlated variables at unsampled locations, clearly depends on the amount and quality of the data and on the robustness of the techniques and estimators. The quality of the data, in turn, obviously depends on a competent data collection procedure and on capable laboratory analytical work. Following the standard soil sampling protocols available, soil samples should be collected according to key points such as a convenient spatial scale, landscape homogeneity (or non-homogeneity), land color, soil texture, land slope, and solar exposure. Obtaining good-quality data from forest soils is predictably expensive, as it is labor intensive and demands much manpower and equipment, both in field work and in laboratory analysis. Moreover, the sampling scheme to be used for data collection in a forest field is not simple to design, as the sampling strategies chosen depend strongly on soil taxonomy. In fact, a sampling grid cannot be followed if rocks are found at the intended collecting depth, if no soil at all is found, or if large trees bar the collection. Considering this, the proficient design of a soil sampling campaign in a forest field is not always a simple process and sometimes represents a truly huge challenge. In this work, we present some difficulties that occurred during two experiments on forest soil conducted in order to study the spatial variation of some soil physical-chemical properties.
Two different sampling protocols were considered for monitoring two types of forest soils located in NW Portugal: umbric regosol and lithosol. Two different pieces of sampling equipment were also used: a manual auger and a shovel. Both scenarios were analyzed, and the results allow us to conclude that monitoring forest soil for mathematical and statistical investigation needs a data collection procedure compatible with established protocols, but that a pre-defined grid assumption often fails when the variability of the soil property is not uniform in space. In this case, the sampling grid should be conveniently adapted from one part of the landscape to another, and this fact should be taken into consideration in the mathematical procedure.

Relevance:

20.00%

Publisher:

Abstract:

In the current context of serious climate change, where the increased frequency of some extreme events can enhance the rate of periods prone to high-intensity forest fires, the National Forest Authority often implements, in several Portuguese forest areas, a regular set of measures to control the amount of available fuel mass (PNDFCI, 2008). In the present work we present a preliminary analysis of the consequences of implementing prescribed fire measures to control the amount of fuel mass, in terms of soil recovery, in particular its water retention capacity, organic matter content, pH, and iron content. This work is included in a larger study (Meira-Castro, 2009(a); Meira-Castro, 2009(b)). In accordance with established data collection practice, embodied in multidimensional matrices of n columns (variables under analysis) by p lines (areas sampled at different depths), and considering the quantitative nature of the data in this study, we chose a methodological approach based on multivariate statistical analysis, in particular Principal Component Analysis (PCA) (Góis, 2004). The experiments were carried out on a soil cover over a natural site of andalusitic schist in Gramelas, Caminha, NW Portugal, which had remained untouched by prescribed burnings for four years and was submitted to prescribed fire in March 2008. The soil samples were collected from five different plots at six different time periods. The methodological option adopted allowed us to identify the most relevant relational structures within the n variables, within the p samples, and in the two sets at the same time (Garcia-Pereira, 1990). Consequently, in addition to the traditional outputs produced by PCA, we analyzed the influence of both sampling depth and geomorphological environment on the behavior of all the variables involved.
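For an n-variables-by-p-samples matrix of this kind, PCA diagonalizes the covariance matrix and ranks directions by explained variance. A minimal two-variable sketch using the closed-form eigen-decomposition of a 2x2 covariance matrix; the data are synthetic and illustrative, not the soil measurements of this study:

```python
import math

def pca_2d(data):
    """PCA of a two-column data matrix via the eigen-decomposition
    of its 2x2 sample covariance matrix (closed form)."""
    n = len(data)
    mx = sum(r[0] for r in data) / n
    my = sum(r[1] for r in data) / n
    sxx = sum((r[0] - mx) ** 2 for r in data) / (n - 1)
    syy = sum((r[1] - my) ** 2 for r in data) / (n - 1)
    sxy = sum((r[0] - mx) * (r[1] - my) for r in data) / (n - 1)
    # eigenvalues of [[sxx, sxy], [sxy, syy]]: roots of the characteristic polynomial
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(tr * tr / 4 - det)
    l1, l2 = tr / 2 + disc, tr / 2 - disc  # l1 >= l2, variances along the axes
    # first principal axis: eigenvector associated with l1
    v = (sxy, l1 - sxx) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v)
    return (l1, l2), (v[0] / norm, v[1] / norm)

# Toy data: two strongly correlated variables, y ~ 2x plus small noise
data = [(i, 2 * i + 0.1 * ((-1) ** i)) for i in range(10)]
(l1, l2), axis = pca_2d(data)
```

With more variables the same idea applies to the full n x n covariance matrix, which is what standard PCA software computes.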

Relevance:

20.00%

Publisher:

Abstract:

We study exotic patterns appearing in a network of coupled Chen oscillators. Namely, we consider a network of two rings coupled through a “buffer” cell, with Z3×Z5 symmetry group. Numerical simulations of the network reveal steady states, rotating waves in one ring with quasiperiodic behavior in the other, and chaotic states in the two rings, to name a few. The different patterns seem to arise through a sequence of Hopf, period-doubling, and period-halving bifurcations. The network architecture seems to explain certain observed features, such as the equilibria and the rotating waves, whereas the properties of the chaotic oscillator may explain others, such as the quasiperiodic and chaotic states. We use XPPAUT and MATLAB to compute the relevant states numerically.
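The cell dynamics used in each node is the three-dimensional Chen flow. A minimal sketch of one uncoupled cell integrated with a hand-rolled fourth-order Runge-Kutta scheme; the classic chaotic parameters a=35, b=3, c=28 and the initial condition are illustrative assumptions, and the paper's ring coupling and XPPAUT/MATLAB computations are not reproduced here:

```python
def chen(state, a=35.0, b=3.0, c=28.0):
    """Chen oscillator vector field (classic chaotic parameter set)."""
    x, y, z = state
    return (a * (y - x),
            (c - a) * x - x * z + c * y,
            x * y - b * z)

def rk4_step(f, s, dt):
    """One fourth-order Runge-Kutta step for an autonomous ODE."""
    k1 = f(s)
    k2 = f(tuple(si + dt / 2 * ki for si, ki in zip(s, k1)))
    k3 = f(tuple(si + dt / 2 * ki for si, ki in zip(s, k2)))
    k4 = f(tuple(si + dt * ki for si, ki in zip(s, k3)))
    return tuple(si + dt / 6 * (u + 2 * v + 2 * w + r)
                 for si, u, v, w, r in zip(s, k1, k2, k3, k4))

# Integrate 5 time units from a point near the attractor
state = (-3.0, 2.0, 20.0)
dt = 0.001
trajectory = [state]
for _ in range(5000):
    state = rk4_step(chen, state, dt)
    trajectory.append(state)
```

Coupling several such cells in two rings through a buffer cell, as in the paper, amounts to adding linear coupling terms between neighboring states.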

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering - Specialization Area of Systems and Industrial Planning

Relevance:

20.00%

Publisher:

Abstract:

This dissertation addresses the application of Power Laws (PLs), also called Pareto Laws or Zipf Laws, to economic data. PLs are statistical distributions widely used to understand natural and artificial systems. PLs trace back to Vilfredo Pareto who, in the nineteenth century, published the political economy textbook “Cours d'Economie Politique”. In it he states that a large part of the world economy follows a PL, in which 20% of the population holds 80% of the country's wealth. This property characterizes a variable that follows a Pareto distribution (or PL). Since then, PLs have been applied to other phenomena, namely the occurrence of words in texts, people's surnames, the variation of personal or corporate incomes, the number of victims of floods or earthquakes, accesses to websites, etc. In this work, a data set concerning the private or collective fortunes of people and organizations is studied. More specifically, we analyze data collected on the fortunes of the richest women in the world, the richest men in technology, the richest families, the 20 richest women in America, the 400 richest men in America, the richest men in the world, the richest establishments in the world, the richest companies in the world, and the richest countries in the world, as well as the market value of some companies on the stock market. The results obtained reveal a good approximation of part of these data by a simple PL and a good approximation of the remaining data by a double PL. A differentiation in the way fortunes grow is thus observed across the cases studied. As future work, we will seek to analyze these and other data using other statistical distributions, such as the exponential or the lognormal, which behave similarly to PLs, in order to compare the results.
Another interesting aspect will be to find an analytical explanation for the advantages of approximating economic data by a simple PL versus a double PL.
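One standard way to fit the exponent of a simple PL to wealth data of this kind is the maximum-likelihood (Hill-type) estimator for the Pareto tail. A minimal sketch on synthetic data; the fortune data sets themselves are not reproduced here, and the threshold x_min is assumed known:

```python
import math
import random

def pareto_mle_alpha(data, x_min):
    """Maximum-likelihood estimate of the Pareto tail exponent:
    alpha_hat = n / sum(ln(x_i / x_min)) over the x_i >= x_min."""
    tail = [x for x in data if x >= x_min]
    return len(tail) / sum(math.log(x / x_min) for x in tail)

# Synthetic "fortunes": Pareto(alpha=1.5) draws via inverse-transform sampling,
# X = x_min * U^(-1/alpha) with U uniform on (0, 1]
rng = random.Random(42)
alpha_true, x_min = 1.5, 1.0
sample = [x_min * (1 - rng.random()) ** (-1 / alpha_true) for _ in range(20000)]
alpha_hat = pareto_mle_alpha(sample, x_min)
```

A double PL would instead be fitted piecewise, with separate exponents below and above a crossover point.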

Relevance:

20.00%

Publisher:

Abstract:

In recent years the semiconductor industry, namely memory production, has evolved considerably. The need to lower production costs, as well as to produce more complex systems with greater capacity, led to the creation of Wafer Level Packaging (WLP) technology. This technology allows the production of smaller systems, simplifies the process flow, and provides a significant reduction in the final cost of the product. WLP is a technology for packaging integrated circuits while they are still part of wafers, in contrast with the traditional method in which the systems are singulated before being packaged. With the development of this technology came the need to better understand the mechanical behavior of the mold compound (MC, the encapsulating polymer), more specifically the warpage of molded wafers. Warpage is a characteristic of this product and is due to the difference in the coefficient of thermal expansion between the silicon and the mold compound. This problem is observable in the product through the bowing of the molded wafers. The warpage of molded wafers has a great impact on manufacturing. Depending on the amount and orientation of the warpage, the transport, handling, and processing of the wafers can become difficult or even impossible, which translates into reduced production volume and lower product quality. This dissertation was developed at Nanium S.A., a Portuguese company and world leader in WLP technology on 300 mm wafers, and addresses the use of the Taguchi methodology to study the variability of the debond process for product X. The choice of process and product was based on a statistical analysis of the variation and impact of warpage along the production line.
The Taguchi methodology is a quality control methodology that enables a systematic approach to a given process, combining control charts, process/product control, and process design to achieve a robust process. The results of this method, when correctly implemented, yield significant savings in the processes, with a significant financial impact. This project made it possible to study and quantify warpage along the production line and to mitigate the impact of this characteristic on the debond process. It also promoted discussion and alignment between the different production areas regarding process control and improvement. It was demonstrated that the Taguchi method is efficient for studying the variability of a process and optimizing its parameters. Its application to the debond process improved both the reliability of the process in terms of product quality assurance and the production throughput.
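Taguchi analysis ranks factor settings by a signal-to-noise (S/N) ratio; since warpage is a quantity to be minimized, the "smaller the better" form applies. A minimal sketch with hypothetical replicate measurements; the numbers, units, and settings are invented for illustration and are not Nanium process data:

```python
import math

def sn_smaller_the_better(ys):
    """Taguchi signal-to-noise ratio for a 'smaller the better'
    response such as warpage: SN = -10 * log10(mean(y^2))."""
    return -10 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical replicated warpage measurements (um) for two factor settings
setting_a = [80.0, 85.0, 82.0]    # low and consistent
setting_b = [60.0, 120.0, 90.0]   # lower at times, but highly variable
sn_a = sn_smaller_the_better(setting_a)
sn_b = sn_smaller_the_better(setting_b)
# Higher SN is better: the ratio rewards both low mean and low spread
```

In a full Taguchi study these ratios are computed for each run of an orthogonal array, and the factor levels with the highest average S/N are selected.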

Relevance:

20.00%

Publisher:

Abstract:

Power laws, also known as Pareto-like laws or Zipf-like laws, are commonly used to explain a variety of distinct real-world phenomena, often described merely by the produced signals. In this paper, we study twelve cases, namely worldwide technological accidents, the annual revenue of America's largest private companies, the number of inhabitants in America's largest cities, the magnitude of earthquakes with minimum moment magnitude equal to 4, the total burned area in forest fires that occurred in Portugal, the net worth of the richest people in America, the frequency of occurrence of words in the novel Ulysses, by James Joyce, the total number of deaths in worldwide terrorist attacks, the number of linking root domains of the top internet domains, the number of linking root domains of the top internet pages, the total number of human victims of tornadoes in the U.S., and the number of inhabitants in the 60 most populated countries. The results demonstrate the emergence of statistical characteristics very close to a power law behavior. Furthermore, the parametric characterization reveals complex relationships present at higher levels of description.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to present the main Portuguese results from a multi-national study on the reading format preferences and behaviors of undergraduate students from the Polytechnic Institute of Porto (Portugal). For this purpose we apply an adaptation of the Academic Reading Questionnaire previously created by Mizrachi (2014). This survey instrument has 14 Likert-style statements regarding the influence of format on students' reading behavior, including aspects such as ability to remember, feelings about access convenience, active engagement with the text through highlighting and annotating, and ability to review and concentrate on the text. The importance of the language and length of the text in determining the preferred format is also inquired about, as is the electronic device students use to read digital documents. Finally, some demographic and academic data were gathered. The analysis of the results will be contextualized by a review of the literature concerning youngsters' reading format preferences. The format (digital or print) in which a text is displayed and read can impact comprehension, which is an important information literacy skill. This is quite a relevant issue for class readings in an academic context because it impacts learning. On the other hand, students' preferences regarding reading formats will influence the use of library services. However, the literature is not unanimous on this subject. Woody, Daniel and Baker (2010) concluded that the experience of reading is not the same in an electronic or print context and that students prefer print books to e-books. This thesis is reinforced by Ji, Michaels and Waterman (2014), who report that, among 101 undergraduates, the large majority self-reported reading and learning more when using the printed format, despite preferring electronically supplied readings to those supplied in printed form.
On the other side, Rockinson-Szapkiw et al. (2013) conducted a study in which they demonstrate that the e-textbook is as effective for learning as the traditional textbook and that students who chose e-textbooks had significantly higher perceived learning than students who chose print textbooks.

Relevance:

20.00%

Publisher:

Abstract:

Presented at the 23rd International Conference on Real-Time Networks and Systems (RTNS 2015), Main Track, 4-6 November 2015, Lille, France.

Relevance:

20.00%

Publisher:

Abstract:

One parameter that influences the performance of adhesively bonded joints is the adhesive layer thickness. Hence, its effect has to be investigated experimentally and should be taken into consideration in the design of adhesive joints. Most of the results in the literature are for typical structural epoxy adhesives, which are generally formulated to perform in thin sections. However, polyurethane adhesives are designed to perform in thicker sections and might behave differently as a function of adhesive thickness. In this study, the effect of adhesive thickness on the mechanical behavior of a structural polyurethane adhesive was investigated. The mode I fracture toughness of the adhesive was measured using double-cantilever beam (DCB) tests with adhesive layer thicknesses ranging from 0.2 to 2 mm. In addition, single lap joints (SLJs) were fabricated and tested to assess the influence of adhesive thickness on the lap-shear strength of the adhesive. An increasing fracture toughness with increasing adhesive thickness was found. The lap-shear strength decreases as the adhesive layer gets thicker but, in contrast to joints with brittle adhesives, the decreasing trend was less pronounced.
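For reference, the mode I energy release rate extracted from a DCB test is often estimated with simple beam theory, G_I = 12 P^2 a^2 / (E b^2 h^3), where P is the load, a the crack length, b the width, h the thickness of each arm, and E the adherend modulus. A minimal sketch; corrected beam theory or compliance calibration, as commonly used in practice, may differ, and the load, geometry, and modulus below are hypothetical:

```python
def g_ic_sbt(P, a, b, h, E):
    """Mode I energy release rate for a DCB specimen from simple
    beam theory: G_I = 12 * P^2 * a^2 / (E * b^2 * h^3).
    P: load [N], a: crack length [m], b: width [m],
    h: arm thickness [m], E: adherend Young's modulus [Pa]."""
    return 12 * P ** 2 * a ** 2 / (E * b ** 2 * h ** 3)

# Hypothetical DCB test point (SI units): 150 N at 50 mm crack length,
# 25 mm wide, 12.7 mm thick steel-like adherends
g = g_ic_sbt(P=150.0, a=0.050, b=0.025, h=0.0127, E=210e9)  # J/m^2
```

The quadratic dependence on crack length is why G_I is evaluated at many (P, a) points along the crack propagation.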