977 results for statistical techniques


Relevance: 60.00%

Abstract:

Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies for conservation and for the exploitation of fisheries, and for assessing the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles, together with the laws of conservation of matter and energy. To complete the construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; on the maximum rates of ingestion, growth and reproduction; and on life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to manually compare model outputs with observations from real landscapes in order to obtain parameters that produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
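
As a rough illustration of how ABC can calibrate an IBM, the sketch below (Python) applies ABC rejection sampling to a toy individual-based growth model with a single unknown parameter; the model, the summary statistic, the uniform prior and the tolerance are all invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_ibm(growth_rate, n_individuals=200, n_steps=50):
    """Hypothetical individual-based model: each individual grows
    stochastically; the summary statistic is the mean final size."""
    sizes = np.ones(n_individuals)
    for _ in range(n_steps):
        sizes += rng.normal(growth_rate, 0.05, n_individuals)
    return sizes.mean()

# 'Observed' summary statistic (would come from field data in practice).
observed = toy_ibm(growth_rate=0.10)

# ABC rejection sampling: draw parameters from the prior, simulate, and keep
# the draws whose summary statistic falls within a tolerance of the observation.
prior_draws = rng.uniform(0.0, 0.3, size=2000)        # assumed uniform prior
distances = np.array([abs(toy_ibm(g) - observed) for g in prior_draws])
tolerance = np.quantile(distances, 0.02)               # keep the closest 2%
posterior = prior_draws[distances <= tolerance]

print(f"approximate posterior mean: {posterior.mean():.3f}")
print(f"95% credible interval: {np.quantile(posterior, [0.025, 0.975])}")
```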

Relevance: 60.00%

Abstract:

We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N(s) elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained from the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
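
A minimal sketch of the model construction described above: a model image is assembled as a sum of elliptical Gaussian components defined by six parameters each, and the squared-difference performance function is evaluated against a (here, synthetic) observed image. The component values, image size and noise level are placeholders, and the cross-entropy search that would minimize this function is omitted.

```python
import numpy as np

def elliptical_gaussian(x, y, x0, y0, peak, a, ecc, theta):
    """Elliptical Gaussian: peak position (x0, y0), peak intensity,
    major-axis scale a, eccentricity ecc, orientation angle theta."""
    b = a * np.sqrt(1.0 - ecc**2)             # minor axis from eccentricity
    ct, st = np.cos(theta), np.sin(theta)
    xr = (x - x0) * ct + (y - y0) * st         # rotate into the source frame
    yr = -(x - x0) * st + (y - y0) * ct
    return peak * np.exp(-0.5 * ((xr / a) ** 2 + (yr / b) ** 2))

def model_image(params, shape=(128, 128)):
    """Sum of N_s elliptical Gaussian components (one 6-tuple per component)."""
    y, x = np.mgrid[0:shape[0], 0:shape[1]]
    return sum(elliptical_gaussian(x, y, *p) for p in params)

def performance(params, observed):
    """Objective that a cross-entropy search would minimize."""
    return np.sum((model_image(params, observed.shape) - observed) ** 2)

# Hypothetical three-component 'benchmark jet': (x0, y0, peak, a, ecc, theta).
components = [(40, 64, 1.0, 6.0, 0.5, 0.3),
              (70, 60, 0.6, 8.0, 0.7, 0.8),
              (100, 55, 0.3, 10.0, 0.8, 1.2)]
obs = model_image(components) + np.random.default_rng(1).normal(0, 0.01, (128, 128))
print(performance(components, obs))
```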

Relevance: 60.00%

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes in which both techniques are combined simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data onto these coordinates produce images we will call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube shows objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it hosts a type 1 active nucleus, not previously known. Furthermore, we show that it is displaced from the centre of its stellar bulge.
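
A minimal sketch of the PCA tomography idea, assuming a synthetic data cube: the cube is reshaped into a spaxel-by-wavelength matrix, PCA is computed via singular value decomposition, and each component yields an eigenvector (eigenspectrum) together with a tomogram obtained by projecting every spaxel onto it. The dimensions and data below are arbitrary placeholders.

```python
import numpy as np

# Synthetic data cube: (ny, nx, n_lambda) = two spatial axes, one spectral axis.
rng = np.random.default_rng(2)
ny, nx, nl = 30, 30, 200
cube = rng.normal(size=(ny, nx, nl))

# Reshape to a 2-D matrix: one row per spaxel, one column per wavelength.
X = cube.reshape(ny * nx, nl)
X = X - X.mean(axis=0)                      # remove the mean spectrum

# PCA via SVD: rows of Vt are the eigenvectors (eigenspectra),
# ordered by decreasing variance.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
variance_fraction = S**2 / np.sum(S**2)

# Tomograms: projection of each spaxel's spectrum onto each eigenvector,
# reshaped back to the spatial dimensions of the cube.
scores = X @ Vt.T
tomograms = scores.reshape(ny, nx, -1)

print("variance in first 3 components:", variance_fraction[:3].round(3))
print("tomogram 1 shape:", tomograms[:, :, 0].shape)
```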

Relevance: 60.00%

Abstract:

There is a gap in our knowledge of the survival differences that individual condition is supposed to produce in the field. This is partly due to our limited ability to assess survival in the wild. Here we applied modern statistical techniques to field-gathered data on two damselfly species whose males practice alternative reproductive tactics (ARTs) and for which indicators of condition in both sexes are known. In Paraphlebia zoe there are two ARTs: a larger black-winged (BW) male, which defends mating territories, and a smaller hyaline-winged (HW) male, which usually acts as a satellite. In this species, condition in both morphs is correlated with body size. In Calopteryx haemorrhoidalis, males adopt tactics according to their condition, with males in better condition practicing the territorial ART. In addition, in this species condition correlates positively with wing pigmentation in both sexes. Our prediction for both species was that males practicing the territorial tactic would survive for a shorter time than males using the nonterritorial tactic, and that larger or more pigmented animals would survive for longer. In P. zoe, BW males survived less than females but did not differ from HW males, and larger individuals did not necessarily survive for longer. In fact, size affected survival only when group identity was analysed, showing a positive relationship in females and a slightly negative relationship in both male morphs. In C. haemorrhoidalis, survival was higher for more pigmented males and females, but size was not a good survival predictor. Our results partially confirm assumptions about the maintenance of ARTs. They also indicate that female pigmentation correlates with a fitness component, survival, as proposed by recent sexual selection ideas applied to females.

Relevance: 60.00%

Abstract:

A 172 cm-long sediment core was collected from a small pristine lake situated within a centripetal drainage basin in a tropical karst environment (Ribeira River valley, southeastern Brazil) in order to investigate the paleoenvironmental record provided by the lacustrine geochemistry. Sediments derived from erosion of the surrounding cambisols contain quartz, kaolinite, mica, chlorite and goethite. Accelerator mass spectrometry (AMS) ¹⁴C dating provided the geochronological framework. Three major sedimentary units were identified based on the structure and color of the sediments: Unit III from 170 to 140 cm (1030 ± 60 to 730 ± 60 yr BP), Unit II from 140 to 90 cm (730 ± 60 to 360 ± 60 yr BP) and Unit I from 90 to 0 cm (360 ± 60 to 0 yr BP). Major and trace element concentrations were analysed using multivariate statistical techniques. Factor analysis provided three factors accounting for 72.4% of the total variance. F1 and F2 have high positive loadings from K, Ba, Cs, Rb, Sr, Sc, Th, light rare earth elements (LREE), Fe, Cr, Ti, Zr, Hf and Ta, and high negative loadings from Mg, Co, Cu, Zn, Br and loss on ignition (LOI). F3, with positive loadings from V and the non-metals As and Sb, accounts for a low percentage (9.7%) of the total variance and is therefore of little interpretative use. The profile distribution of F1 scores reveals negative values in Units I and III, and positive values in Unit II, meaning that K, Ba, Cs, Rb, Sr, Sc, Th, LREE, Fe, Cr, Ti, Zr, Hf and Ta are relatively more concentrated in Unit II, while Mg, Co, Cu, Zn and Br are relatively more abundant in Units I and III. The observed fluctuations in the geochemical composition of the sediments are consistent with slight variations of the erosion intensity in the catchment area as a possible response to variations in climatic conditions during the last millennium.
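
As a hedged illustration of the multivariate treatment described, the sketch below runs a factor analysis on a table of element concentrations plus loss on ignition and inspects the loadings and the downcore factor scores; the element list, depths and simulated values are placeholders rather than the study's data, and the study's exact extraction and rotation choices are not reproduced.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical geochemical table: one row per core depth, one column per variable.
rng = np.random.default_rng(3)
elements = ["K", "Ba", "Rb", "Sr", "Fe", "Ti", "Zr", "Mg", "Cu", "Zn", "Br", "LOI"]
data = pd.DataFrame(rng.lognormal(size=(86, len(elements))), columns=elements)
data["depth_cm"] = np.arange(0, 172, 2)

# Standardize the concentrations, then extract three factors.
X = StandardScaler().fit_transform(data[elements])
fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(X)                # factor scores per depth (F1, F2, F3)
loadings = pd.DataFrame(fa.components_.T, index=elements,
                        columns=["F1", "F2", "F3"])

print(loadings.round(2))                     # which variables load on which factor
print(pd.Series(scores[:, 0], index=data["depth_cm"]).head())  # downcore F1 profile
```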

Relevance: 60.00%

Abstract:

Research objectives
Poker and responsible gambling both entail the use of the executive functions (EF), which are higher-level cognitive abilities. The main objective of this work was to assess whether online poker players of different ability show different performances in their EF and, if so, which functions are the most discriminating. The secondary objective was to assess whether EF performance can predict the quality of gambling, according to the Gambling Related Cognition Scale (GRCS), the South Oaks Gambling Screen (SOGS) and the Problem Gambling Severity Index (PGSI).
Sample and methods
The study design consisted of two stages: 46 active Italian players (41 M, 5 F; age 32±7.1 yrs; education 14.8±3 yrs) completed the PGSI in a secure IT web system and uploaded their own hand-history files, which were anonymized and then evaluated by two poker experts. 36 of these players (31 M, 5 F; age 33±7.3 yrs; education 15±3 yrs) agreed to take part in the second stage: the administration of an extensive neuropsychological test battery by a blinded, trained professional. To answer the main research question we collected all final and intermediate scores of the EF tests for each player, together with the scoring of playing ability. To answer the secondary research question, we referred to the GRCS, PGSI and SOGS scores. We determined which variables are good predictors of the playing-ability score using statistical techniques able to deal with many regressors and few observations (LASSO, best subset algorithms and CART). In this context, information criteria and cross-validation errors play a key role in the selection of the relevant regressors, while significance testing and goodness-of-fit measures can lead to wrong conclusions.
Preliminary findings
We found significant predictors of the poker ability score in various tests. In particular, there are good predictors 1) in some Wisconsin Card Sorting Test items that measure flexibility in choosing problem-solving strategies, strategic planning, modulation of impulsive responding, goal setting and self-monitoring; 2) in those Cognitive Estimates Test variables related to deductive reasoning, problem solving, development of an appropriate strategy and self-monitoring; 3) in the Emotional Quotient Inventory Short (EQ-i:S) Stress Management score, composed of the Stress Tolerance and Impulse Control scores, and in the Interpersonal score (Empathy, Social Responsibility, Interpersonal Relationship). As for the quality of gambling, some EQ-i:S scale scores provide the best predictors: General Mood for the PGSI; Intrapersonal (Self-Regard, Emotional Self-Awareness, Assertiveness, Independence, Self-Actualization) and Adaptability (Reality Testing, Flexibility, Problem Solving) for the SOGS; and Adaptability for the GRCS.
Implications for the field
Through PokerMapper we gathered knowledge and evaluated the feasibility of constructing short tasks/card games in online poker environments for profiling users' executive functions. These card games will be part of an IT system able to dynamically profile EF and provide players with feedback on their expected performance and ability to gamble responsibly at that particular moment. The implementation of such a system in existing gambling platforms could lead to an effective proactive tool for supporting responsible gambling.
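
A minimal sketch of the variable-selection approach named above (LASSO with a cross-validated penalty), under assumed data: the player sample, the executive-function scores and the ability ratings are all simulated, and only the selection mechanism is illustrated.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical data: 36 players, 40 executive-function test scores each,
# and a poker-ability score assigned by expert raters (all simulated here).
rng = np.random.default_rng(4)
n_players, n_scores = 36, 40
X = rng.normal(size=(n_players, n_scores))
ability = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=n_players)

# LASSO with the penalty chosen by cross-validation: suited to the
# many-regressors / few-observations setting, since it selects a sparse
# subset of predictors rather than relying on per-coefficient p-values.
model = make_pipeline(StandardScaler(), LassoCV(cv=5, random_state=0))
model.fit(X, ability)

lasso = model.named_steps["lassocv"]
selected = np.flatnonzero(lasso.coef_)
print("chosen penalty alpha:", round(lasso.alpha_, 4))
print("selected predictor columns:", selected)
```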

Relevance: 60.00%

Abstract:

In the current scenario, in which globalization, combined with a higher level of customer demands, forces companies to strive harder for competitiveness, agility in product development and optimization becomes crucial for their survival in the market. In this context, several techniques used in Quality Engineering were compiled into an integrated method for the Experimental Optimization of Mixtures. These techniques provide much faster and more economical results than the traditional practice of varying one mixture component at a time, owing to the smaller number of trials required. However, although they are not particularly recent, the tools applicable to mixture optimization have not been adopted by their main beneficiary (industry), probably because their existence is not widely known or, above all, because of the complexity of the calculations involved. Therefore, in addition to the proposed method, a software tool was also developed that implements all the suggested steps, in order to further facilitate their application by people who are not specialists in statistical techniques. Using this software (OptiMix), the method was tested in a real situation and in a comparative study with a case reported in the literature, in order to test its validity, the need for adaptations and the consistency of its results. The evaluation of the case studies showed that the proposed method provides results consistent with those of alternative techniques, with the advantage that the user does not need to perform calculations, thus avoiding errors and speeding up the optimization process.
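
As an illustration of the mixture-optimization idea, and not of OptiMix itself, the sketch below generates a small simplex-lattice design for a hypothetical three-component mixture and fits a quadratic Scheffé polynomial to simulated responses; all values are invented.

```python
import itertools
import numpy as np

# Hypothetical three-component mixture: a {3,2} simplex-lattice design
# (component proportions summing to 1) needs only six trials.
levels = [0.0, 0.5, 1.0]
design = np.array([p for p in itertools.product(levels, repeat=3)
                   if abs(sum(p) - 1.0) < 1e-9])

# Simulated response at each design point (stand-in for measured trials).
rng = np.random.default_rng(5)
def true_response(x1, x2, x3):
    return 3 * x1 + 5 * x2 + 2 * x3 + 8 * x1 * x2      # unknown in practice
y = np.array([true_response(*p) for p in design]) + rng.normal(0, 0.05, len(design))

# Quadratic Scheffé model: y = sum_i b_i x_i + sum_{i<j} b_ij x_i x_j
# (no intercept, because the proportions are constrained to sum to 1).
def scheffe_terms(p):
    x1, x2, x3 = p
    return [x1, x2, x3, x1 * x2, x1 * x3, x2 * x3]
Z = np.array([scheffe_terms(p) for p in design])
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)

print("design points:\n", design)
print("fitted Scheffé coefficients:", coef.round(2))
```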

Relevance: 60.00%

Abstract:

The Brazilian Federal Police Department, and in particular its Technical-Scientific Directorate, has sought to adopt modern management tools to improve the efficiency of its processes, among them the use of performance indicators. In the specific case of Forensics, whose management falls to that directorate, basic studies that would allow its processes to be understood are still lacking, so that reliable and easily understood indicators can then be adopted. In that direction, one of the first steps is to know how long each process takes, given its characteristics. In this work, statistical techniques are used to extract this information from the database maintained by the Technical-Scientific Directorate. With this information it becomes possible to propose suitable, easy-to-monitor performance indicators, allowing managers to verify the actual results of managerial actions and decisions.

Relevance: 60.00%

Abstract:

This dissertation proposes a model for estimating the short-term cost of stockouts for a Brazilian retail drugstore chain. The proposed model uses influence variables that can be estimated by the company's managers to predict consumer reactions to stockouts, and relates these reactions to the costs they impose on the retailer. Multivariate statistical techniques were used to analyze consumer reactions in real stockout situations, captured in several stores of the chain, and to relate them to the influence variables. The dissertation has a pragmatic orientation and generates knowledge to improve the way inventory levels are set in pharmaceutical retailing. The proposed model makes it possible to define the service and inventory levels that maximize the retailer's economic result.

Relevance: 60.00%

Abstract:

In such a competitive market, having well-prepared and properly allocated teams is essential for the survival of companies. This study aims to identify how the knowledge of front-line employees, the professionals who play an important negotiating role, is reflected in customer satisfaction and company results, by identifying what they subjectively value in a negotiation. Using the Subjective Value Inventory (SVI) developed by Curhan et al. (2006), and starting from the dimensions of independent and interdependent self-image, we seek to identify the subjective values of negotiators at a Brazilian retail bank, who are responsible for a significant share of the company's negotiations and results, with respect to feelings about themselves (Self), instrumental outcomes, and process and relationship (Rapport), using interpersonal trust as a moderator of this relationship. After identifying these negotiators' subjective values in negotiation, the results are related to customer satisfaction. To this end, a quantitative survey was carried out, applying a closed, structured questionnaire to 532 negotiators of this bank working in the states of Santa Catarina, Rio de Janeiro and Maranhão, who are responsible for relationships, prospecting and closing business with the institution's clients in the individual, micro and small business, and government segments. The data were analyzed with statistical techniques, using the Partial Least Squares method. More than 40% of customer satisfaction was explained by the negotiators' subjective values. Among other results, the study found that business managers with an independent self-image value the Self and instrumental outcomes in a negotiation, and that cognitive interpersonal trust negatively moderates this relationship; and that business managers with an interdependent self-image value Rapport in a negotiation, a valuation that is positively related to customer satisfaction.
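
The sketch below uses PLS regression from scikit-learn as a simplified stand-in for the PLS path modelling (PLS-SEM) used in the dissertation, on simulated stand-in variables for the SVI dimensions, trust and customer satisfaction; it only illustrates the flavour of the method, not the actual model.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Hypothetical data: per-negotiator SVI dimension scores (Self, instrumental
# outcomes, Rapport) and an interpersonal-trust score as predictors, and an
# aggregate customer-satisfaction score as the outcome (all simulated).
rng = np.random.default_rng(6)
n = 532
X = rng.normal(size=(n, 4))                 # [self, instrumental, rapport, trust]
satisfaction = 0.2 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.8, size=n)

# PLS extracts latent components that maximize the covariance between the
# predictors and the outcome, then regresses the outcome on those components.
pls = PLSRegression(n_components=2)
pls.fit(X, satisfaction)

r2 = pls.score(X, satisfaction)             # share of satisfaction explained
print("R^2:", round(r2, 3))
print("predictor weights on first component:", pls.x_weights_[:, 0].round(2))
```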

Relevance: 60.00%

Abstract:

In order to illustrate an application of GARCH-family models to exchange rates, statistical techniques were used encompassing multivariate principal component analysis and time series analysis with modelling of the mean and the variance (volatility), the first and second moments respectively. Principal component analysis helps reduce the dimensionality of the data, leading to the estimation of a smaller number of models without losing information from the original data set. The use of GARCH models, in turn, is justified by the presence of heteroscedasticity in the variance of the exchange-rate return series. Based on the estimated models, new daily series were simulated via the Monte Carlo (MC) method, which served as the basis for estimating confidence intervals for future exchange-rate scenarios. For the proposed application, the exchange rates with the largest market share were selected, according to the BIS survey published every three years.
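
To make the simulation step concrete, the sketch below draws Monte Carlo paths from a GARCH(1,1) model with illustrative parameters and derives a confidence interval for the cumulative return over the horizon; in practice the parameters would be estimated from the exchange-rate return series (for example with a dedicated volatility-modelling package) rather than assumed.

```python
import numpy as np

# Illustrative GARCH(1,1) parameters (in practice estimated from the data).
mu, omega, alpha, beta = 0.0, 1e-6, 0.08, 0.90

def simulate_garch_paths(n_days=250, n_paths=10_000, seed=7):
    """Monte Carlo simulation of daily returns under GARCH(1,1):
    sigma^2_t = omega + alpha * eps^2_{t-1} + beta * sigma^2_{t-1}."""
    rng = np.random.default_rng(seed)
    var = np.full(n_paths, omega / (1 - alpha - beta))   # unconditional variance
    eps = np.zeros(n_paths)
    returns = np.empty((n_days, n_paths))
    for t in range(n_days):
        var = omega + alpha * eps**2 + beta * var
        eps = rng.normal(0.0, np.sqrt(var))
        returns[t] = mu + eps
    return returns

paths = simulate_garch_paths()
cumulative = paths.sum(axis=0)                            # log-return over horizon
lo, hi = np.percentile(cumulative, [2.5, 97.5])
print(f"95% interval for the 250-day cumulative return: [{lo:.3f}, {hi:.3f}]")
```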

Relevance: 60.00%

Abstract:

Strengthening democracy necessarily involves access to information, which has been regulated by legal frameworks over the last few decades. However, we know that practice does not always match theory. Verifying how Brazilian municipalities have been implementing their obligations under the Access to Information Law (LAI) involves not only assessing compliance but also investigating the causes of any gaps in that compliance and, more than that, trying to identify the variables that most decisively affect implementation. Starting from this premise, we sought to identify such factors and quantify the impact of each of them on the score obtained in one of the recently created transparency indices, the Escala Brasil Transparente (EBT). This task was accomplished through the following actions: (i) a bibliographic review of the academic literature; (ii) cataloguing and analysis of the legal instruments on transparency; (iii) a survey of the variables that determine transparency; (iv) establishing the relationship between these variables and the transparency scores found, through statistical techniques, especially correlation and regression. The research found that the academic literature on this topic in the field of Public Administration is still virtually nonexistent and that investigation of the causes of transparency is needed; and that, given the correlation found between the selected variables and transparency, the conclusions point to promising hypotheses that deserve more detailed study, using qualitative techniques, in order to determine cause and effect more precisely.
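
A minimal sketch of step (iv), assuming a hypothetical municipal data set: candidate explanatory variables are correlated with the EBT score and then entered into a multiple linear regression. The variable names and values below are invented for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical municipal data set: candidate explanatory variables and the
# EBT transparency score (all values simulated for illustration only).
rng = np.random.default_rng(8)
n = 300
df = pd.DataFrame({
    "log_population": rng.normal(10, 1.5, n),
    "gdp_per_capita": rng.normal(20, 8, n),
    "hdi":            rng.normal(0.7, 0.05, n),
})
df["ebt_score"] = (0.8 * df["hdi"] * 10 + 0.3 * df["log_population"]
                   + rng.normal(0, 1.5, n)).clip(0, 10)

# Correlation of each candidate variable with the EBT score,
# followed by a multiple linear regression of the score on all of them.
print(df.corr()["ebt_score"].round(2))

X = sm.add_constant(df[["log_population", "gdp_per_capita", "hdi"]])
ols = sm.OLS(df["ebt_score"], X).fit()
print(ols.summary().tables[1])
```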

Relevance: 60.00%

Abstract:

The main purpose of this research was to investigate the factors that discriminate between survival and failure of micro and small businesses, and the implications of these factors for public policies for entrepreneurship in the State of Rio Grande do Norte. The data were provided by SEBRAE/RN and the Commercial Board of the State of Rio Grande do Norte and covered the businesses registered in 2000, 2001 and 2002. Based on the theoretical framework, three groups of factors were defined (Business Financial Structure, Entrepreneurial Preparation and Entrepreneurial Behavior), and the factors were studied in order to determine whether or not they discriminate between business survival and failure. A quantitative approach was applied using advanced multivariate statistical techniques, beginning with factor analysis and then using discriminant analysis. As a result, canonical discriminant functions were found that partially explain business survival and failure in terms of the factors and groups of factors. The analysis also permitted an evaluation of the public policies for entrepreneurship, and it was verified that, in the entrepreneurs' view, these policies were only weakly effective in preventing business failure. Changes to these policies were suggested based on the most significant factors found.
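
A hedged sketch of the two-step analysis named above, on simulated firm-level data: factor analysis first reduces the questionnaire items to factor scores, and discriminant analysis then separates surviving from failed firms. The items, labels and factor structure are placeholders, not the study's data.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical firm-level data: questionnaire items covering financial
# structure, preparation and behavior, plus a survival/failure label
# (1 = still active, 0 = closed). All values are simulated.
rng = np.random.default_rng(9)
n_firms, n_items = 400, 15
items = rng.normal(size=(n_firms, n_items))
survived = (items[:, 0] + 0.8 * items[:, 5] + rng.normal(size=n_firms) > 0).astype(int)

# Step 1: factor analysis reduces the items to three factor scores.
factors = FactorAnalysis(n_components=3, random_state=0).fit_transform(items)

# Step 2: discriminant analysis separates surviving from failed firms using
# the factor scores; cross-validation gauges how well it discriminates.
lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, factors, survived, cv=5)
lda.fit(factors, survived)

print("cross-validated accuracy:", accuracy.mean().round(2))
print("discriminant coefficients per factor:", lda.coef_.round(2))
```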

Relevance: 60.00%

Abstract:

Self-efficacy, the construct developed by Albert Bandura in 1977 and widely studied around the world, refers to an individual's belief in his or her own capacity to successfully perform a given activity. This study aims to determine the degree of association between sociodemographic characteristics and professional training and the levels of Self-Efficacy at Work (SEW) of Administrative Assistants at a federal university. It is a descriptive study, submitted to and approved by the Ethics Committee of UFRN. Data analysis, quantitative in nature, was carried out with the aid of the statistical programs R and Minitab. The research instrument comprised a sociodemographic questionnaire, professional training variables and the General Perception of Self-efficacy Scale (GPSES), applied to a sample of 289 Administrative Assistants. The statistical techniques used for data analysis were descriptive statistics, cluster analysis, a reliability test (Cronbach's alpha) and a significance test (Pearson). The results show a sociodemographic profile of the UFRN Administrative Assistants with well-distributed characteristics: 48.4% male and 51.6% female; 59.9% aged over 40 years, married (49.3%), of white color or race (58%) and Catholic (67.8%); families are composed of up to four people (75.8%), with children (59.4%) in all age groups; the mothers of these professionals are mostly housewives (51.6%), and both fathers (72%) and mothers (75.8%) have at most a high school education. Administrative Assistants have high levels of professional training and fall mostly into two groups: recently hired civil servants (30.7%) and those with long service (59%); the majority enter the career young and stay until retirement, and 72.4% of these professionals have training above the minimum required for the job. The analysis of SEW levels shows medium to high levels for 72% of the Administrative Assistants; those classified as low SEW showed a relatively high mean of 2.7, close to the overall mean of 2.9 reported in other studies. The cluster analysis allows us to say that the characteristics of the three groups (Low, Medium and High SEW) are similar, and representatives with all the characteristics investigated can be found at the three SEW levels. The results indicate no association between the sociodemographic and professional-training variables and the level of self-efficacy at work of the UFRN Administrative Assistants, except for the variable color or race. However, given the small number of people who declared themselves black (4% of the sample), this result may be mere coincidence, or the black respondents in this study may have reported a higher sense of efficacy than the white and brown ones. The study corroborates other studies and highlights the subjectivity of the self-efficacy construct. Further research, especially with public servants, is needed to continue and expand studies on the subject and make it possible to compare and confirm the results.
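
One of the techniques listed, Cronbach's alpha, is small enough to sketch directly; the function below computes it for a simulated respondents-by-items matrix standing in for the GPSES responses (the data are invented, not the study's).

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical GPSES-like responses: 289 respondents answering 10 Likert items
# driven by a common latent trait (simulated, not the study's data).
rng = np.random.default_rng(10)
trait = rng.normal(size=(289, 1))
responses = np.clip(np.rint(3 + trait + rng.normal(scale=0.8, size=(289, 10))), 1, 5)

print("Cronbach's alpha:", round(cronbach_alpha(responses), 2))
```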

Relevance: 60.00%

Abstract:

This study presents the results of an exploratory-descriptive investigation that aimed to identify the latent dimensions of communication and the relations between these dimensions and organizational image. The sample totalled 267 respondents: 89 managers or owners and 178 salespeople of clothing and footwear stores located in the five main shopping centers of Natal, capital of Rio Grande do Norte. Data were collected using two structured, validated instruments, with answers measured on a 6-point Likert scale. Communication was measured with the instrument developed by Downs and Hazen (2002), composed of 8 latent dimensions and 32 indicators; image was measured with the model of Mael and Ashforth (1992), which contains 5 indicators. The data were analyzed using the statistical techniques of factor analysis and structural equation modeling. The factor analysis showed communication to be formed by five latent dimensions. The structural model, in turn, showed positive relations between communication and organizational image: image is influenced by communication with the supervisor and by organizational integration, and is most strongly explained by vertical communication.
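
A hedged sketch of the kind of structural model described, assuming the third-party semopy package and simulated data: a latent image factor is measured by two indicators and regressed on three communication dimensions. The variable names, the model description and the data are placeholders, and the study's actual measurement model is not reproduced.

```python
import numpy as np
import pandas as pd
import semopy  # assumed third-party SEM package

# Hypothetical respondent-level data: three communication dimensions and two
# organizational-image indicators (values simulated for illustration only).
rng = np.random.default_rng(11)
n = 267
comm = pd.DataFrame(rng.normal(size=(n, 3)),
                    columns=["vertical", "supervisor", "integration"])
latent_image = (0.6 * comm["vertical"] + 0.3 * comm["supervisor"]
                + rng.normal(scale=0.7, size=n))
data = comm.assign(img1=latent_image + rng.normal(scale=0.3, size=n),
                   img2=latent_image + rng.normal(scale=0.3, size=n))

# Structural equation model: a latent 'image' factor measured by img1/img2 and
# regressed on the three communication dimensions (lavaan-style description).
description = """
image =~ img1 + img2
image ~ vertical + supervisor + integration
"""
model = semopy.Model(description)
model.fit(data)
print(model.inspect())   # loadings and structural path estimates
```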