1000 results for Rademacher averages


Relevance:

10.00%

Publisher:

Abstract:

In an influential paper, Pesaran [Pesaran, M.H. (2006). Estimation and inference in large heterogeneous panels with a multifactor error structure. Econometrica 74, 967–1012] proposes a very simple estimator of factor-augmented regressions that has since become very popular. In this note we demonstrate how the presence of correlated factor loadings can render this estimator inconsistent.
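
For context, Pesaran's common correlated effects (CCE) approach augments each unit's regression with cross-section averages that proxy the unobserved common factors; schematically (a sketch of the setup, not notation taken from the note itself),

\[ y_{it} = \alpha_i + \beta_i' x_{it} + \gamma_i' \bar{z}_t + e_{it}, \qquad \bar{z}_t = (\bar{y}_t, \bar{x}_t')', \]

where \( \bar{y}_t \) and \( \bar{x}_t \) are the cross-section averages of the dependent variable and the regressors at time \( t \). The note's point is that correlation among the factor loadings can invalidate this proxying and render the resulting estimator inconsistent.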

Relevance:

10.00%

Publisher:

Abstract:

There is currently no universally recommended and accepted method of data processing within the science of indirect calorimetry for either mixing chamber or breath-by-breath systems of expired gas analysis. Exercise physiologists were first surveyed to determine methods used to process oxygen consumption (V̇O2) data, and current attitudes to data processing within the science of indirect calorimetry. Breath-by-breath datasets obtained from indirect calorimetry during incremental exercise were then used to demonstrate the consequences of commonly used time, breath and digital filter post-acquisition data processing strategies. Assessment of the variability in breath-by-breath data was determined using multiple regression based on the independent variables ventilation (VE), and the expired gas fractions for oxygen and carbon dioxide, FEO2 and FECO2, respectively. Based on the results of explanation of variance of the breath-by-breath V̇O2 data, methods of processing to remove variability were proposed for time-averaged, breath-averaged and digital filter applications. Among exercise physiologists, the strategy used to remove the variability in sequential V̇O2 measurements varied widely, and consisted of time averages (30 sec [38%], 60 sec [18%], 20 sec [11%], 15 sec [8%]), a moving average of five to 11 breaths (10%), and the middle five of seven breaths (7%). Most respondents indicated that they used multiple criteria to establish maximum V̇O2 (V̇O2max) including: the attainment of age-predicted maximum heart rate (HRmax) [53%], respiratory exchange ratio (RER) >1.10 (49%) or RER >1.15 (27%) and a rating of perceived exertion (RPE) of >17, 18 or 19 (20%). The reasons stated for these strategies included their own beliefs (32%), what they were taught (26%), what they read in research articles (22%), tradition (13%) and the influence of their colleagues (7%). The combination of VE, FEO2 and FECO2 removed 96-98% of V̇O2 breath-by-breath variability in incremental and steady-state exercise V̇O2 data sets, respectively. Correction of residual error in V̇O2 datasets to 10% of the raw variability results from application of a 30-second time average, 15-breath running average, or a 0.04 Hz low cut-off digital filter. Thus, we recommend that once these data processing strategies are used, the peak or maximal value becomes the highest processed datapoint. Exercise physiologists need to agree on, and continually refine through empirical research, a consistent process for analysing data from indirect calorimetry.
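
As an illustration of the time- and breath-averaging strategies recommended above, a minimal Python sketch on simulated data (not code from the study; the sampling pattern, noise level and variable names are assumptions):

    import numpy as np
    import pandas as pd

    # Simulated breath-by-breath data: breath timestamps (s) and VO2 (L/min).
    rng = np.random.default_rng(0)
    t = np.cumsum(rng.uniform(1.5, 4.0, 300))               # breath times during a ramp test
    vo2 = 1.5 + 0.004 * t + rng.normal(0.0, 0.15, t.size)   # noisy breath-by-breath VO2

    bb = pd.Series(vo2, index=pd.to_timedelta(t, unit="s"))

    vo2_30s = bb.resample("30s").mean()                       # 30-second time average
    vo2_15breath = bb.rolling(window=15, center=True).mean()  # 15-breath running average

    # Peak VO2 is then taken as the highest processed datapoint, as recommended.
    vo2_peak = max(vo2_30s.max(), vo2_15breath.max())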

Relevance:

10.00%

Publisher:

Abstract:

The mean defined by Bonferroni in 1950 (known by the same name) averages all non-identical product pairs of the inputs. Its generalizations to date have been able to capture unique behavior that may be desired in some decision-making contexts, such as the ability to model mandatory requirements. In this paper, we propose a composition that averages conjunctions between the respective means of a designated subset-size partition. We investigate the behavior of such a function and note the relationship within a given family as the subset size is changed. We find that the proposed function is able to more intuitively handle multiple mandatory requirements or mandatory input sets.
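
For reference, the mean introduced by Bonferroni in 1950, which the composition proposed here generalizes, and its standard parametric extension are (textbook forms, not the paper's new operator):

\[ B(x_1,\dots,x_n) = \left( \frac{1}{n(n-1)} \sum_{i \ne j} x_i x_j \right)^{1/2}, \qquad B^{p,q}(x_1,\dots,x_n) = \left( \frac{1}{n(n-1)} \sum_{i \ne j} x_i^{p} x_j^{q} \right)^{\frac{1}{p+q}}. \]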

Relevance:

10.00%

Publisher:

Abstract:

This report summarizes the development of an occupational exposure database and surveillance system for use by health and safety professionals at the Rocky Flats Environmental Technology Site (RFETS), a former nuclear weapons production facility. The site itself is currently in the cleanup stage, with work expected to continue into 2006. The system was developed with the intent of helping health and safety personnel not only to manage and analyze exposure monitoring data, but also to identify exposure determinants during the highly variable cleanup work. Core data elements were established through a series of focused meetings with health and safety personnel from two of the major contractors at RFETS. These data elements were selected based on their utility for analysis and identification of exposure determinants. A task-based coding scheme was employed to better define the highly variable work. The coding scheme consisted of a two-tiered hierarchical list with a total of 34 possible combinations of work type and task. The data elements were incorporated into a Microsoft Access database with built-in data entry features to both promote consistency and limit entry choices, enabling stratified analyses. In designing the system, emphasis was placed on the ability of end users to perform complex analyses and multiparameter queries to identify trends in their exposure data. A very flexible and user-friendly report generator was built into the system, allowing users to perform multiparameter queries through an intuitive interface with very little training. In addition, a number of automated graphical analyses were built into the system, including exposure levels by any combination of building, date, employee, job classification, type of contaminant, work type or task; exposure levels over time; exposure levels relative to the permissible exposure limit (PEL); and distributions of exposure levels. Both of these interfaces allow the user to "drill down", or gradually narrow query criteria, to identify specific exposure determinants. A number of other industrial hygiene processes were automated through this database. Exposure calculations were coded into the system to allow automatic calculation of time-weighted averages and sample volumes. In addition, a table containing all the PELs and other relevant occupational exposure limits was built into the system to allow automatic comparisons with current standards. Finally, the process of generating reports for employee notification was automated. The implementation of this system demonstrates that an integrated database system can save time for a practicing hygienist as well as provide useful and, more importantly, timely information to guide primary prevention efforts.
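
A minimal Python sketch of the kind of time-weighted average calculation the report says was automated (the 8-hour reference period, data layout and example limit are assumptions, not details taken from the RFETS system):

    def eight_hour_twa(samples):
        """8-hour time-weighted average exposure.

        samples: list of (concentration, duration_minutes) pairs for one shift.
        Returns the TWA referenced to a 480-minute (8-hour) period.
        """
        total = sum(conc * minutes for conc, minutes in samples)
        return total / 480.0

    # Example: three tasks at different concentrations (mg/m3) and durations (min).
    twa = eight_hour_twa([(0.12, 120), (0.05, 240), (0.30, 60)])
    exceeds_limit = twa > 0.1   # compare against a hypothetical 0.1 mg/m3 limit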

Relevance:

10.00%

Publisher:

Abstract:

Averaging behaviour of aggregation functions depends on the fundamental property of monotonicity with respect to all arguments. Unfortunately, this is a limiting property that excludes many important averaging functions from the theoretical framework. We propose a definition of weakly monotone averaging functions that encompasses the averaging aggregation functions in a framework together with many commonly used non-monotonic means. Weakly monotonic averages are robust to outliers and noise, making them extremely important in practical applications. We show that several robust estimators of location are actually weakly monotone and we provide sufficient conditions for weak monotonicity of the Lehmer and Gini means and some mixture functions. In particular we show that mixture functions with Gaussian kernels, which arise frequently in image and signal processing applications, are actually weakly monotonic averages. Our concept of weak monotonicity provides a sound theoretical and practical basis for understanding both monotone and non-monotone averaging functions within the same framework. This allows us to effectively relate these previously disparate areas of research and gain a deeper understanding of averaging aggregation methods. © Springer International Publishing Switzerland 2014.
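
In the notation standard in this line of work, the relaxation referred to here can be stated as follows (a restatement of the published definition, not new material):

\[ F \text{ is weakly monotone if } F(x_1 + c, \dots, x_n + c) \ge F(x_1, \dots, x_n) \text{ for all } (x_1,\dots,x_n) \text{ in the domain and all } c > 0, \]

whereas standard monotonicity requires \( F(\mathbf{x} + \mathbf{h}) \ge F(\mathbf{x}) \) for every componentwise non-negative increment \( \mathbf{h} \).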

Relevance:

10.00%

Publisher:

Abstract:

Monotonicity with respect to all arguments is fundamental to the definition of aggregation functions, which are one of the basic tools in knowledge-based systems. The functions known as means (or averages) are idempotent and typically monotone, yet there are many important classes of means that are non-monotone. Weak monotonicity was recently proposed as a relaxation of the monotonicity condition for averaging functions. In this paper we discuss the concepts of directional and cone monotonicity, and monotonicity with respect to a majority of inputs and coalitions of inputs. We establish the relations between the various kinds of monotonicity and illustrate them with examples. We also provide a construction method for cone monotone functions.
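
For concreteness, directional monotonicity with respect to a direction vector \( \mathbf{r} \), of which weak monotonicity is the special case \( \mathbf{r} = (1,\dots,1) \), is standardly defined as (a sketch of the definition, not text from the paper):

\[ F \text{ is } \mathbf{r}\text{-monotone if } F(\mathbf{x} + c\,\mathbf{r}) \ge F(\mathbf{x}) \text{ whenever } c > 0 \text{ and } \mathbf{x} + c\,\mathbf{r} \text{ remains in the domain}, \]

and cone monotonicity asks for \( \mathbf{r} \)-monotonicity for every direction \( \mathbf{r} \) in a given cone.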

Relevance:

10.00%

Publisher:

Abstract:

Pixel-scale fine details are often lost during image processing tasks such as image reduction and filtering. Block- or region-based algorithms typically rely on averaging functions to implement the required operation, and traditional function choices struggle to preserve small, spatially cohesive clusters of pixels which may be corrupted by noise. This article proposes the construction of fuzzy measures of cluster compactness to account for the spatial organisation of pixels. We present two construction methods (minimum spanning trees and fuzzy measure decomposition) to generate measures with specific properties: monotonicity with respect to cluster size; invariance with respect to translation, reflection and rotation; and discrimination between pixel sets of fixed cardinality with different spatial arrangements. We apply these measures within a non-monotonic mode-like averaging function used for image reduction and we show that this new function preserves pixel-scale structures better than existing monotone averages.
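
A minimal Python sketch of one way to score the spatial compactness of a pixel cluster with a minimum spanning tree, in the spirit of the first construction method mentioned above (the normalisation and function name are illustrative, not the article's exact measure):

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree
    from scipy.spatial.distance import pdist, squareform

    def mst_compactness(pixels):
        """Score in (0, 1]: a shorter MST over the pixel coordinates means a more compact cluster.

        pixels: array of shape (k, 2) with distinct (row, col) coordinates of one cluster.
        """
        d = squareform(pdist(pixels))                # pairwise Euclidean distances
        mst_total = minimum_spanning_tree(d).sum()   # total edge length of the MST
        k = len(pixels)
        # Normalise by the MST length of a unit-spaced chain of k pixels (an assumed baseline).
        return min(1.0, (k - 1) / mst_total) if mst_total > 0 else 1.0

    tight = mst_compactness(np.array([[0, 0], [0, 1], [1, 0], [1, 1]]))  # 1.0
    loose = mst_compactness(np.array([[0, 0], [0, 5], [5, 0], [5, 5]]))  # 0.2
    # tight > loose: spatially cohesive pixel sets score higher.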

Relevance:

10.00%

Publisher:

Abstract:

Weak monotonicity was recently proposed as a relaxation of the monotonicity condition for averaging aggregation, and weakly monotone functions were shown to have desirable properties when averaging data corrupted with outliers or noise. We extended the study of weakly monotone averages by analyzing their ϕ-transforms, and we established weak monotonicity of several classes of averaging functions, in particular Gini means and mixture operators. Mixture operators with Gaussian weighting functions were shown to be weakly monotone for a broad range of their parameters. This study assists in identifying averaging functions suitable for data analysis and image processing tasks in the presence of outliers.
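
A minimal Python sketch of a mixture operator with a Gaussian weighting function of the kind analysed here (the parameter values are illustrative):

    import numpy as np

    def gaussian_mixture_operator(x, a=1.0, sigma=0.5):
        """Mixture operator M(x) = sum_i w(x_i) x_i / sum_i w(x_i)
        with Gaussian weighting w(t) = exp(-(t - a)^2 / (2 sigma^2))."""
        x = np.asarray(x, dtype=float)
        w = np.exp(-((x - a) ** 2) / (2.0 * sigma ** 2))
        return float(np.sum(w * x) / np.sum(w))

    # Inputs near a = 1 dominate; the outlier 5.0 receives negligible weight,
    # so the average stays close to 1 rather than being dragged upward.
    print(gaussian_mixture_operator([0.9, 1.0, 1.1, 5.0]))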

Relevance:

10.00%

Publisher:

Abstract:

Very little is known about the local power of second generation panel unit root tests that are robust to cross-section dependence. This article derives the local asymptotic power functions of the cross-section augmented Dickey-Fuller (CADF) and CIPS tests of Pesaran (2007), which are among the most popular such tests.

Relevance:

10.00%

Publisher:

Abstract:

This work aims to present the theoretical foundations of, and carry out a practical application of, one of the most important discoveries in the field of finance: the standard capital asset pricing model, the Capital Asset Pricing Model (CAPM). In the practical application, the performance of the investment returns required by the model was compared with the returns actually obtained. Five stocks with the largest relative weight in the Ibovespa theoretical portfolio and with returns published from June 1998 to May 2001 were analysed. The data were obtained from the Economática database at UFRGS and tested using a paired two-sample t-test for means in MS Excel. The results were tabulated and analysed, leading to the conclusion that, statistically, at a 95% confidence level, there was no difference in performance between the expected returns and the returns actually obtained for the assets studied in this dissertation over the period considered.
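
For reference, the standard CAPM relation used to compute the required returns that the study compares with realised returns is

\[ E(R_i) = R_f + \beta_i \left[ E(R_m) - R_f \right], \]

where \( R_f \) is the risk-free rate, \( E(R_m) \) the expected return on the market portfolio (proxied in applications of this kind by an index such as the Ibovespa) and \( \beta_i \) the asset's sensitivity to market movements.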

Relevance:

10.00%

Publisher:

Abstract:

We apply the concept of exchangeable random variables to the case of non-additive probability distributions exhibiting uncertainty aversion, in the class generated by a convex core (convex non-additive probabilities with a convex core). We are able to prove two versions of the law of large numbers (de Finetti's theorems). By making use of two definitions of independence we prove two versions of the strong law of large numbers. It turns out that we cannot assure the convergence of the sample averages to a constant. We then model the case in which there is a "true" probability distribution behind the successive realizations of the uncertain random variable. In this case convergence occurs. This result is important because it renders true the intuition that it is possible "to learn" the "true" additive distribution behind an uncertain event if one repeatedly observes it (a sufficiently large number of times). We also provide a conjecture regarding the "learning" (or updating) process above, and prove a partial result for the case of the Dempster-Shafer updating rule and binomial trials.
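
For reference, the Dempster-Shafer updating (conditioning) rule mentioned in the last sentence can be written in terms of the plausibility function \( Pl \) as (the standard textbook form, with notation assumed here):

\[ Pl(A \mid B) = \frac{Pl(A \cap B)}{Pl(B)}, \qquad Bel(A \mid B) = 1 - Pl(A^{c} \mid B). \]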

Relevance:

10.00%

Publisher:

Abstract:

The new economy, stimulated by the growing use of the Internet as a means of interaction between people and businesses, has been changing companies' management routines. Today, Internet access takes place over telephone lines, at home and in offices, or through Intranet-type networks, with users at their workstations. Personal, mobile Internet access will certainly change the way we contact the outside world and will make it possible to offer customers a wider variety of products and services. The aims of this study are: (i) to examine customer behaviour towards this new medium, presented on the digital screens of mobile phones; (ii) to explore some of its main features; and (iii) to compare them with the features of other media, such as radio and TV. The research methodology was data collection through telephone interviews. To obtain a 95% confidence interval, answers were collected from 470 current users of WAP, the Internet, radio and TV. To compare the media, the Fishbein multi-attribute model was used, which makes it possible to form comparative scores for the attributes constructed for the research. To frame the discussion of public behaviour regarding the differentiated use of these media, the study draws on McLuhan's (1969) theoretical framework, especially his distinction between hot and cold media. This theoretical framework, however, was not supported by the results of the quantitative study: the comparison of the average scores of the constructed attributes differentiated WAP both from the hot medium, radio, and from the cold medium, TV, showing that the emergence of this new access technology not only extends the use of the Internet but also points to the possibility of a new theoretical classification for this new medium. The quantitative research revealed that, from the users' point of view, the "Contents" category is considered one of the most important aspects of WAP; TV and radio received significantly lower scores on this dimension. The public considers it important that WAP information is "trustworthy", "easy" to find, "available", "sufficient" and that it meets the "urgency" expected by the user. The WAP score for "emotion" was lower than those of radio and TV and higher than that of the Internet, but the differences found are not significant. Given the low importance attached to the "emotion" attribute group, the use of WAP as a medium is not recommended when the aim is to reach users' emotional side.
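
For reference, the Fishbein multi-attribute model used for the media comparison scores each medium \( o \) as an evaluation-weighted sum of attribute beliefs (the standard formulation; the notation is assumed here):

\[ A_o = \sum_{i=1}^{n} b_{io}\, e_i, \]

where \( b_{io} \) is the respondent's belief that medium \( o \) possesses attribute \( i \) and \( e_i \) is the evaluation (importance) attached to attribute \( i \); averaging \( A_o \) across respondents yields the comparative scores referred to above.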

Relevance:

10.00%

Publisher:

Abstract:

Organisations are becoming aware that the change resulting from the transition from traditional to functional structures, from the implementation of new structures, and from the merging of organisational cultures is full of risks. Such change results from downsizing, mergers, acquisitions, spin-offs, joint ventures and other administrative, corporate and commercial alternatives. Given the need to adapt the business quickly to external demands, the risks are often not analysed, or are analysed only superficially or partially, which increases them and exposes business processes to potential fraud. The business control environment has proved to be an area of concern, especially at times of structural and organisational transition, both because of a lack of conceptual understanding of risk and of the importance of control, and because of the way changes are implemented. It can also be seen that some companies, usually large ones, have a structured system of controls in place, while others, usually small and medium-sized ones, do not; according to published surveys, it is among the latter that a larger number of frauds are found, which, in proportion to their assets, represent a substantial loss to their businesses. This study aims to present evidence of the contribution of a structured control system to minimising the occurrence of fraud in organisations.

Relevance:

10.00%

Publisher:

Abstract:

Research on seed technology contributes to the maintenance of germplasm banks. Kelissa brasiliensis (Baker) Ravenna and Sinningia lineata (Hjelmq.) Chautems are native species with ornamental potential. The objectives of this work were to develop seed germination tests, to assess the physiological quality of lots collected in situ, to develop vegetative propagation protocols and to understand reproductive strategies in the environment. Seed vigour was evaluated, as well as germination behaviour on different substrates and under different temperatures and light conditions. Experiments on vegetative propagation of S. lineata were carried out. The work was conducted at the Fepagro Seed Laboratory, at the Porto Alegre Botanical Garden and at the Faculty of Agronomy of UFRGS. During collection, the species' niches were observed. The experimental design was completely randomised, with four replicates of 10 and 20 seeds for K. brasiliensis and S. lineata, respectively; for S. lineata, three treatments with 17 replicates were used in the cutting experiment, and nine treatments with four replicates in the tuber-division test. Means were compared using Duncan's test (P<0.05). The most suitable temperature for germination of K. brasiliensis seeds was 10ºC, which explains the species' restricted distribution within the Pampa biome; the 24 h/25ºC combination led to a greater release of leachates in the electrical conductivity test; accelerated ageing (72 h at 41ºC and 100% humidity) did not cause a significant reduction in the germination percentage of K. brasiliensis seeds, whereas S. lineata seeds did not germinate after exposure to this stress. The most suitable temperature for the S. lineata germination test was 20ºC; for both species, paper substrate and the presence of light were the most suitable conditions for germination. S. lineata proved to be easily propagated both by seed and asexually, revealing the rusticity characteristic of rupicolous species.

Relevance:

10.00%

Publisher:

Abstract:

This study considered the theories of causal attribution and cognitive dissonance, taking the psychological need for self-consistency as a reference. Examination of the main theoretical and empirical aspects of these two theories led to the conclusion that they can be regarded as mutually compatible and complementary. An experiment was carried out in which subjects, previously classified as emotionally unstable or stable, were submitted to a strongly dissonant experimental condition with respect to the self-concept (solving seven items of an intelligence test, five of which were unsolvable). An introductory task of the same nature as the experimental one, but with solvable items of medium difficulty, was carried out first. The number of expected and estimated correct answers was measured (before and after each task), and each subject was also asked to choose, from an attributional list, the reason that, in their view, accounted for the result they estimated after the experimental task. The results indicated no significant difference between the mean estimated results of the two groups after the experimental task. The causal attributions of the two groups were significantly different, with the stable subjects leaning towards internality and the unstable ones towards externality. However, quite significant differences were found between the mean expected results before the introductory and experimental tasks (the mean of the unstable group being lower than that of the stable group). These results were interpreted as indicating an intense and early dissonant experience, at the level of outcome expectations, on the part of the unstable subjects. Dissonance reduction occurred predominantly through underestimation of the number of correct answers expected in the two proposed tasks. The broadest conclusion of this study highlighted the great importance of the personal meanings attached to perceived stimuli in eliciting responses. Further studies were suggested for a better understanding of the subject.