796 results for Empirical Algorithm Analysis
Abstract:
This paper considers the use of Association Rule Mining (ARM) and our proposed Transaction-based Rule Change Mining (TRCM) to identify the rule types present in tweets' hashtags over consecutive periods of time and their linkage to real-life occurrences. Our novel algorithm is termed TRCM-RTI, in reference to Rule Type Identification. We create Time Frame Windows (TFWs) to detect evolvement statuses and calculate the lifespan of hashtags in online tweets. We link RTI to real-life events by monitoring and recording rule evolvement patterns in TFWs on the Twitter network.
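A rough illustration of the TFW bookkeeping (a minimal sketch only: the per-window hashtag sets and the lifespan definition below are assumptions for illustration, not the authors' TRCM-RTI implementation):

```python
def hashtag_lifespans(windows):
    """Lifespan of each hashtag, measured in consecutive Time Frame
    Windows (TFWs) from its first to its last appearance.

    `windows` is a list of sets, one per TFW, each holding the hashtags
    seen in that window (hypothetical input format).
    """
    first_seen, last_seen = {}, {}
    for i, tags in enumerate(windows):
        for tag in tags:
            first_seen.setdefault(tag, i)
            last_seen[tag] = i
    return {tag: last_seen[tag] - first_seen[tag] + 1 for tag in first_seen}

# Example: '#worldcup' spans all three windows, '#raining' only the first.
tfws = [{"#worldcup", "#raining"}, {"#worldcup"}, {"#worldcup", "#final"}]
print(hashtag_lifespans(tfws))
# e.g. {'#worldcup': 3, '#raining': 1, '#final': 1} (key order may vary)
```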
Abstract:
During the last 30 years, significant debate has taken place regarding multilevel research. However, the extent to which multilevel research is overtly practiced remains to be examined. This article analyzes 10 years of organizational research within a multilevel framework (from 2001 to 2011). The goals of this article are (a) to understand what has been done, during this decade, in the field of organizational multilevel research and (b) to suggest new arenas of research for the next decade. A total of 132 articles were selected for analysis through ISI Web of Knowledge. Through a broad-based literature review, results suggest that there is a balance between the number of empirical and conceptual papers on multilevel research, with most studies addressing the cross-level dynamics between teams and individuals. In addition, this study found that time still has little presence in organizational multilevel research. Implications, limitations, and future directions are addressed at the end.

Organizations are made of interacting layers. That is, between layers (such as divisions, departments, teams, and individuals) there is often some degree of interdependence that leads to bottom-up and top-down influence mechanisms. Teams and organizations are contexts for the development of individual cognitions, attitudes, and behaviors (top-down effects; Kozlowski & Klein, 2000). Conversely, individual cognitions, attitudes, and behaviors can also influence the functioning and outcomes of teams and organizations (bottom-up effects; Arrow, McGrath, & Berdahl, 2000). For instance, an organization's reward system may influence employees' intention to quit and the presence or absence of extra-role behaviors. At the same time, many studies have shown the importance of bottom-up emergent processes that yield higher-level phenomena (Bashshur, Hernández, & González-Romá, 2011; Katz-Navon & Erez, 2005; Marques-Quinteiro, Curral, Passos, & Lewis, in press). For example, the affectivity of individual employees may influence their team's interactions and outcomes (Costa, Passos, & Bakker, 2012). Several authors agree that organizations must be understood as multilevel systems, meaning that adopting a multilevel perspective is fundamental to understanding real-world phenomena (Kozlowski & Klein, 2000). However, whether this agreement is reflected in the practice of multilevel research is less clear. In fact, how much is known about the quantity and quality of multilevel research done in the last decade? The aim of this study is to compare what has been proposed theoretically, concerning the importance of multilevel research, with what has actually been studied empirically and published. First, this article outlines a review of multilevel theory, followed by what has been theoretically “put forward” by researchers. Second, it presents what has actually been “practiced,” based on the results of a review of multilevel studies published from 2001 to 2011 in business and management journals. Finally, some barriers and challenges to true multilevel research are suggested. This study contributes to multilevel research by describing the last 10 years of research: it quantitatively depicts the types of articles being written and where the majority of publications on empirical and conceptual multilevel work can be found.
Abstract:
This paper describes the methodology used to compile a corpus called MorphoQuantics that contains a comprehensive set of 17,943 complex word types extracted from the spoken component of the British National Corpus (BNC). The categorisation of these complex words was derived primarily from the classification of Prefixes, Suffixes and Combining Forms proposed by Stein (2007). The MorphoQuantics corpus has been made available on a website of the same name; it lists 554 word-initial and 281 word-final morphemes in English, their etymology and meaning, and records the type and token frequencies of all the associated complex words containing these morphemes from the spoken element of the BNC, together with their Part of Speech. The results show that, although the number of word-initial affixes is nearly double that of word-final affixes, the relative number of each observed in the BNC is very similar; however, word-final affixes are more productive in that, on average, the frequency with which they attach to different bases is three times that of word-initial affixes. Finally, this paper considers how linguists, psycholinguists and psychologists may use MorphoQuantics to support their empirical work in first and second language acquisition, and clinical and educational research.
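A minimal sketch of the kind of type/token counting such a corpus records, on toy tokens with a greedy suffix match (the affix inventory and matching rule here are illustrative assumptions, not the MorphoQuantics methodology):

```python
from collections import Counter

def affix_frequencies(tokens, affixes):
    """Type and token frequencies of word-final affixes in a token list
    (a toy stand-in for the spoken component of the BNC)."""
    token_freq = Counter()
    types = {affix: set() for affix in affixes}
    for tok in tokens:
        for affix in affixes:
            if tok.endswith(affix) and len(tok) > len(affix):
                token_freq[affix] += 1
                types[affix].add(tok)
                break  # greedy: count each token under one affix only
    return {a: (len(types[a]), token_freq[a]) for a in affixes}

tokens = ["kindness", "darkness", "kindness", "quickly", "happily"]
print(affix_frequencies(tokens, ["ness", "ly"]))
# {'ness': (2, 3), 'ly': (2, 2)} -> (type frequency, token frequency)
```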
Abstract:
Incoherent scatter data from non-thermal F-region ionospheric plasma are analysed, using theoretical spectra predicted by Raman et al. It is found that values of the semi-empirical drift parameter D∗, associated with deviations of the ion velocity distribution from a Maxwellian, and the plasma temperatures can be rigorously deduced (the results being independent of the path of iteration) if the angle between the line-of-sight and the geomagnetic field is larger than about 15–20 degrees. For small aspect angles, the deduced value of the average (or 3-D) ion temperature remains ambiguous and the analysis is restricted to the determination of the line-of-sight temperature because the theoretical spectrum is insensitive to non-thermal effects when the plasma is viewed along directions almost parallel to the magnetic field. This limitation is expected to apply to any realistic model of the ion velocity distribution, and its consequences are discussed. Fit strategies which allow for mixed ion composition are also considered. Examples of fits to data from various EISCAT observing programmes are presented.
Abstract:
Given a dataset of two-dimensional points with integer coordinates, the proposed method reduces a set of n points to a set of s points, s ≤ n, such that the convex hull of the s points is the same as the convex hull of the original n points. The method runs in O(n) time and helps any convex hull algorithm run faster. Empirical analysis of a practical case shows a reduction in points of over 98%, which is reflected in faster computation, with a speedup factor of at least 4.
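The abstract does not spell out the reduction step, but a well-known O(n) pre-filter in exactly this spirit is the Akl-Toussaint heuristic, sketched below under the assumption that points are (x, y) tuples: points strictly inside the quadrilateral spanned by the four axis-extreme points cannot be hull vertices, so discarding them leaves the convex hull unchanged.

```python
def reduce_points(points):
    """O(n) pre-filter in the spirit of the Akl-Toussaint heuristic.

    Points strictly inside the quadrilateral of the leftmost, lowest,
    rightmost and highest points cannot lie on the convex hull, so the
    hull of the survivors equals the hull of the full set.
    """
    if len(points) < 5:
        return list(points)
    quad = [min(points, key=lambda p: p[0]), min(points, key=lambda p: p[1]),
            max(points, key=lambda p: p[0]), max(points, key=lambda p: p[1])]

    def strictly_inside(p):
        # Strictly inside the counter-clockwise quadrilateral iff strictly
        # left of every directed edge; degenerate edges make the test fail,
        # which only keeps extra points and never breaks correctness.
        for i in range(4):
            ax, ay = quad[i]
            bx, by = quad[(i + 1) % 4]
            if (bx - ax) * (p[1] - ay) - (by - ay) * (p[0] - ax) <= 0:
                return False
        return True

    return [p for p in points if not strictly_inside(p)]

pts = [(0, 5), (5, 0), (10, 5), (5, 10), (5, 5), (4, 6), (1, 1)]
print(reduce_points(pts))  # interior points (5, 5) and (4, 6) are dropped
```

Any exact hull algorithm can then be run on the reduced set; on clustered real-world data most points fall inside the quadrilateral, which is consistent with the reduction of over 98% reported above.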
Abstract:
Large waves pose risks to ships, offshore structures, coastal infrastructure and ecosystems. This paper analyses 10 years of in-situ measurements of significant wave height (Hs) and maximum wave height (Hmax) from the ocean weather ship Polarfront in the Norwegian Sea. During the period 2000 to 2009, surface elevation was recorded every 0.59 s during sampling periods of 30 min. The Hmax observations scale linearly with Hs on average. A widely-used empirical Weibull distribution is found to estimate average values of Hmax/Hs and Hmax better than a Rayleigh distribution, but tends to underestimate both for all but the smallest waves. In this paper we propose a modified Rayleigh distribution which compensates for the heterogeneity of the observed dataset: the distribution is fitted to the whole dataset and improves the estimate of the largest waves. Over the 10-year period, the Weibull distribution approximates the observed Hs and Hmax well, and an exponential function can be used to predict the probability distribution function of the ratio Hmax/Hs. However, the Weibull distribution tends to underestimate the occurrence of extremely large values of Hs and Hmax. The persistence of Hs and Hmax in winter is also examined. Wave fields with Hs>12 m and Hmax>16 m do not last longer than 3 h. Low-to-moderate wave heights that persist for more than 12 h dominate the relationship of the wave field with the winter NAO index over 2000–2009. In contrast, the inter-annual variability of wave fields with Hs>5.5 m or Hmax>8.5 m and wave fields persisting over ~2.5 days is not associated with the winter NAO index.
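For context, the Rayleigh baseline against which such fits are compared has a classical closed form: for N Rayleigh-distributed wave heights, the expected ratio Hmax/Hs is approximately sqrt(ln N / 2) + γ/sqrt(8 ln N), with γ Euler's constant. A sketch of this textbook result follows; the 180-wave example assumes a ~10 s mean period over a 30-min record:

```python
import math

def rayleigh_hmax_over_hs(n_waves, gamma=0.5772156649):
    """Expected Hmax/Hs for n_waves Rayleigh-distributed wave heights
    (classical asymptotic result; gamma is the Euler-Mascheroni constant)."""
    ln_n = math.log(n_waves)
    return math.sqrt(0.5 * ln_n) + gamma / math.sqrt(8.0 * ln_n)

# A 30-min record with a ~10 s mean period contains roughly 180 waves:
print(round(rayleigh_hmax_over_hs(180), 2))  # ~1.7
```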
Abstract:
This paper investigates the behaviour of residential property prices and examines the linkages between house price dynamics and bank herding behaviour. The analysis presents evidence that irrational behaviour may have played a significant role in several countries, including the United Kingdom, Spain, Denmark, Sweden and Ireland. In addition, we provide evidence indicative of herding behaviour in the European residential mortgage loan market. Granger causality tests indicate that non-fundamentally justified price dynamics contributed to herding by lenders and that this behaviour was a response by banks as a group to common information on residential property assets. In contrast, in Germany, Portugal and Austria, residential property prices were largely explained by fundamentals. These countries show no evidence of either irrational price bubbles or herd behaviour in the mortgage market, and Granger causality tests indicate that the two variables are independent.
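A minimal sketch of such a Granger causality test using statsmodels on synthetic series (the variable construction and lag choice are illustrative assumptions, not the paper's data):

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# Toy stand-ins: a non-fundamental price component and a lending (herding)
# measure built to lag prices by two periods.
rng = np.random.default_rng(0)
price_gap = rng.standard_normal(200).cumsum()
lending = np.roll(price_gap, 2) + rng.standard_normal(200)

# Null hypothesis: the second column does NOT Granger-cause the first,
# i.e. here we test whether price dynamics help predict lending.
data = np.column_stack([lending, price_gap])
results = grangercausalitytests(data, maxlag=4)
```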
Abstract:
This paper investigates the effect of Energy Performance Certificate (EPC) ratings on residential dwelling prices in Wales. It examines the capitalisation of energy efficiency ratings into house prices using two approaches. The first adopts a cross-sectional framework to investigate the effect of EPC band (and EPC rating) on a large sample of dwelling transactions. The second is based on a repeat-sales methodology to examine the impact of EPC band and rating on house price appreciation. The results show that, controlling for other price-influencing dwelling characteristics, EPC band does affect house prices. This observed influence of EPC on price may not be a result of energy performance alone; the effect may be due to non-energy-related benefits associated with certain types, specifications and ages of dwellings, or there may be unobserved quality differences unrelated to energy performance, such as better-quality fittings and materials. An analysis of the private rental segment reveals that, in contrast to the general market, low-EPC-rated properties were not traded at a significant discount, suggesting different implicit prices of potential energy savings for landlords and owner-occupiers.
Abstract:
In this paper we present a novel approach to detecting meetings between people. The proposed approach works by translating people's behaviour from trajectory information into semantic terms. Given a semantic model of meeting behaviour, event detection is performed in the semantic domain. The model is learnt using a soft-computing clustering algorithm that combines trajectory information with motion semantic terms, and a stable representation can be obtained from a series of examples. Results obtained on videos with different types of meeting situations show that the proposed approach can learn a generic model that can be effectively applied to the recognition of meeting behaviour.
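A minimal sketch of the soft-clustering step, using a Gaussian mixture as a stand-in for the paper's soft-computing algorithm (the two trajectory features and the walking/meeting states are illustrative assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy per-frame trajectory features: speed of a person and distance to the
# other tracked person (stand-ins for the paper's motion semantic terms).
rng = np.random.default_rng(1)
walking = np.column_stack([rng.normal(1.4, 0.2, 100), rng.normal(5.0, 1.0, 100)])
meeting = np.column_stack([rng.normal(0.1, 0.05, 100), rng.normal(0.8, 0.2, 100)])
features = np.vstack([walking, meeting])

# Soft clustering assigns each observation a membership degree for every
# cluster, giving a probabilistic mapping from trajectories to semantic states.
gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
memberships = gmm.predict_proba(features)  # each row sums to 1
```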
Abstract:
In this paper, we investigate the pricing of crack spread options. Particular emphasis is placed on the question of whether univariate modeling of the crack spread or explicit modeling of the two underlyings is preferable. To this end, we contrast a bivariate GARCH volatility model for cointegrated underlyings with the alternative of modeling the crack spread directly. In an empirical analysis of crude oil/heating oil and crude oil/gasoline crack spread options traded on the New York Mercantile Exchange, the simpler univariate approach is found to be superior with respect to option pricing performance.
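A minimal sketch of the univariate route using the arch package on synthetic spread returns (the data and the GARCH(1,1) specification are illustrative assumptions; the paper's exact model is not reproduced here):

```python
import numpy as np
from arch import arch_model

# Toy crack-spread returns (percent); the study uses NYMEX crude/heating-oil
# and crude/gasoline spreads.
rng = np.random.default_rng(0)
spread_returns = rng.standard_normal(500)

# Univariate route: fit a GARCH(1,1) to the spread itself instead of a
# bivariate GARCH for the two cointegrated underlyings.
am = arch_model(spread_returns, mean="Constant", vol="Garch", p=1, q=1)
res = am.fit(disp="off")
print(res.params)                   # mu, omega, alpha[1], beta[1]
forecast = res.forecast(horizon=1)  # one-step-ahead variance for pricing
```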
Abstract:
This paper investigates the effect of Energy Performance Certificate (EPC) ratings on residential prices in Wales. Drawing on a sample of approximately 192,000 transactions, the capitalisation of energy efficiency ratings into house prices is investigated using two approaches. The first adopts a cross-sectional framework to investigate the effect of EPC rating on price. The second applies a repeat-sales methodology to examine the impact of EPC rating on house price appreciation. Statistically significant positive price premiums are estimated for dwellings in EPC bands A/B (12.8%) and C (3.5%) compared to houses in band D. For dwellings in bands E (−3.6%) and F (−6.5%) there are statistically significant discounts. Such effects may not be the result of energy performance alone. In addition to energy cost differences, the price effect may be due to additional benefits of energy-efficient features. An analysis of the private rental segment reveals that, in contrast to the general market, low-EPC-rated dwellings were not traded at a significant discount. This suggests different implicit prices of potential energy savings for landlords and owner-occupiers.
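A hedged sketch of the cross-sectional approach: an OLS hedonic regression of log price on EPC-band dummies with band D as the reference, run on simulated data seeded with the premiums reported above (the covariate and data-generating process are illustrative assumptions):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated cross-section; the study itself uses ~192,000 Welsh transactions.
rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "epc_band": rng.choice(list("BCDEF"), n),
    "floor_area": rng.normal(90, 20, n),
})
premium = {"B": 0.128, "C": 0.035, "D": 0.0, "E": -0.036, "F": -0.065}
df["log_price"] = (11.5 + 0.006 * df["floor_area"]
                   + df["epc_band"].map(premium)
                   + rng.normal(0, 0.1, n))

# Hedonic regression: EPC-band dummies relative to band D, controlling for
# other price-influencing characteristics (only floor area in this toy).
model = smf.ols('log_price ~ C(epc_band, Treatment("D")) + floor_area', df)
print(model.fit().params)
```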
Abstract:
Purpose – This paper sheds light on the practice of incomplete corporate disclosure of quantitative greenhouse gas (GHG) emissions and investigates whether external stakeholder pressure influences the existence, and separately the completeness, of voluntary GHG emissions disclosures by 431 European companies.
Design/methodology/approach – A classification of reporting completeness is developed with respect to the scope, type and reporting boundary of GHG emissions, based on the guidelines of the GHG Protocol, the Global Reporting Initiative and the Carbon Disclosure Project. Logistic regression analysis is applied to examine whether proxies for exposure to climate change concerns from different stakeholder groups influence the existence and/or completeness of quantitative GHG emissions disclosure.
Findings – From 2005 to 2009, on average only 15 percent of companies that disclose GHG emissions report them in a manner the authors consider complete. Results of the regression analyses suggest that external stakeholder pressure is a determinant of the existence, but not the completeness, of emissions disclosure. The findings are consistent with stakeholder theory arguments that companies respond to external stakeholder pressure to report GHG emissions, but also with legitimacy theory claims that firms can use carbon disclosure, in this case the incomplete reporting of emissions, as a symbolic act to address legitimacy exposures.
Practical implications – Bringing corporate GHG emissions disclosure in line with recommended guidelines will require either more direct stakeholder pressure or, perhaps, a mandated disclosure regime. In the meantime, users of the data will need to carefully consider the relevance of the reported figures and develop the necessary competencies to detect and control for their incompleteness. A more troubling concern is that stakeholders may instead grow to accept less than complete disclosure.
Originality/value – The paper represents the first large-scale empirical study of the completeness of companies' disclosure of quantitative GHG emissions and is the first to analyze these disclosures in the context of stakeholder pressure and its relation to legitimation.
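A minimal sketch of the regression design on simulated firm-level data (the pressure proxies and coefficients are illustrative assumptions, not the paper's measures):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data for 431 firms; proxies are illustrative stand-ins.
rng = np.random.default_rng(0)
n = 431
df = pd.DataFrame({
    "media_exposure": rng.normal(size=n),
    "regulatory_pressure": rng.normal(size=n),
    "firm_size": rng.normal(size=n),
})
logit_p = 0.8 * df["media_exposure"] + 0.5 * df["firm_size"] - 0.3
df["discloses"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

# Logistic regression for the *existence* of disclosure; an analogous model
# on the disclosing subsample would address *completeness*.
fit = smf.logit("discloses ~ media_exposure + regulatory_pressure + firm_size",
                df).fit(disp=0)
print(fit.summary())
```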
Abstract:
Considering the sea ice decline in the Arctic during recent decades, polynyas are of high research interest, since these features are core areas of new ice formation. Determining ice formation requires accurate retrieval of polynya area and of the thin-ice thickness (TIT) distribution within the polynya. We use an established energy balance model to derive TITs from MODIS ice surface temperatures (Ts) and NCEP/DOE Reanalysis II data in the Laptev Sea for two winter seasons. Improvements to the algorithm mainly concern the implementation of an iterative approach to calculate the atmospheric flux components, taking the atmospheric stratification into account. Furthermore, a sensitivity study is performed to analyze the errors of the ice thickness. The results are the following: 1) 2-m air temperatures (Ta) and Ts have the highest impact on the retrieved ice thickness; 2) an overestimation of Ta yields smaller ice thickness errors than an underestimation of Ta; 3) NCEP Ta often shows a warm bias; and 4) the mean absolute error for ice thicknesses up to 20 cm is ±4.7 cm. Based on these results, we conclude that, despite the shortcomings of the NCEP data (coarse spatial resolution and absence of polynyas), this data set is appropriate in combination with MODIS Ts for the retrieval of TITs up to 20 cm in the Laptev Sea region. The TIT algorithm can be applied to other polynya regions and to past and future time periods. Our TIT product is a valuable data set for the verification of other model and remote sensing ice thickness data.
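A minimal sketch of the steady-state idea behind such retrievals: for thin ice, the conductive flux through the ice balances the net atmospheric heat loss, so thickness follows directly (the constants and the simplified balance are illustrative assumptions; the paper's iterative flux calculation is not reproduced here):

```python
def thin_ice_thickness(t_surface, q_atm, t_freeze=-1.86, k_ice=2.03):
    """Thin-ice thickness (m) from a steady-state energy balance: the
    conductive flux k_ice * (t_freeze - t_surface) / h through the ice is
    set equal to the net atmospheric heat loss q_atm (W m^-2).

    t_surface : ice surface temperature, e.g. from MODIS (deg C)
    q_atm     : net atmospheric flux, positive for heat loss from the surface
    """
    if q_atm <= 0:
        raise ValueError("retrieval is valid only for net surface heat loss")
    return k_ice * (t_freeze - t_surface) / q_atm

# A -15 deg C surface under a 150 W m^-2 net heat loss gives ~18 cm of ice.
print(round(thin_ice_thickness(-15.0, 150.0), 3))  # 0.178
```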
Abstract:
A new sparse kernel density estimator is introduced, based on the minimum integrated square error criterion combined with local component analysis for the finite mixture model. We start with a Parzen window estimator that has Gaussian kernels with a common covariance matrix; local component analysis is first applied to find this covariance matrix using the expectation-maximization algorithm. Since the mixing coefficients of a finite mixture model are constrained to the multinomial manifold, we then use the well-known Riemannian trust-region algorithm to find the set of sparse mixing coefficients. The first- and second-order Riemannian geometry of the multinomial manifold is utilized in the Riemannian trust-region algorithm. Numerical examples demonstrate that the proposed approach is effective in constructing sparse kernel density estimators with accuracy competitive with existing kernel density estimators.
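A minimal sketch of the Parzen window starting point, with Gaussian kernels sharing one covariance matrix (the local component analysis and the Riemannian trust-region sparsification are not reproduced; data and covariance are illustrative):

```python
import numpy as np

def parzen_density(x, data, cov):
    """Parzen window estimate at the query points x: an equal-weight
    mixture of Gaussian kernels centred on the data points, all sharing
    a single covariance matrix."""
    d = data.shape[1]
    cov_inv = np.linalg.inv(cov)
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    diffs = x[:, None, :] - data[None, :, :]              # (n_x, n_data, d)
    mahal = np.einsum("ijk,kl,ijl->ij", diffs, cov_inv, diffs)
    return norm * np.exp(-0.5 * mahal).mean(axis=1)       # average kernel

rng = np.random.default_rng(0)
data = rng.standard_normal((200, 2))
grid = np.array([[0.0, 0.0], [2.0, 2.0]])
print(parzen_density(grid, data, cov=0.25 * np.eye(2)))
```

A sparse estimator then replaces the uniform 1/n weights with mixing coefficients on the multinomial manifold, most of which are driven to zero.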
Abstract:
This paper demonstrates by means of joint time-frequency analysis that the acoustic noise produced by the breaking of biscuits depends on relative humidity and water activity. It also shows that the time-frequency coefficients calculated using the adaptive Gabor transformation algorithm depend on the length of time a biscuit is exposed to humidity. This is a new methodology that can be used to assess the crispness of crisp foods.
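A fixed-window STFT, i.e. a non-adaptive Gabor transform, sketched on a synthetic snap signal as a rough stand-in (the paper's adaptive Gabor transform additionally tunes the atom parameters per signal; the signal construction here is an illustrative assumption):

```python
import numpy as np
from scipy.signal import stft

# Synthetic "biscuit snap": a short high-frequency burst buried in noise.
fs = 44100
t = np.arange(int(0.5 * fs)) / fs
signal = 0.01 * np.random.default_rng(0).standard_normal(t.size)
burst = (t > 0.2) & (t < 0.21)
signal[burst] += np.sin(2 * np.pi * 6000 * t[burst])

# Time-frequency coefficients from a fixed-window short-time Fourier
# transform, the kind of representation a crispness assessment is based on.
f, tau, Z = stft(signal, fs=fs, nperseg=512)
print(Z.shape)  # (frequencies, time frames)
```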