79 results for Posture Data
Abstract:
Data Mining (DM) methods are increasingly used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction, an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction, which is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
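The abstract does not describe any specific prediction procedure. Purely as an illustration of how a time series is typically recast as a DM/supervised-learning task, the following minimal Python sketch uses a sliding window and a plain least-squares fit on a hypothetical price series; the function name make_supervised, the window size, and the synthetic data are all assumptions, not taken from the paper.

```python
import numpy as np

def make_supervised(series, window=5):
    """Recast a univariate time series as a supervised DM task:
    inputs are the previous `window` values, the target is the next one."""
    X, y = [], []
    for t in range(window, len(series)):
        X.append(series[t - window:t])
        y.append(series[t])
    return np.array(X), np.array(y)

# Hypothetical price series; any DM/ML regressor could replace the linear fit.
prices = np.cumsum(np.random.default_rng(0).normal(size=200)) + 100.0
X, y = make_supervised(prices, window=5)
split = int(0.8 * len(X))
A_train = np.column_stack([X[:split], np.ones(split)])           # add intercept
coef, *_ = np.linalg.lstsq(A_train, y[:split], rcond=None)
forecast = np.column_stack([X[split:], np.ones(len(X) - split)]) @ coef
```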
Abstract:
Aim of the paper: The purpose of this paper is to examine human resource management (HRM) practices in small firms and to improve the understanding of the relationship between these practices and business growth. This exploratory study is based on the resource-based view of the firm and on empirical work carried out in two small firms, relating HRM practices to the firms' results. Contribution to the literature: This is an in-depth study of HRM practices and their impact on performance growth in micro firms, isolating and controlling for most of the contextual and internal variables that the literature considers when relating HRM to growth. Firm growth analysis was broadened by the use of several dependent variables: employment growth and operational and financial performance growth. Some hypotheses for further research on identifying HRM practices in small businesses and their relation to firm growth are suggested. Methodology: Case study methodology was used to study two firms. The techniques used to collect data were semi-structured interviews with the owner and all employees, unstructured observation at the firms' facilities (over two days), definition of the entrepreneur's profile (survey answers) and document collection (on demographic characteristics and performance results). Data were analyzed through content analysis, with categories derived from the interview protocols and the literature. Results and implications: Results revealed that despite similarities in the firms' organizational characteristics, they differ significantly in the owners' motivation to grow, HRM practices, and organizational performance and growth. Future studies should pay special attention to the owners' willingness to grow, the firms' years of experience in business, the staff's years of experience in their field of work, and turnover. HRM practices in micro/small firms should be better defined and characterized. The relationship between the external image of the management's posture and longitudinal financial results and growth should also be explored.
Abstract:
New arguments are presented proving that successive (repeated) measurements have a memory and actually "remember" each other. Recognition of this peculiarity can substantially change the existing paradigm associated with conventional observation of the behavior of different complex systems and leads toward the application of an intermediate model (IM). This IM can provide a very accurate fit of the measured data in terms of the Prony decomposition. This decomposition, in turn, contains a small set of fitting parameters relative to the number of initial data points and allows measured data to be compared in cases where a "best fit" model based on specific physical principles is absent. As an example, we consider two X-ray diffractometers (referred to in the paper as A, "cheap", and B, "expensive") that are used, after proper calibration, to measure the same substance (corundum, α-Al2O3). The amplitude-frequency response (AFR) obtained within the framework of the Prony decomposition can be used to compare the spectra recorded by the A- and B- X-ray diffractometers (XRDs) for calibration and other practical purposes. We also prove that the Fourier decomposition can be adapted to an "ideal" experiment without memory, while the Prony decomposition corresponds to a real measurement and, in this case, can be fitted within the framework of the IM. New statistical parameters describing the properties of experimental equipment (irrespective of their internal "filling") are found. The suggested approach is rather general and can be used for the calibration and comparison of different complex dynamical systems for practical purposes.
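The abstract names the Prony decomposition but does not spell out the fitting procedure. For orientation only, here is a minimal sketch of the classical Prony method (linear prediction, then root finding, then an amplitude fit), assuming uniformly sampled data; it is not necessarily the exact variant used in the paper, and the function name prony_decompose and the model order p are illustrative choices.

```python
import numpy as np

def prony_decompose(x, p):
    """Classical Prony fit: x[n] ~ sum_k A_k * z_k**n for n = 0..N-1.
    x is a uniformly sampled 1-D signal, p the assumed number of terms."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    # 1) Linear prediction: x[n] = -(a_1 x[n-1] + ... + a_p x[n-p]) for n >= p
    T = np.column_stack([x[p - j - 1:N - j - 1] for j in range(p)])
    a = np.linalg.lstsq(T, -x[p:], rcond=None)[0]
    # 2) The poles z_k are the roots of the characteristic polynomial
    z = np.roots(np.concatenate(([1.0 + 0j], a)))
    # 3) Complex amplitudes A_k from a Vandermonde least-squares fit
    V = np.vander(z, N, increasing=True).T        # V[n, k] = z_k**n
    A = np.linalg.lstsq(V, x, rcond=None)[0]
    return z, A

# Example: recover one damped oscillation (two conjugate poles) from noisy data.
n = np.arange(200)
signal = np.exp(-0.01 * n) * np.cos(0.3 * n) \
         + 0.01 * np.random.default_rng(1).normal(size=200)
z, A = prony_decompose(signal, p=2)
freqs = np.angle(z) / (2 * np.pi)    # frequencies in cycles per sample
damping = np.log(np.abs(z))          # damping factors per sample
```

The poles and amplitudes returned by such a fit are what an amplitude-frequency response (AFR) comparison between two instruments would be built from.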
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality-of-service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as optimizing energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, together with the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult, posing a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
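The abstract describes the mechanism only at a high level (a performance deviation estimator plus a scheduling algorithm). The sketch below is not the authors' algorithm; it is an illustrative greedy placement in Python showing how such a combination could work, where the interference and power models (estimated_deviation, power_increase, penalty, idle_cost, sla_deviation) are invented placeholders.

```python
from dataclasses import dataclass, field

@dataclass
class VM:
    cpu: float             # normalized CPU demand
    net: float             # normalized network demand
    kind: str              # "cpu"- or "net"-intensive

@dataclass
class Host:
    cpu_cap: float
    net_cap: float
    cpu_used: float = 0.0
    net_used: float = 0.0
    vms: list = field(default_factory=list)

def estimated_deviation(host, vm, penalty=0.15):
    """Toy interference estimator: deviation grows with the number of
    co-hosted VMs contending for the same dominant resource."""
    return penalty * sum(1 for v in host.vms if v.kind == vm.kind)

def power_increase(host, vm, idle_cost=0.6):
    """Toy linear power model: reusing an active host avoids the idle-power cost."""
    return (idle_cost if not host.vms else 0.0) + vm.cpu / host.cpu_cap

def place(vms, hosts, sla_deviation=0.30):
    """Greedy interference- and power-aware placement: among feasible hosts whose
    estimated deviation stays within the SLA, pick the smallest power increase."""
    plan = []
    for vm in vms:
        feasible = [h for h in hosts
                    if h.cpu_used + vm.cpu <= h.cpu_cap
                    and h.net_used + vm.net <= h.net_cap
                    and estimated_deviation(h, vm) <= sla_deviation]
        if not feasible:
            plan.append((vm, None))    # would trigger scale-out or migration
            continue
        best = min(feasible, key=lambda h: power_increase(h, vm))
        best.cpu_used += vm.cpu
        best.net_used += vm.net
        best.vms.append(vm)
        plan.append((vm, best))
    return plan
```

A real evaluation, as in the paper, would drive such a scheduler with trace-based workloads and measure SLA violations and energy consumption against baseline policies.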