980 results for empirical correlation
Abstract:
An approximate analysis of gas absorption with instantaneous reaction in a liquid layer of finite thickness in plug flow is presented. An approximate solution for the enhancement factor in the case of unequal diffusivities between the dissolved gas and the liquid reactant has been derived and validated by numerical simulation. Depending on the diffusivity ratio of the liquid reactant to the dissolved gas, the enhancement factor tends to be either lower or higher than the prediction of the classical enhancement factor equation based on the penetration theory (Ei,pen) at Fourier numbers typically larger than 0.1. An empirical correlation valid for all Fourier numbers is proposed to allow a quick estimation of the enhancement factor; it reproduces the approximate solution and the simulation data with a relative error below 5% under the investigated conditions (diffusivity ratio = 0.34, Ei,pen = 21000).
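The abstract does not give the classical formula itself. As a point of reference only, a minimal sketch of the *film-theory* instantaneous-reaction enhancement factor (a standard textbook result, not the paper's penetration-theory expression; all symbol names are illustrative) could look like:

```python
def enhancement_factor_film(D_A, D_B, c_A_i, c_B_bulk, b=1.0):
    """Film-theory enhancement factor for an instantaneous reaction
    A + b B -> products, where A is the absorbing gas and B the
    liquid-phase reactant (illustrative names, not from the paper).

    D_A, D_B : diffusivities of the dissolved gas and liquid reactant
    c_A_i    : interfacial concentration of the dissolved gas
    c_B_bulk : bulk concentration of the liquid reactant
    b        : stoichiometric coefficient of B
    """
    return 1.0 + (D_B * c_B_bulk) / (b * D_A * c_A_i)

# Equal diffusivities and a 10:1 concentration ratio give E_i = 11.
print(enhancement_factor_film(1e-9, 1e-9, 1.0, 10.0))  # -> 11.0
```

The penetration-theory counterpart discussed in the abstract additionally depends on the square root of the diffusivity ratio, which is why unequal diffusivities matter there.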
Abstract:
This work is part of a research program on the technical feasibility of artificial improvement of soil layers. Its objective is to contribute to the viability of using improved soils to support shallow foundations. The study is based on experimental results of load tests on circular plates of 0.30 m and 0.60 m diameter resting on cement-improved soil layers (5% cement content) of 0.15 m, 0.30 m and 0.60 m thickness. The plate diameters (D) and the thicknesses of the cement-improved soil layers (H) were chosen so as to obtain three distinct values of the ratio H/D: 0.5, 1 and 2. The results, presented non-dimensionally as relations between normalized stress and relative settlement, demonstrate the influence of the improved-layer thickness on the behaviour of shallow foundations under vertical loading. A semi-empirical correlation is developed to allow the prediction of settlement magnitudes and failure stresses of footings from plate load test results. The applicability of analytical models for shallow foundations resting on non-homogeneous soil profiles with cohesive-frictional characteristics was also evaluated. To this end, a quantitative and qualitative comparison among the various methods for predicting bearing capacity and settlement is presented, together with a validation of these proposals through comparisons between calculated results and field measurements.
The main results of the research are: [a] improved performance of foundations resting on treated soils; [b] difficulty in predicting failure loads and settlement levels of foundations on stratified soils using analytical methods, reflecting the complexity of this soil-structure interaction problem; and [c] the development of a semi-empirical methodology for estimating the behaviour of shallow foundations from plate load test results.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The accurate determination of thermophysical properties of milk is very important for the design, simulation, optimization, and control of food processes such as evaporation, heat exchange, spray drying, and so forth. Generally, polynomial methods are used to predict these properties from empirical correlations fitted to experimental data. Artificial neural networks are better suited to processing noisy data and indexing extensive knowledge. This article proposes the application of neural networks to the prediction of the specific heat, thermal conductivity, and density of milk over temperatures from 2.0 to 71.0 °C, water contents from 72.0 to 92.0% (w/w), and fat contents from 1.350 to 7.822% (w/w). Artificial neural networks showed better prediction capability for specific heat, thermal conductivity, and density of milk than polynomial modeling, and they offer a reasonable alternative to empirical modeling of the thermophysical properties of foods.
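As an illustration of the approach (not the paper's trained model), a small feed-forward network can be fitted to property data over the same input ranges; the data here are synthetic and the target function is invented purely for demonstration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for the milk data set: inputs are temperature (degC),
# water content (%) and fat content (%); the target mimics a smooth
# property such as specific heat (values are illustrative, not measured).
X = np.column_stack([
    rng.uniform(2.0, 71.0, 300),    # temperature
    rng.uniform(72.0, 92.0, 300),   # water content
    rng.uniform(1.35, 7.82, 300),   # fat content
])
y = 2.5 + 0.02 * X[:, 0] + 0.015 * X[:, 1] - 0.01 * X[:, 2]

# Standardize inputs, then fit a small network as in the abstract's approach.
scaler = StandardScaler()
Xs = scaler.fit_transform(X)
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
net.fit(Xs, y)
pred = net.predict(Xs)
print(float(np.sqrt(np.mean((pred - y) ** 2))))  # training RMSE
```

A polynomial baseline (e.g. `numpy.polyfit` per input) could be compared against the network's RMSE in the same way.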
Abstract:
A central goal in unsaturated soil mechanics research is to create a smooth transition between traditional soil mechanics approaches and an approach that is applicable to unsaturated soils. Many studies have demonstrated that the undrained shear strength and the liquidity index of reconstituted or remoulded saturated soils are consistently correlated. In the liquidity index range from 1 (at the liquid limit, wL) to 0 (at the plastic limit, wP), the shear strength ranges from approximately 2 kPa to 200 kPa. Similarly, for compacted soil, the shear strength at the plastic limit ranges from 150 kPa to 250 kPa. When compacted at their optimum water content, most soils have a suction that ranges from 20 kPa to 500 kPa; in the field, however, compacted materials are subjected to drying and wetting, which affect their initial suction and, as a consequence, their shear strength. Unconfined shear tests were performed on five compacted tropical soils and kaolin. Specimens were tested in the as-compacted condition and also after undergoing drying or wetting. The test results and data from the prior literature were examined, taking into account the roles of void ratio, suction, and relative water content. An interpretation of the phenomena involved in the development of the undrained shear strength of unsaturated soils, in the contexts of soil-water retention and Atterberg limits, is presented, providing a practical view of the behaviour of compacted soil based on unsaturated soil concepts. Finally, an empirical correlation is presented that relates the unsaturated state of compacted soils to their unconfined shear strength.
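The endpoint values quoted above (about 2 kPa at LI = 1 and 200 kPa at LI = 0) imply the commonly used log-linear interpolation of remoulded strength against liquidity index. A sketch of that interpolation, which is only the classical saturated-soil relation and not the paper's proposed unsaturated correlation, is:

```python
def undrained_strength_kpa(liquidity_index,
                           su_at_plastic_limit=200.0,
                           su_at_liquid_limit=2.0):
    """Log-linear interpolation of remoulded undrained shear strength
    between the endpoint values quoted in the abstract (illustrative;
    not the paper's proposed correlation for compacted soils)."""
    ratio = su_at_liquid_limit / su_at_plastic_limit   # 0.01 here
    return su_at_plastic_limit * ratio ** liquidity_index

print(undrained_strength_kpa(0.0))  # -> 200.0 kPa (plastic limit)
print(undrained_strength_kpa(1.0))  # -> 2.0 kPa (liquid limit)
print(undrained_strength_kpa(0.5))  # -> 20.0 kPa (geometric mean)
```

The two-orders-of-magnitude drop between the plastic and liquid limits is why a logarithmic, rather than linear, interpolation is conventional.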
Abstract:
Biosignal measurement and processing is increasingly being deployed in ambulatory situations particularly in connected health applications. Such an environment dramatically increases the likelihood of artifacts which can occlude features of interest and reduce the quality of information available in the signal. If multichannel recordings are available for a given signal source, then there are currently a considerable range of methods which can suppress or in some cases remove the distorting effect of such artifacts. There are, however, considerably fewer techniques available if only a single-channel measurement is available and yet single-channel measurements are important where minimal instrumentation complexity is required. This paper describes a novel artifact removal technique for use in such a context. The technique known as ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA) is capable of operating on single-channel measurements. The EEMD technique is first used to decompose the single-channel signal into a multidimensional signal. The CCA technique is then employed to isolate the artifact components from the underlying signal using second-order statistics. The new technique is tested against the currently available wavelet denoising and EEMD-ICA techniques using both electroencephalography and functional near-infrared spectroscopy data and is shown to produce significantly improved results.
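The CCA stage can be sketched with second-order statistics alone: canonical correlation between the multidimensional signal and a one-sample-delayed copy of itself reduces to a generalized eigenproblem on the lag-0 and lag-1 covariances, and components with the lowest autocorrelation are treated as artifacts. The EEMD step is omitted here (the input is assumed to be the already-decomposed component matrix), and all names are illustrative:

```python
import numpy as np
from scipy.linalg import eigh

def cca_artifact_suppression(X, n_remove=1):
    """Suppress low-autocorrelation components of a multidimensional
    signal X (components x samples), e.g. the IMF matrix produced by
    EEMD. CCA between X and a one-sample-delayed copy reduces to a
    generalized eigenproblem on the lag-0/lag-1 covariances; sources
    with the lowest autocorrelation (assumed artifacts) are zeroed.
    """
    Xc = X - X.mean(axis=1, keepdims=True)
    n = Xc.shape[1]
    C0 = Xc @ Xc.T / n                       # lag-0 covariance
    C1 = Xc[:, 1:] @ Xc[:, :-1].T / (n - 1)  # lag-1 covariance
    C1 = (C1 + C1.T) / 2                     # symmetrise
    evals, W = eigh(C1, C0)                  # ascending autocorrelation
    S = W.T @ Xc                             # estimated sources
    S[:n_remove, :] = 0.0                    # drop lowest-autocorr sources
    return np.linalg.solve(W.T, S)           # back to component space

# Demo: a smooth oscillation mixed with a white-noise artifact.
rng = np.random.default_rng(1)
t = np.linspace(0, 4 * np.pi, 2000)
clean = np.sin(t)
noise = rng.standard_normal(t.size)
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # mixing matrix
X = A @ np.vstack([clean, noise])
X_clean = cca_artifact_suppression(X, n_remove=1)
```

After suppression, each reconstructed channel is dominated by the smooth source, since the white-noise source carries near-zero lag-1 autocorrelation and is ranked first for removal.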
Abstract:
In this paper, we extend the debate concerning Credit Default Swap valuation to include time-varying correlations and covariances. Traditional multivariate techniques treat the correlations between covariates as constant over time; however, this view is not supported by the data. Secondly, since financial data do not follow a normal distribution because of their heavy tails, modeling the data using a Generalized Linear Model (GLM) incorporating copulas emerges as a more robust technique than traditional approaches. This paper also includes an empirical analysis of the regime-switching dynamics of credit risk in the presence of liquidity, following the general practice of assuming that credit and market risk follow a Markov process. The study was based on Credit Default Swap data obtained from Bloomberg spanning the period 1 January 2004 to 8 August 2006. The empirical examination of the regime-switching tendencies provided quantitative support for the anecdotal view that liquidity decreases as credit quality deteriorates. The analysis also examined the joint probability distribution of the credit risk determinants across credit quality through the use of a copula function, which disaggregates the behavior embedded in the marginal gamma distributions so as to isolate the level of dependence captured in the copula function. The results suggest that the time-varying joint correlation matrix performed far better than the constant correlation matrix, the centerpiece of linear regression models.
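The copula-with-gamma-marginals construction can be illustrated with a static Gaussian copula (a simplified stand-in for the time-varying machinery the abstract describes; correlation, shapes and scales below are arbitrary demo values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def gaussian_copula_gamma(n, rho, shapes, scales):
    """Sample a bivariate distribution with gamma marginals joined by
    a Gaussian copula with correlation rho. The copula separates the
    dependence structure from the marginal gamma behaviour, as in the
    abstract's disaggregation argument."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                               # uniform marginals
    x0 = stats.gamma.ppf(u[:, 0], a=shapes[0], scale=scales[0])
    x1 = stats.gamma.ppf(u[:, 1], a=shapes[1], scale=scales[1])
    return np.column_stack([x0, x1])

sample = gaussian_copula_gamma(20000, rho=0.7, shapes=(2.0, 3.0),
                               scales=(1.0, 0.5))
```

A time-varying version would re-estimate `rho` per period (or per regime of the Markov chain) rather than fixing it.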
Abstract:
The content and context of work significantly influence an employee's satisfaction. While managers see work motivation as a tool to engage employees so that they perform better, academicians value work motivation for its contribution to human behaviour. Although the relationship between employee motivation and project success has been extensively covered in the literature, research focusing on the effect of job design on project success has been wanting. We address this gap through this study. The present study contributes to the extant literature by suggesting an operational framework of work motivation for project-based organizations. We also advance the conceptual understanding of this variable by examining how the different facets of work motivation have a differing impact on the various parameters of project performance. A survey instrument using standardized scales of work motivation and project success was used; 199 project workers from various industries completed the survey. We first operationalized the definition of work motivation for the purpose of our study through a principal component analysis of the work motivation items, obtaining a five-factor structure with items pertaining to employee development, work climate, goal clarity, and job security. We then performed a Pearson correlation analysis, which revealed moderate to significant relationships between project outcomes and work climate, and between project outcomes and employee development. To establish causality between work motivation and project management success, we employed linear regression analysis. The results show that work climate is a significant predictor of client satisfaction, while it moderately influences project quality. Further, bringing objectivity to project work is important for successful implementation.
Abstract:
Forecasts of volatility and correlation are important inputs into many practical financial problems. Broadly speaking, there are two ways of generating forecasts of these variables. Firstly, time-series models apply a statistical weighting scheme to historical measurements of the variable of interest. The alternative methodology extracts forecasts from the market-traded value of option contracts. An efficient options market should be able to produce superior forecasts as it utilises a larger information set comprising not only historical information but also the market equilibrium expectation of options market participants. While much research has been conducted into the relative merits of these approaches, this thesis extends the literature along several lines through three empirical studies. Firstly, it is demonstrated that there exist statistically significant benefits to taking the volatility risk premium into account when using implied volatility for univariate volatility forecasting. Secondly, high-frequency option implied measures are shown to lead to superior forecasts of the stochastic component of intraday volatility, and these in turn lead to superior forecasts of total intraday volatility. Finally, realised and option implied measures of equicorrelation are shown to dominate measures based on daily returns.
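The first family of forecasts mentioned above, a statistical weighting of historical measurements, is exemplified by the RiskMetrics-style exponentially weighted moving average; a minimal sketch (the decay factor 0.94 is the conventional daily-data choice, not a value from the thesis) is:

```python
import numpy as np

def ewma_variance_forecast(returns, lam=0.94):
    """Exponentially weighted variance forecast:
    sigma2_{t+1} = lam * sigma2_t + (1 - lam) * r_t^2.
    A simple member of the 'statistical weighting scheme applied to
    historical measurements' family of time-series forecasts."""
    sigma2 = returns[0] ** 2
    for r in returns[1:]:
        sigma2 = lam * sigma2 + (1.0 - lam) * r ** 2
    return sigma2

rng = np.random.default_rng(3)
rets = 0.01 * rng.standard_normal(1000)   # synthetic daily returns
print(ewma_variance_forecast(rets))
```

Option-implied forecasts, by contrast, read the market's expectation directly from option prices rather than weighting past observations.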
Abstract:
Purpose: Within the context of high global competitiveness, knowledge management (KM) has proven to be one of the major factors contributing to enhanced business outcomes, and knowledge sharing (KS) is one of the most critical of all KM activities. From a manufacturing industry perspective, supply chain management (SCM) and product development process (PDP) activities require a high proportion of company resources such as budget and manpower. Therefore, manufacturing companies are striving to strengthen SCM, PDP and KS activities in order to accelerate rates of manufacturing process improvement, ultimately resulting in higher levels of business performance (BP). A theoretical framework along with a number of hypotheses are proposed and empirically tested through correlation, factor and path analyses. Design/methodology/approach: A questionnaire survey was administered to a sample of electronic manufacturing companies operating in Taiwan to facilitate testing the proposed relationships. More than 170 respondents from 83 organisations responded to the survey. The study identified top management commitment and employee empowerment, supplier evaluation and selection, and design simplification and modular design as the key business activities that are strongly associated with business performance. Findings: The empirical study supports that the key manufacturing business activities (i.e., SCM, PDP, and KS) are positively associated with BP. The findings also revealed that some specific business activities, such as SCMF1, PDPF2, and KSF1, have the strongest influence on particular business outcomes (i.e., BPF1 and BPF2) within the context of electronic manufacturing companies operating in Taiwan. Practical implications: The finding regarding the relationship between SCM and BP identified the essential role of supplier evaluation and selection in improving business competitiveness and long-term performance.
The processes of forming knowledge in companies, such as creation, storage/retrieval, and transfer, do not necessarily lead to enhanced business performance; only the effective application of knowledge, delivered to the right person at the right time, does. Originality/value: Based on these findings, it is recommended that companies involve suppliers in partnerships to continuously improve operations and enhance product design efforts, which would ultimately enhance business performance. Business performance depends more on an employee's ability to turn knowledge into effective action.
Abstract:
Knowledge Management (KM) is a process that focuses on knowledge-related activities to facilitate knowledge creation, capture, transformation and use, with the ultimate aim of leveraging organisations' intellectual capital to achieve organisational objectives. The KM process receives input from its context (e.g. the internal business environment) and produces output (i.e. knowledge). It is argued that the validity of such knowledge should be justified by business performance. The study this paper reports on provides enhanced empirical understanding of this input-process-output relationship by investigating the interactions among different KM activities in the context of how construction organisations in Hong Kong manage knowledge. To this end, a theoretical framework along with a number of hypotheses are proposed and empirically tested through correlation, regression and path analyses. A questionnaire survey was administered to a sample of construction contractors operating in Hong Kong to facilitate testing the proposed relationships. More than 140 respondents from 99 organisations responded to the survey. The study findings demonstrate that both organisational and technical environments have the potential to predict the intensity of KM activities. Furthermore, different categories of KM activities interact with each other, and collectively they could be used to predict business performance.
Abstract:
IEEE 802.11p is the new standard for Inter-Vehicular Communications (IVC) in the 5.9 GHz frequency band, as part of the DSRC framework; it will enable applications based on Cooperative Systems. Simulation is widely used to estimate or verify the potential benefits of such cooperative applications, notably in terms of driver safety. We have developed a performance model for 802.11p that can be used by simulations of cooperative applications (e.g. collision avoidance) without requiring intricate models of the whole IVC stack. Instead, it provides a straightforward yet realistic model of IVC performance. Our model uses data from extensive field trials to infer the correlation between speed, distance and performance metrics such as maximum range, latency and frame loss. We then improve this model to limit the number of profiles that have to be generated when there are more than a few emitter-receiver pairs in a given location. Our model generates realistic performance figures for rural or suburban environments among small groups of IVC-equipped vehicles and roadside units.
Abstract:
The method of generalized estimating equations (GEE) is a popular tool for analysing longitudinal (panel) data. Often, the covariates collected are time-dependent in nature, for example, age, relapse status, monthly income. When using GEE to analyse longitudinal data with time-dependent covariates, crucial assumptions about the covariates are necessary for valid inferences to be drawn. When those assumptions do not hold or cannot be verified, Pepe and Anderson (1994, Communications in Statistics, Simulations and Computation 23, 939–951) advocated using an independence working correlation assumption in the GEE model as a robust approach. However, using GEE with the independence correlation assumption may lead to significant efficiency loss (Fitzmaurice, 1995, Biometrics 51, 309–317). In this article, we propose a method that extracts additional information from the estimating equations that are excluded by the independence assumption. The method always includes the estimating equations under the independence assumption and the contribution from the remaining estimating equations is weighted according to the likelihood of each equation being a consistent estimating equation and the information it carries. We apply the method to a longitudinal study of the health of a group of Filipino children.
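Under the independence working correlation that Pepe and Anderson advocate, the linear-model GEE reduces to pooled least squares with a cluster-robust (sandwich) variance. A minimal numpy sketch of that baseline special case (not the authors' weighted extension, and with invented synthetic data) is:

```python
import numpy as np

def gee_independence(X, y, cluster_ids):
    """Linear GEE with an independence working correlation: solve the
    pooled estimating equation sum_i X_i'(y_i - X_i b) = 0 (ordinary
    least squares) and report a cluster-robust sandwich covariance.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    bread = np.linalg.inv(X.T @ X)
    meat = np.zeros_like(bread)
    for cid in np.unique(cluster_ids):
        idx = cluster_ids == cid
        score = X[idx].T @ (y[idx] - X[idx] @ beta)   # cluster score
        meat += np.outer(score, score)
    cov = bread @ meat @ bread                         # sandwich estimator
    return beta, cov

# Synthetic longitudinal data: 50 subjects, 4 visits each, with a
# subject-level random effect inducing within-cluster correlation.
rng = np.random.default_rng(4)
ids = np.repeat(np.arange(50), 4)
x = rng.standard_normal(200)
X = np.column_stack([np.ones(200), x])
subject_effect = np.repeat(rng.standard_normal(50), 4)
y = 1.0 + 2.0 * x + subject_effect + 0.1 * rng.standard_normal(200)
beta, cov = gee_independence(X, y, ids)
```

The article's proposal goes beyond this baseline by re-admitting the estimating equations that independence discards, weighted by how likely each is to be consistent.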
Abstract:
Representation of facial expressions using continuous dimensions has been shown to be inherently more expressive and psychologically meaningful than using categorized emotions, and thus has gained increasing attention over recent years. Many sub-problems have arisen in this new field that remain only partially understood. A comparison of the regression performance of different texture and geometric features, and investigation of the correlations between continuous dimensional axes and basic categorized emotions, are two of these. This paper presents empirical studies addressing these problems, and it reports results from an evaluation of different methods for detecting spontaneous facial expressions within the arousal-valence dimensional space (AV). The evaluation compares the performance of texture features (SIFT, Gabor, LBP) against geometric features (FAP-based distances), and the fusion of the two. It also compares the prediction of arousal and valence, obtained using the best fusion method, to the corresponding ground truths. Spatial distribution, shift, similarity, and correlation are considered for the six basic categorized emotions (i.e. anger, disgust, fear, happiness, sadness, surprise). Using the NVIE database, results show that the fusion of LBP and FAP features performs the best. The results from the NVIE and FEEDTUM databases reveal novel findings about the correlations of the arousal and valence dimensions to each of the six basic emotion categories.
Abstract:
The function of a protein can be partially determined from the information contained in its amino acid sequence. It can be assumed that proteins with similar amino acid sequences normally have similar functions; hence, analysing the similarity of proteins has become one of the most important areas of protein study. In this work, a layered comparison method is used to analyse the similarity of proteins. It is based on the empirical mode decomposition (EMD) method, and protein sequences are characterized by their intrinsic mode functions (IMFs). The similarity of proteins is studied with a new cross-correlation formula. It appears that the EMD method can be used to detect the functional relationship of two proteins. This kind of similarity method complements traditional sequence similarity approaches, which focus on the alignment of amino acids.
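The pipeline's starting point, turning a sequence into a numeric signal and comparing signals by cross-correlation, can be sketched as follows. The Kyte-Doolittle hydropathy scale is a common encoding but is not necessarily the paper's; the paper's EMD/IMF decomposition step and its specific cross-correlation formula are omitted, with a standard zero-lag normalized cross-correlation used as a stand-in:

```python
import numpy as np

# Kyte-Doolittle hydropathy values: a common way to turn an amino acid
# sequence into a numeric signal (illustrative choice of encoding).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_signal(seq):
    """Map an amino acid sequence to its hydropathy profile."""
    return np.array([KD[a] for a in seq])

def normalized_cross_correlation(a, b):
    """Zero-lag normalized cross-correlation of two equal-length
    signals (a stand-in for the paper's formula)."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

s1 = hydropathy_signal("MKTAYIAKQR")   # hypothetical demo sequences
s2 = hydropathy_signal("MKTAYIAKQR")
print(normalized_cross_correlation(s1, s2))  # identical sequences -> 1.0
```

In the paper's layered scheme the correlation would be computed per IMF layer rather than on the raw profile, which is what makes the comparison alignment-free.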