796 results for Empirical Algorithm Analysis
Abstract:
Mass transfer kinetics in osmotic dehydration is usually modeled by Fick's law, empirical models, and probabilistic models. The aim of this study was to determine the applicability of the Peleg model for investigating mass transfer during osmotic dehydration of mackerel (Scomber japonicus) slices at different temperatures. Osmotic dehydration was performed on mackerel slices by cooking-infusion in solutions with glycerol and salt (aw = 0.64) at different temperatures: 50, 70, and 90 ºC. The Peleg rate constant K1 (h (g/g dm)⁻¹) varied with temperature from 0.761 to 0.396 for water loss, from 5.260 to 2.947 for salt gain, and from 0.854 to 0.566 for glycerol uptake. In all cases, it followed the Arrhenius relationship (R² > 0.86). The Ea (kJ/mol) values obtained were 16.14, 14.21, and 10.12 for water, salt, and glycerol, respectively. The statistical parameters that qualify the goodness of fit (R² > 0.91 and RMSE < 0.086) indicate promising applicability of the Peleg model.
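As a rough illustration of the fitting procedure described above, the sketch below fits the Peleg model M(t) = M0 ± t/(K1 + K2·t) to hypothetical water-loss data at three temperatures and then estimates an activation energy from an Arrhenius-type fit of ln(1/K1) against 1/T. The data values, initial guesses, and the exact Arrhenius form used are illustrative assumptions, not the study's own numbers or code.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

R = 8.314e-3  # gas constant in kJ/(mol*K)

def peleg(t, k1, k2):
    """Peleg model for the change in moisture/solute content:
    |M(t) - M0| = t / (k1 + k2*t)."""
    return t / (k1 + k2 * t)

# hypothetical water-loss data (g/g dry matter) at three temperatures (K)
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0])            # hours
data = {323.15: np.array([0.55, 0.90, 1.30, 1.52, 1.65]),
        343.15: np.array([0.70, 1.10, 1.55, 1.75, 1.88]),
        363.15: np.array([0.95, 1.45, 1.90, 2.10, 2.20])}

k1_values, temps = [], []
for T, wl in data.items():
    (k1, k2), _ = curve_fit(peleg, t, wl, p0=(0.5, 0.4))
    k1_values.append(k1)
    temps.append(T)

# Arrhenius-type temperature dependence of the rate 1/K1:
# ln(1/K1) = ln(A) - Ea / (R*T), so the slope of ln(1/K1) vs 1/T gives -Ea/R.
fit = linregress(1.0 / np.array(temps), np.log(1.0 / np.array(k1_values)))
Ea = -fit.slope * R
print(f"Ea ≈ {Ea:.1f} kJ/mol, R² = {fit.rvalue**2:.2f}")
```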
Abstract:
The cellular structure of healthy food products, with added dietary fiber and low in calories, is an important factor that contributes to the assessment of quality, and it can be quantified by image analysis of visual texture. This study compares image analysis techniques (binarization using Otsu's method and the default ImageJ algorithm, a variation of the iterative intermeans method) for quantifying differences in the crumb structure of breads made with different percentages of whole-wheat flour and fat replacer, and discusses the behavior of the parameters (number of cells, mean cell area, cell density, and circularity) using response surface methodology. Comparative analysis of the results achieved with the Otsu and default ImageJ algorithms showed a significant difference between the studied parameters. The Otsu method represented the crumb structure of the analyzed breads more reliably than the default ImageJ algorithm, and is thus the most suitable in terms of structural representation of the crumb texture.
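The comparison above hinges on how the chosen binarization affects the measured crumb parameters. As a minimal sketch (not the study's actual pipeline), the snippet below applies Otsu's global threshold with scikit-image and computes the four parameters mentioned; the synthetic test image and the assumption that pores appear darker than the crumb matrix are illustrative.

```python
import numpy as np
from skimage import filters, measure

def crumb_features(gray):
    """Binarize a grayscale crumb image with Otsu's threshold and
    extract simple cell (pore) statistics."""
    thresh = filters.threshold_otsu(gray)
    pores = gray < thresh                      # assume dark pixels are cells/pores
    labels = measure.label(pores)
    props = measure.regionprops(labels)
    n_cells = len(props)
    areas = np.array([p.area for p in props])
    mean_area = areas.mean() if n_cells else 0.0
    density = n_cells / pores.size             # cells per pixel of image area
    # circularity = 4*pi*area / perimeter^2 (1.0 for a perfect circle)
    circ = np.array([4 * np.pi * p.area / (p.perimeter ** 2 + 1e-9) for p in props])
    return n_cells, mean_area, density, (circ.mean() if n_cells else 0.0)

# synthetic stand-in for a crumb image: dark circular "pores" on a bright background
yy, xx = np.mgrid[0:200, 0:200]
gray = np.ones((200, 200))
for cy, cx, r in [(50, 60, 12), (120, 80, 18), (150, 160, 9)]:
    gray[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] = 0.2
print(crumb_features(gray))
```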
Abstract:
This master's thesis evaluates the competitors in the welding quality management software market. The competitive field is new, and there is no precise information about what kinds of competitors are on the market. Welding quality management software helps companies guarantee high quality. The software ensures high quality by verifying that the welder is qualified and follows the welding procedure specifications and the given parameters. In addition, the software collects all data from the welding process and generates the required documentation from it. The theoretical part of the thesis consists of a literature review covering the solution business, competitor analysis and the theory of competitive forces, and welding quality management. The empirical part is a qualitative study in which competing welding quality management software products are examined and their users are interviewed. The outcome of the thesis is a new competitor analysis model for welding quality management software. The model makes it possible to rate software products based on the primary and secondary features they offer. Second, the thesis analyzes the current competitive situation using the newly developed competitor analysis model.
Abstract:
This work investigates theoretical properties of symmetric and anti-symmetric kernels. The first chapters give an overview of the theory of kernels used in supervised machine learning. The central focus is on the regularized least squares algorithm, which is motivated as a problem of function reconstruction through an abstract inverse problem. A brief review of reproducing kernel Hilbert spaces shows how kernels define an implicit hypothesis space with multiple equivalent characterizations and how this space may be modified by incorporating prior knowledge. Mathematical results on the abstract inverse problem, in particular spectral properties, the pseudoinverse, and regularization, are recollected and then specialized to kernels. Symmetric and anti-symmetric kernels are applied to relation learning problems that incorporate prior knowledge that the relation is symmetric or anti-symmetric, respectively. Theoretical properties of these kernels are proved in a draft on which this thesis is based and are comprehensively referenced here. These proofs show that these kernels can be guaranteed to learn only symmetric or anti-symmetric relations, and that they can learn any such relation relative to the original kernel modified to learn only symmetric or anti-symmetric parts. Further results prove spectral properties of these kernels, the central result being a simple inequality for the trace of the estimator, also called the effective dimension. This quantity is used in learning bounds to guarantee smaller variance.
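For concreteness, one common construction of symmetric and anti-symmetric pairwise kernels (a generic Kronecker-kernel symmetrization, not necessarily the exact kernels analyzed in the thesis) is K±((a,b),(c,d)) = ½[k(a,c)k(b,d) ± k(a,d)k(b,c)] for a base kernel k, plugged into regularized least squares. The sketch below does this with an RBF base kernel on a toy anti-symmetric relation; all data and parameter values are made up for illustration.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    """Gaussian (RBF) base kernel between two node feature vectors."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def pairwise_kernel(pair1, pair2, gamma=1.0, sign=+1.0):
    """Symmetrized (sign=+1) or anti-symmetrized (sign=-1) Kronecker pairwise kernel:
    K((a,b),(c,d)) = 0.5 * [k(a,c)k(b,d) + sign * k(a,d)k(b,c)]."""
    a, b = pair1
    c, d = pair2
    return 0.5 * (rbf(a, c, gamma) * rbf(b, d, gamma)
                  + sign * rbf(a, d, gamma) * rbf(b, c, gamma))

def fit_rls(pairs, y, lam=1e-2, sign=+1.0):
    """Regularized least squares over pairs: alpha = (G + lam*I)^-1 y."""
    n = len(pairs)
    G = np.array([[pairwise_kernel(pairs[i], pairs[j], sign=sign) for j in range(n)]
                  for i in range(n)])
    return G, np.linalg.solve(G + lam * np.eye(n), y)

# toy anti-symmetric relation: y((a,b)) = f(a) - f(b)
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))
pairs = [(X[i], X[j]) for i in range(6) for j in range(6) if i != j]
y = np.array([a[0] - b[0] for a, b in pairs])
G, alpha = fit_rls(pairs, y, sign=-1.0)
print("training fit RMSE:", np.sqrt(np.mean((G @ alpha - y) ** 2)))
```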
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but still need to be studied carefully. This thesis tries to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a vector autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up the change in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation to riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of input estimates.
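To make the B–L mechanics concrete, the sketch below implements the standard Black–Litterman posterior-return combination: equilibrium returns implied from market weights, blended with investor views. In the thesis the views would come from the VAR forecasts, whereas here the covariance matrix, weights, the single view, and the parameters δ and τ are purely illustrative assumptions.

```python
import numpy as np

def black_litterman(Sigma, w_mkt, P, Q, delta=2.5, tau=0.05, Omega=None):
    """Black-Litterman posterior expected excess returns.
    Sigma: asset covariance; w_mkt: market-cap weights;
    P, Q: view pick matrix and view returns."""
    Pi = delta * Sigma @ w_mkt                                 # implied equilibrium returns
    if Omega is None:
        Omega = np.diag(np.diag(P @ (tau * Sigma) @ P.T))      # common default for view uncertainty
    inv_tS = np.linalg.inv(tau * Sigma)
    A = inv_tS + P.T @ np.linalg.inv(Omega) @ P
    b = inv_tS @ Pi + P.T @ np.linalg.inv(Omega) @ Q
    return np.linalg.solve(A, b)

# toy 3-asset example with one view: "asset 1 outperforms asset 2 by 1%"
Sigma = np.array([[0.040, 0.006, 0.002],
                  [0.006, 0.025, 0.004],
                  [0.002, 0.004, 0.010]])
w_mkt = np.array([0.5, 0.3, 0.2])
P = np.array([[1.0, -1.0, 0.0]])
Q = np.array([0.01])
print(black_litterman(Sigma, w_mkt, P, Q))
```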
Abstract:
This work presents a synopsis of efficient strategies used in power management for achieving the most economical power and energy consumption in multicore systems, FPGA, and NoC platforms. A practical approach was taken in an effort to validate the significance of the Adaptive Power Management Algorithm (APMA) proposed for the system developed for this thesis project. This system comprises an arithmetic and logic unit, up and down counters, an adder, a state machine, and a multiplexer. The first purpose of the project was to develop a system to be used for this power management work. The second was to perform area and power synopses of the system on several scalable technology platforms, UMC 90 nm technology at 1.2 V, UMC 90 nm technology at 1.32 V, and UMC 0.18 µm technology at 1.80 V, in order to examine the differences in area and power consumption of the system across the platforms. The third was to explore various strategies that can be used to reduce the system's power consumption and to propose an adaptive power management algorithm to that end. The strategies introduced in this work comprise Dynamic Voltage and Frequency Scaling (DVFS) and task parallelism. After development, the system was run on an FPGA board (essentially NoC platforms) and on the technology platforms listed above. System synthesis was successfully accomplished, the simulated result analysis shows that the system meets all functional requirements, and the power consumption and area utilization were recorded and analyzed in Chapter 7 of this work. This work also extensively reviews various strategies for managing power consumption drawn from quantitative research by many researchers and companies; it is a mixture of analytical study and experimental lab work, and it condenses and presents the basic concepts of power management strategy from quality technical papers.
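As a generic illustration of one of the strategies mentioned (DVFS), the sketch below picks the lowest voltage/frequency operating point that keeps projected utilization under a target, using the rough rule that dynamic power scales with V²·f. This is only a schematic policy sketch; it is not the APMA proposed in the thesis, and the operating points are invented values.

```python
# Minimal sketch of a utilization-driven DVFS policy (not the thesis's APMA).

OPERATING_POINTS = [          # (frequency_MHz, voltage_V), hypothetical values
    (100, 0.90),
    (200, 1.08),
    (400, 1.20),
    (800, 1.32),
]

def choose_operating_point(utilization, target=0.7):
    """Return the slowest (freq, volt) point whose scaled utilization stays
    under the target; fall back to the fastest point if none qualifies."""
    fastest_freq = OPERATING_POINTS[-1][0]
    for freq, volt in OPERATING_POINTS:
        projected = utilization * fastest_freq / freq   # utilization if we slow down
        if projected <= target:
            return freq, volt
    return OPERATING_POINTS[-1]

def relative_dynamic_power(freq, volt, ref=OPERATING_POINTS[-1]):
    """Dynamic power relative to the fastest point, using P ~ V^2 * f."""
    return (volt ** 2 * freq) / (ref[1] ** 2 * ref[0])

for u in (0.10, 0.35, 0.60, 0.95):
    f, v = choose_operating_point(u)
    print(f"util={u:.2f} -> {f} MHz @ {v} V, power x{relative_dynamic_power(f, v):.2f}")
```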
Abstract:
This article presents an empirical analysis based on cross-country data concerned with two points regarding corruption: (i) its effects on income; and (ii) how to mitigate corruption. The findings can be highlighted in two points. First, the idea that corruption is intrinsically connected with income is confirmed. Second, the traditional argument that an increase in the rule of law represents a good strategy in the fight against corruption is valid for developing countries. Furthermore, this study reveals that striving to increase the human development index represents a rule of thumb for reaching high levels of income and controlling corruption.
Abstract:
The purchasing and supply management literature emphasizes that efficient procurement is a viable way to improve an organization's overall performance. Growing awareness of indirect procurement methods and tools in particular also motivated this study. The main purpose of this Master's thesis is to build a holistic understanding of indirect procurement and to find ways to make it more efficient. The objective of the study is to determine how a global, multinational organization can improve its profitability in indirect procurement, and which factors in the procurement strategy influence it. The study was carried out as a single case study from the perspective of an employee of a large global, multinational company. Most of the data is based on an Opportunity analysis project carried out in 2015 in cooperation with an external consulting firm. Part of the data is based on semi-structured interviews with the organization's procurement director. Data collection also drew on personal observation and secondary material from the organization. This Master's thesis was conducted with a qualitative approach, including some features of quantitative methods.
Abstract:
The aim of this paper is to discuss the trend of overvaluation of the Brazilian currency in the 2000s, presenting an econometric model to estimate the real exchange rate (RER) and what should be a reference level of the RER to guide long-term economic policy. In the econometric model, we consider long-term structural and short-term components, both of which may be responsible for explaining the overvaluation trend of the Brazilian currency. Our econometric exercise confirms that the Brazilian currency was persistently overvalued throughout almost all of the period under analysis, and we suggest that the long-term reference level of the real exchange rate was reached in 2004. In July 2014, the average nominal exchange rate should have been around 2.90 Brazilian reais per dollar (against an observed nominal rate of 2.22 Brazilian reais per dollar) to achieve the 2004 real reference level (annual average). That is, according to our estimates, in July 2014 the Brazilian real was overvalued by 30.6 per cent in real terms relative to the reference level. Based on these findings, we conclude the paper by suggesting a mix of policy instruments that should have been used in order to reverse the overvaluation trend of the Brazilian real exchange rate, including a target for the real exchange rate in the medium and long run that would favor resource allocation toward more technology-intensive sectors.
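As a quick check on the figure quoted above (using only the two rates given in the abstract): (2.90 − 2.22) / 2.22 ≈ 0.306, i.e. the observed nominal rate would have needed to depreciate by roughly 30.6 per cent to reach the reference level, consistent with the reported overvaluation.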
Abstract:
Emerging markets have experienced rapid economic growth, and manufacturing firms have had to face the effects of globalisation. Some of the major emerging economies have been able to create a supportive business environment that fosters innovation, and China is a good example of a country that has been able to increase value-added investments. Conversely, when we look at Russia, another big emerging market, we witness a situation in which domestic firms struggle more with global competitiveness. Innovation has proven to be one of the most essential ingredients for firms aiming to grow and become more competitive. In emerging markets, the business environment sets many constraints for innovation. However, open strategic choices in new product development enable companies in emerging markets to expand their resource base and capability building. Networking and close inter-firm cooperation are essential in this regard. In this dissertation, I argue that technology transfer is one of the key tools for these companies to become internationally networked and to improve their competitiveness. It forces companies to reach outside the company and national borders, which in many cases, is a major challenge for firms in emerging markets. This dissertation focuses on how companies can catch up with competitiveness in emerging markets. The empirical studies included in the dissertation are based on analyses of survey data mainly of firms and their strategies in the Russian manufacturing industry. The dissertation contributes to the current strategic management literature by further investigating technology management strategies in manufacturing firms in emerging markets and the benefits of more open approaches to new product development and innovation.
Abstract:
Confocal and two-photon microscopy have become essential tools in biological research, and today many investigations are not possible without their help. The valuable advantage that these two techniques offer is the ability of optical sectioning. Optical sectioning makes it possible to obtain 3D visualization of the structures, and hence valuable information about the structural relationships and the geometrical and morphological aspects of the specimen. The achievable lateral and axial resolutions of confocal and two-photon microscopy, as in other optical imaging systems, are both defined by the diffraction theorem. Any aberration and imperfection present during imaging results in broadening of the calculated theoretical resolution, blurring, and geometrical distortions in the acquired images that interfere with the analysis of the structures, and lowers the fluorescence collected from the specimen. The aberrations may have different causes and can be classified by their sources, such as specimen-induced aberrations, optics-induced aberrations, illumination aberrations, and misalignment aberrations. This thesis presents an investigation and study of image enhancement. The goal of this thesis was approached in two different directions. Initially, we investigated the sources of the imperfections. We propose methods to eliminate or minimize aberrations introduced during image acquisition by optimizing the acquisition conditions. The impact on resolution of using a coverslip whose thickness is mismatched with the one the objective lens is designed for was shown, and a novel technique was introduced to define the proper value on the correction collar of the lens. The amount of spherical aberration with regard to the numerical aperture of the objective lens was investigated, and it was shown that, depending on the purpose of the imaging task, different numerical apertures must be used. The deformed beam cross section of the single-photon excitation source was corrected, and the resulting enhancement of the resolution and image quality was shown. Furthermore, the dependency of the scattered light on the excitation wavelength was shown empirically. In the second part, we continued the study of the image enhancement process with deconvolution techniques. Although deconvolution algorithms are widely used to improve the quality of images, how well a deconvolution algorithm performs depends highly on the point spread function (PSF) of the imaging system applied to the algorithm and on the level of its accuracy. We investigated approaches that can be taken in order to obtain a more precise PSF. Novel methods to improve the pattern of the PSF and reduce the noise are proposed. Furthermore, multiple sources for extracting the PSFs of the imaging system are introduced, and the empirical deconvolution results obtained using each of these PSFs are compared. The results confirm that a greater improvement is attained by applying the in situ PSF during the deconvolution process.
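Since the second part of the thesis turns on how much the deconvolution result depends on the PSF supplied to the algorithm, the sketch below writes out a plain Richardson–Lucy iteration (a widely used deconvolution scheme, stated generically rather than taken from the thesis) in which the PSF is simply an argument, so estimates from a theoretical, bead-measured, or in situ PSF can be swapped in and compared. The implementation details and parameter values are illustrative.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, num_iter=30):
    """Simple Richardson-Lucy deconvolution (2D, non-blind)."""
    psf = psf / psf.sum()                     # normalize PSF energy
    psf_mirror = psf[::-1, ::-1]              # flipped PSF for the correction step
    estimate = np.full(image.shape, image.mean())
    eps = 1e-12                               # avoid division by zero
    for _ in range(num_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + eps)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# usage (hypothetical names): restored = richardson_lucy(raw_slice, measured_psf, num_iter=30)
```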
Abstract:
The purpose of this study was to determine if there were differences between in-school and out-of-school day care centres. Five centres housed in public schools and five housed in other locations were selected for the research. A quality assessment was administered in each centre which examined the following components: physical environment, adult social structure and socio-emotional environment, children's socio-emotional environment, cognitive stimulation program, and toys and equipment. Quantitative analysis using simple t-tests showed a significant difference between in-school and out-of-school day cares for the physical environment variable. Differences approached significance for the children's socio-emotional environment variable as well as for overall quality. Qualitative analysis using a triangulated methodology revealed noticeable differences for every variable. The researcher concluded that both the quality of the physical environment and the capabilities of the administrators strongly influence the quality of the day care environment. This study also included an assessment of children's attitude toward learning. No significant difference was found between in-school and out-of-school centres.
Abstract:
This thesis provides a conceptual analysis of research literature on teachers' ideology and literacy practices, as well as a secondary analysis of three empirical studies and the ways in which the ideologies of the English as an Additional Language (EAL) (Street, 2005) teachers in these contexts impact the teaching of literacy in empowering/disabling ways. Several major theoretical components of Cummins (1996, 2000), Gee (1996, 2004) and Street (1995, 2001) are examined and integrated into a conceptual triad consisting of three main areas: power and ideology, validation of students' cultural and linguistic backgrounds, and teaching that empowers. This triad provides the framework for the secondary analysis of three empirical studies on the ideologies of secondary EAL teachers. Implications of the findings from the conceptual and secondary analyses are examined in light of the research community and secondary school teachers of EAL.
Abstract:
This study examines the efficiency of search engine advertising strategies employed by firms. The research setting is the online retailing industry, which is characterized by extensive use of Web technologies and high competition for market share and profitability. For Internet retailers, search engines are increasingly serving as an information gateway for many decision-making tasks. In particular, search engine advertising (SEA) has opened a new marketing channel for retailers to attract new customers and improve their performance. In addition to natural (organic) search marketing strategies, search engine advertisers compete for top advertisement slots provided by search brokers such as Google and Yahoo! through keyword auctions. The rationale is that greater visibility on a search engine during a keyword search will capture customers' interest in a business and its product or service offerings. Search engines account for most online activities today. Compared with the slow growth of traditional marketing channels, online search volumes continue to grow at a steady rate. According to the Search Engine Marketing Professional Organization, spending on search engine marketing by North American firms in 2008 was estimated at $13.5 billion. Despite the significant role SEA plays in Web retailing, scholarly research on the topic is limited. Prior studies in SEA have focused on search engine auction mechanism design. In contrast, research on the business value of SEA has been limited by the lack of empirical data on search advertising practices. Recent advances in search and retail technologies have created data-rich environments that enable new research opportunities at the interface of marketing and information technology. This research uses extensive data from Web retailing and Google-based search advertising and evaluates Web retailers' use of resources, search advertising techniques, and other relevant factors that contribute to business performance across different metrics. The methods used include Data Envelopment Analysis (DEA), data mining, and multivariate statistics. This research contributes to the empirical literature by analyzing several Web retail firms in different industry sectors and product categories. One of the key findings is that the dynamics of sponsored search advertising vary between multi-channel and Web-only retailers. While the key performance metrics for multi-channel retailers include measures such as online sales, conversion rate (CR), click-through rate (CTR), and impressions, the key performance metrics for Web-only retailers focus on organic and sponsored ad ranks. These results provide a useful contribution to our organizational-level understanding of search engine advertising strategies, both for multi-channel and Web-only retailers. These results also contribute to current knowledge of technology-driven marketing strategies and provide managers with a better understanding of sponsored search advertising and its impact on various performance metrics in Web retailing.
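The abstract names Data Envelopment Analysis among its methods. As a generic sketch (not the study's actual model, variables, or data), the snippet below solves the input-oriented CCR envelopment linear program for each decision-making unit with scipy; the toy inputs (ad spend, impressions) and output (online sales) are invented purely for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.
    X: inputs, shape (m_inputs, n_dmus); Y: outputs, shape (s_outputs, n_dmus)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta; variables are [theta, lambda_1..n]
    # input constraints:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.fun                              # theta in (0, 1]; 1 means efficient

# toy example: 2 inputs (ad spend, impressions), 1 output (online sales), 3 retailers
X = np.array([[3.0, 5.0, 4.0],
              [2.0, 1.0, 3.0]])
Y = np.array([[6.0, 8.0, 5.0]])
print([round(ccr_input_efficiency(X, Y, j), 3) for j in range(3)])
```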
Abstract:
This paper analyzes the dynamics of wages and workers' mobility within firms with a hierarchical structure of job levels. The theoretical model proposed by Gibbons and Waldman (1999), which combines the notions of human capital accumulation, job rank assignments based on comparative advantage, and learning about workers' abilities, is implemented empirically to measure the importance of these elements in explaining the wage policy of firms. Survey data from the GSOEP (German Socio-Economic Panel) are used to draw conclusions on the common features characterizing the wage policies of a large sample of firms. The GSOEP survey also provides information on the worker's rank within his firm, which is usually not available in other surveys. The results are consistent with non-random selection of workers onto the rungs of a job ladder. There is no direct evidence of learning about workers' unobserved abilities, but the analysis reveals that unmeasured ability is an important factor driving wage dynamics. Finally, job rank effects remain significant even after controlling for measured and unmeasured characteristics.