814 results for swd: Benchmark
Abstract:
The aim of this thesis is to identify the different ways of organising an accommodation business and how these business models can be grouped. In addition, the study examines the factors that affect the profitability of an accommodation business. The commissioning company of the study is planning to establish an accommodation business in a campus area, so the possibilities for establishing the operation are assessed on the basis of theory and empirical evidence. In the theoretical part of the study, accommodation establishments are grouped in order to understand their operation. The theoretical part also examines revenue management and the methods based on its principles that can be applied in the accommodation industry. The empirical part of the thesis is carried out as a qualitative study. It consists of thematic interviews with two potential customers, a benchmark analysis and a related e-mail interview with a service manager at the University of Helsinki, data on visitor numbers, and tourism industry statistics. Based on this material, the study determines how the accommodation business should be organised in the case of the target company. The study demonstrated the diversity of the accommodation business, as there are numerous combinations of tangible characteristics, service packages and ownership bases among accommodation establishments. Grouping the characteristics of accommodation establishments proved to be a useful way to understand their operation. The grouping highlighted important issues that the target company must take into account in the further development of its accommodation business idea. Simple revenue management methods can also be applied in the target company's case to ensure a profitable business.
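Where the abstract mentions simple revenue-management (yield-management) methods, a minimal sketch may help. The one below computes two textbook quantities, RevPAR and a two-class Littlewood protection level. It is an illustration only: the function names, the normal-demand assumption and all figures are this example's assumptions, not methods or data from the thesis.

```python
# Illustrative sketch of two simple revenue-management calculations from
# the hotel yield-management literature. All figures are made-up
# assumptions, not data from the thesis.
from scipy.stats import norm

def revpar(room_revenue: float, rooms_available: int) -> float:
    """Revenue per available room: total room revenue / rooms on offer."""
    return room_revenue / rooms_available

def littlewood_protection(mu: float, sigma: float,
                          price_high: float, price_low: float) -> float:
    """Littlewood's two-class rule: protect y* rooms for the high-rate
    class so that P(high-rate demand > y*) = price_low / price_high,
    assuming normally distributed high-rate demand."""
    critical_fractile = 1.0 - price_low / price_high
    return mu + sigma * norm.ppf(critical_fractile)

print(revpar(room_revenue=8400.0, rooms_available=120))   # 70.0 per room
print(littlewood_protection(mu=40, sigma=10,
                            price_high=120.0, price_low=80.0))
```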
Abstract:
The goal of this research is to critically analyze current theories and methods of intangible asset valuation and to develop and test a new methodology based on practical examples from the IT industry. With this goal in mind, the main research questions of this paper are: What are the advantages and disadvantages of current practices for measuring intellectual capital and valuing intangible assets? How should intellectual capital be measured in IT? The resulting method exhibits a new, unique approach to IC measurement and potentially an even larger field of application. Although this particular research focuses on IT (the software and Internet services cluster, to be exact), the logic behind the method is applicable in any industry, since the method is designed to be fully compliant with measurement theory and can therefore be properly scaled for any application. Building a new method is a difficult and iterative process: in its current iteration the method stands as a theoretical concept rather than a business tool, yet even the current concept fulfills its purpose as a benchmarking tool for measuring intellectual capital in the IT industry.
Abstract:
The thesis examines the risk-adjusted performance of European small cap equity funds between 2008 and 2013. Performance is measured using several measures, including the Sharpe ratio, Treynor ratio, Modigliani measure, Jensen alpha, 3-factor alpha and 4-factor alpha. The thesis also addresses the issue of persistence in mutual fund performance. Thirdly, the relationship between the activity of fund managers and fund performance is investigated. Managerial activity is measured using tracking error and the R-squared obtained from a 4-factor asset pricing model. The issues are investigated using the Spearman rank correlation test, cross-sectional regression analysis and ranked portfolio tests. Monthly return data was provided by Morningstar and covers 88 mutual funds. Results show that small cap funds earn back a significant amount of their expenses, but on average lose to their benchmark index. The evidence of performance persistence over a 12-month time period is weak. Managerial activity is shown to contribute positively to fund performance.
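As an illustration of the measures named above, the following hedged sketch computes a Sharpe ratio and a Carhart 4-factor alpha with its R-squared for one fund. The factor column names and the synthetic demo data are assumptions, not the thesis's Morningstar sample.

```python
# Sketch: two of the performance measures named in the abstract, computed
# for one fund from monthly excess returns. Demo data is synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def sharpe_ratio(excess: pd.Series) -> float:
    """Mean excess return divided by its standard deviation (per period)."""
    return excess.mean() / excess.std(ddof=1)

def carhart_alpha_and_r2(fund_excess: pd.Series, factors: pd.DataFrame):
    """4-factor alpha (intercept) and R-squared from an OLS regression of
    fund excess returns on market, size, value and momentum factors."""
    X = sm.add_constant(factors[["MKT_RF", "SMB", "HML", "MOM"]])
    fit = sm.OLS(fund_excess, X).fit()
    return fit.params["const"], fit.rsquared

# Synthetic demo data (72 months), standing in for the real sample.
rng = np.random.default_rng(0)
factors = pd.DataFrame(rng.normal(0, 0.03, (72, 4)),
                       columns=["MKT_RF", "SMB", "HML", "MOM"])
beta = np.array([1.0, 0.4, 0.1, 0.0])
fund = pd.Series(0.001 + factors.values @ beta + rng.normal(0, 0.01, 72))
print(sharpe_ratio(fund), carhart_alpha_and_r2(fund, factors))
```

A low 4-factor R-squared is the kind of managerial-activity proxy the abstract refers to: the less of the fund's variance the factor model explains, the more active the manager.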
Abstract:
Power consumption is still an issue in wearable computing applications today. The aim of the present paper is to raise awareness of the power consumption of wearable computing devices in specific scenarios, in order to enable the future design of energy-efficient wireless sensors for context recognition in wearable computing applications. The approach is based on a hardware study. The objective of this paper is to analyze and compare the total power consumption of three representative wearable computing devices in realistic scenarios: Display, Speaker, Camera and microphone, Transfer by Wi-Fi, Monitoring outdoor physical activity, and Pedometer. A scenario-based energy model is also developed. The Samsung Galaxy Nexus I9250 smartphone, the Vuzix M100 Smart Glasses and the SimValley Smartwatch AW-420.RX are the three devices, each representative of its form factor. Power consumption is measured using PowerTutor, an Android energy profiler application with a logging option; because some of its parameters are unknown, its readings are adjusted against a USB power meter. The results show that screen size is the main parameter influencing power consumption. The power consumption for an identical scenario varies across the wearable devices, meaning that other components, parameters or processes may affect power consumption, and further study is needed to explain these variations. The paper also shows that different inputs (a touchscreen is more efficient than button controls) and outputs (a speaker is more efficient than a display) affect energy consumption in different ways. Finally, the paper gives recommendations for reducing energy consumption in healthcare wearable computing applications using the energy model.
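A scenario-based energy model of the general kind described can be sketched as a sum of per-component power draws multiplied by active time. The components, scenarios and milliwatt figures below are placeholder assumptions, not the paper's measurements.

```python
# Minimal sketch of a scenario-based energy model: total energy is the
# sum over active components of (average power draw x time active).
# Component powers and scenario timings are illustrative placeholders.
SCENARIOS = {
    # scenario name: {component: seconds active}
    "display_browsing": {"screen": 300, "cpu": 300, "wifi": 60},
    "pedometer":        {"accelerometer": 3600, "cpu": 120},
}

POWER_MW = {  # assumed average power draw per component, in milliwatts
    "screen": 400.0, "cpu": 250.0, "wifi": 700.0, "accelerometer": 20.0,
}

def scenario_energy_joules(scenario: str) -> float:
    """Energy in joules: sum of power (W) x active time (s) per component."""
    return sum(POWER_MW[c] / 1000.0 * t
               for c, t in SCENARIOS[scenario].items())

for name in SCENARIOS:
    print(name, round(scenario_energy_joules(name), 1), "J")
```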
Abstract:
The project "Digital gaming in promoting well-being at work" produced an overall picture of the known ways in which digital games and gamified applications are used to promote well-being at work, and assessed the future prospects of such activity in Finland. The project consisted of i) a review of the research literature, ii) a benchmark study of successful international cases, iii) a survey aimed at the working-age population of Finland, and iv) a series of qualitative expert interviews. A considerable amount of research on the serious use of games has been published in recent years, but research specifically on workplace well-being is still rare. Based on the literature review, a threefold division of the benefits of gaming was formulated as the perspective of the study. Solutions emphasising the primary benefits of gaming stress the intrinsic value of gaming's entertainment. In the case of secondary benefits, games and gaming are used as an incentive toward desired behaviour, such as learning or self-directed health promotion. In the tertiary-benefits perspective, games serve as environments that increase the communality of social interaction. The data on the surveyed international cases (N=62) showed that digital game services are still unusual in dedicated workplace well-being use, although a considerable number of companies offer more general gamification services. The value proposition and expected effectiveness of the surveyed services were typically related to the tertiary benefits of gaming, that is, to improving the overall functioning of the work community and its internal communication. Cases pursuing secondary benefits, improving employees' work efficiency and basic fitness, were almost as numerous, but the entertainment value and rewarding experience of gaming were only rarely emphasised. The analysis of a representative survey of the working-age population of Finland (N=1000) showed that gaming in the workplace is a common and ordinary phenomenon. About 20% of the Finnish workforce plays digital games at the workplace at least occasionally. Employees seek primary, intrinsically motivating effects from gaming, relaxation and detachment from work, which can be interpreted as conflicting with the instrumental value propositions of serious games that emphasise secondary and tertiary effects. The perceived positive effects of gaming depended strongly on the role of gaming in the respondent's life in general. Games are therefore not a universal tool, suitable for everyone, as a means of recovery that could potentially increase work efficiency. The analysis of the expert interviews showed that the supply, benefits and possibilities of games are not well known, but there is interest in acquiring such services. Key factors for the adoption of games are the development of revenue models and value chains, the creation of markets, and product development and quality assurance. The field of digital games and gamified applications developed for workplace well-being has not yet taken shape, but the activity has significant potential for growth in the near future. Game solutions that promote well-being should make use of the game mechanics developed in commercial entertainment games in order to create an entertaining and motivating experience. Perspectives from entertainment game design can be applied to the organisation of work, but the growing development of game-based well-being business requires interdisciplinary collaboration and investment from actors in several fields.
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but they still need to be studied carefully. This thesis tries to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts. The in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return. The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation to riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of the thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
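The core Black–Litterman update that such a TAA application builds on can be sketched as follows. The matrix shapes, the toy view and the tau value are standard textbook choices, not the thesis's calibration.

```python
# Hedged sketch of the Black-Litterman posterior: blend equilibrium
# returns (pi) with views (P, q) -- in the thesis's setup the views
# would come from the VAR forecasts. Numbers below are toy values.
import numpy as np

def black_litterman(pi, Sigma, P, q, Omega, tau=0.05):
    """Posterior expected returns:
    mu = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1 [(tau*Sigma)^-1 pi + P' Omega^-1 q]
    """
    ts_inv = np.linalg.inv(tau * Sigma)
    om_inv = np.linalg.inv(Omega)
    A = ts_inv + P.T @ om_inv @ P
    b = ts_inv @ pi + P.T @ om_inv @ q
    return np.linalg.solve(A, b)

# Toy example: two asset classes, one view ("asset 0 beats asset 1 by 2%").
pi = np.array([0.03, 0.04])                    # equilibrium returns
Sigma = np.array([[0.04, 0.01], [0.01, 0.09]]) # asset covariance
P = np.array([[1.0, -1.0]])                    # view pick matrix
q = np.array([0.02])                           # view magnitude
Omega = np.array([[0.0004]])                   # view uncertainty
print(black_litterman(pi, Sigma, P, q, Omega))
```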
Abstract:
Additive manufacturing (AM) is a modern method for producing parts intended for industrial use. Although Finland has strong knowledge of additive manufacturing and research is being carried out actively, the method is still rarely used in industry. The purpose of this master's thesis is to gather information on interest in additive manufacturing from companies in the South-East Finland region that could make use of the technology. The thesis also examines the requirements for adoption and the potential applications in the South-East Finland region. As a result of the work, it can be stated that two reasons slowing the adoption of additive manufacturing in industry are prejudices towards the technology and a lack of basic knowledge. The 3D enthusiasm created by the media also gives rise to many misconceptions. Nevertheless, the high-level industrial and machine-shop expertise in the South-East Finland region, rooted in the paper industry, can serve as a foundation for the adoption of the technology.
Abstract:
This master's thesis presents a study on the requisite cooling of an activated sludge process in the paper and pulp industry. The energy consumption of the paper and pulp industry, and of its wastewater treatment plants in particular, is relatively high. It is therefore useful to understand the wastewater treatment process of such industries. The activated sludge process is a biological mechanism which degrades the carbonaceous compounds present in waste. The modified activated sludge model constructed here aims to imitate the bio-kinetics of an activated sludge process. However, due to the complicated non-linear behavior of the biological process, modelling this system is laborious and intriguing. We first attempt to find a system solution using steady-state modelling of Activated Sludge Model number 1 (ASM1), approached by Euler's method and an ordinary differential equation solver. Furthermore, an enthalpy study of the paper and pulp industry's vital pollutants was carried out and applied to track the temperature shift over a period of time, in order to formulate the operation of the cooling water. This finding supports forecasting the plant's process execution in a cost-effective manner and managing effluent efficiency. The final stage of the thesis was achieved by optimizing the steady state of ASM1.
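As a hedged illustration of the Euler-based approach mentioned above, the sketch below integrates a one-substrate, one-biomass Monod model. Full ASM1 has many more state variables and processes; every parameter value here is an assumption for illustration.

```python
# Toy stand-in for the Euler stepping of activated-sludge bio-kinetics:
# one substrate S and one biomass X with Monod growth and linear decay.
# Parameter values are illustrative, not the thesis's calibration.
MU_MAX, K_S, Y, B = 4.0, 10.0, 0.67, 0.3   # 1/d, g/m3, yield, 1/d

def derivs(S, X):
    growth = MU_MAX * S / (K_S + S) * X     # Monod growth rate, g/m3/d
    return -growth / Y, growth - B * X      # dS/dt, dX/dt

def euler(S0=60.0, X0=2500.0, dt=0.001, days=2.0):
    """Explicit Euler integration toward the (quasi) steady state."""
    S, X = S0, X0
    for _ in range(int(days / dt)):
        dS, dX = derivs(S, X)
        S, X = S + dt * dS, X + dt * dX
    return S, X

print(euler())  # substrate and biomass after 2 simulated days
```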
Abstract:
Mesoporous metal oxides are nowadays widely used in various technological applications, for instance in catalysis, biomolecular separations and drug delivery. A popular technique used to synthesize mesoporous metal oxides is the nanocasting process. Mesoporous metal oxide replicas are obtained by impregnating a porous template with a metal oxide precursor, followed by thermal treatment and removal of the template by etching in NaOH or HF solutions. In a similar manner to traditional casting, wherein the product inherits the features of the mold, the metal oxide replicas are supposed to have an inverse structure of the starting porous template. This is, however, not always the case, as broken or deformed particles and other structural defects have all been experienced during nanocasting experiments. Although the nanocasting technique is widely used, not all the processing steps are well understood. Questions over the fidelity of replication and morphology control are yet to be adequately answered. This work therefore attempts to answer some of these questions by elucidating the nanocasting process, pinpointing the crucial steps involved and showing how to harness this knowledge in making intact replicas which are a true replication of the starting templates. The rich surface chemistry of mesoporous metal oxides is an important reason why they are widely used in applications such as catalysis and biomolecular separation. At times the surface is modified or functionalized with organic species for stability or for a particular application. In this work, nanocast metal oxides (TiO2, ZrO2 and SnO2) and SiO2 were modified with amino-containing molecules using four different approaches, namely (a) covalent bonding of 3-aminopropyltriethoxysilane (APTES), (b) adsorption of 2-aminoethyl dihydrogen phosphate (AEDP), (c) surface polymerization of aziridine and (d) adsorption of poly(ethylenimine) (PEI) through electrostatic interactions. Afterwards, the hydrolytic stability of each functionalization was investigated at pH 2 and 10 by zeta potential measurements. The modifications were successful except for the AEDP approach, which was unable to produce efficient amino-modification on any of the metal oxides used. The APTES, aziridine and PEI amino-modifications were fairly stable at pH 10 for all the metal oxides tested, while only AZ- and PEI-modified SnO2 were stable at pH 2 after 40 h. Furthermore, the functionalized metal oxides (SiO2, Mn2O3, ZrO2 and SnO2) were packed into columns for capillary liquid chromatography (CLC) and capillary electrochromatography (CEC). Among the functionalized metal oxides, aziridine-functionalized SiO2 (SiO2-AZ) showed good chemical stability and was the most useful packing material in both CLC and CEC. Lastly, nanocast metal oxides were synthesized for phosphopeptide enrichment, a technique used to enrich phosphorylated proteins in biological samples prior to mass spectrometry analysis. By using the nanocasting technique to prepare the metal oxides, the surface area was controlled within a range of 42-75 m2/g, thereby enabling an objective comparison of the metal oxides. The binding characteristics of these metal oxides were compared using samples with different levels of complexity, such as synthetic peptides and cell lysates. The results show that nanocast TiO2, ZrO2, Fe2O3 and In2O3 have comparable binding characteristics.
Furthermore, In2O3, which is a novel material in phosphopeptide enrichment applications, performed comparably with standard TiO2, the benchmark for such phosphopeptide enrichment procedures. The performance of the metal oxides was explained by ranking them according to their isoelectric points and acidity. Overall, the clarification of the nanocasting process provided in this work will aid the synthesis of metal oxides with true fidelity of replication. Also, the different applications of the metal oxides based on their surface interactions and binding characteristics show the versatility of metal oxide materials. Some of these results can form the basis from which further applications and protocols can be developed.
Abstract:
Establishing export operations is key to competitiveness for all producing companies in the high-tech industry. Distribution partnerships between an exporting producer and local distributors in the relevant foreign market are utilized by SMEs to gain cost-efficiency of operation. The purpose of this study was to investigate the Swiss market for outdoor lighting solutions and propose distribution channels for the case company, C2 SmartLight Ltd. The literature framework consists of three main parts: a description of distribution channels for business products, the selection process for distributors, and the management of distributors. The empirical part of this study consisted of an observation of the Swiss lighting market, highlighting key customers, energy-efficiency trends and key industry players in the lighting market. The aim was to identify potential distribution channels which reach the target customer groups and to identify the market opportunity. Secondly, data was collected through semi-structured phone interviews. A company which operates in the outdoor lighting business and has an established distributor in Switzerland was interviewed and used as a benchmark. As a result of this research, the market opportunity for the distribution of C2 SmartLight products was identified based on potential customers and market need. C2 SmartLight Ltd. should establish a connection with wholesalers that distribute electrical equipment that is easy to handle and store. The results of this study can be used by other SMEs operating in a similar field for the selection of distributors.
Abstract:
The construction of adenovirus vectors for cloning and foreign gene expression requires packaging cell lines that can complement missing viral functions caused by sequence deletions and/or replacement with foreign DNA sequences. In this study, packaging cell lines were designed to provide in trans the missing bovine adenovirus functions, so that recombinant viruses could be generated. Fetal bovine kidney and lung cells, acquired at the trimester term from a pregnant cow, were transfected with both digested wild type BAV2 genomic DNA and pCMV-E1. The plasmid pCMV-E1 was specifically constructed to express E1 of BAV2 under the control of the cytomegalovirus enhancer/promoter (CMV). Selection for "true" transformants by continuous passaging showed no success in isolating immortalised cells, since the cells underwent crisis resulting in complete cell death. Moreover, selection for G418 resistance, using the same cells, also did not result in the isolation of an immortalised cell line, and the same culture-collapse event was observed. The lack of success in establishing an immortalised cell line from fetal tissue prompted us to transfect a pre-established cell line. We began by transfecting MDBK (Madin-Darby bovine kidney) cells with pCMV-E1-neo, which contains the bacterial selectable marker neo gene. A series of MDBK-derived cell lines that constitutively express bovine adenoviral (BAV) early region 1 (E1) were then isolated. Cells selected for resistance to the drug G418 were isolated collectively for full characterisation to assess their suitability as packaging cell lines. Individual colonies were isolated by limiting dilution and further tested for E1 expression and efficiency of DNA uptake. Two cell lines, L-23 and L-24, out of 48 generated foci tested positive for E1 expression using Northern blot analysis. DNA uptake studies, using both the lipofectamine and calcium phosphate methods, were performed to compare these cells, their parental MDBK cells, and the unrelated human 293 cells as a benchmark. The results revealed that the new MDBK-derived clones were no more efficient than MDBK cells in the transient expression of transfected DNA and that they were inferior to 293 cells when using lacZ as the reporter gene. In view of the inherently poor transfection efficiency of MDBK cells and their derivatives, a number of other bovine cells were investigated for their potential as packaging cells. The cell line CCL40 was chosen for its high efficiency of DNA uptake and subsequently transfected with the plasmid vector pCMV-E1-neo. By selection with the drug G418, two cell lines were isolated, ProCell 1 and ProCell 2. These cell lines were tested for E1 expression, permissivity to BAV2 and DNA uptake efficiency, revealing a DNA uptake efficiency of 37%, comparable to that of CCL40. Attempts to rescue BAV2 mutants carrying the lacZ gene in place of E1 or E3 were carried out by co-transfecting wild type viral DNA with either the plasmid pdlE1E-Z (which contains BAV2 sequences from 0% to 40.4% with the lacZ gene in place of the E1 region from 1.1% to 8.25%) or the plasmid pdlE3-5-Z (which contains BAV2 sequences from 64.8% to 100% with the lacZ gene in place of the E3 region from 75.8% to 81.4%). These co-transfections did not result in the generation of a viral mutant. The lack of mutant generation was thought to be caused by the relative inefficiency of DNA uptake.
Consequently, cosBAV2, a cosmid vector carrying the BAV2 genome, was modified to carry the neo reporter gene in place of the E3 region from 75.8% to 81.4%. The use of a single cosmid vector carrying the whole genome would eliminate the need for homologous recombination in order to generate a viral vector. Unfortunately, the transfection of cosBAV2-neo also did not result in the generation of a viral mutant. This may have been caused by the size of the E3 deletion, where excess sequences essential to the virus's survival might have been deleted. As an extension to this study, the spontaneous E3 deletion, accidentally discovered in our viral stock, could be used as a site of foreign gene insertion.
Abstract:
In 2007, Barry Bonds hit his 756th home run, breaking Hank Aaron's all-time record for most home runs in a Major League career. While it would be expected that such an accomplishment would induce unending praise and adulation for the new record-holder, Bonds did not receive the treatment typically reserved for a beloved baseball hero. The purpose of this thesis is to assess media representations of the 2007 home run chase in order to shed light upon the factors which led to the mixed representations which accompanied Bonds' assault on Aaron's record. Drawing from Roland Barthes' concept of myth, this thesis proposes that Bonds was portrayed in predominantly negative ways because he was seen as failing to embody the values of baseball's mythology. Using a qualitative content analysis of three major American newspapers, this thesis examines portrayals of Bonds and how he was shown both to represent and oppose elements from baseball's mythology, such as youth and a distant, agrarian past. Recognizing the ways in which baseball is associated with American life, the media representations of Bonds are also evaluated to discern whether he was portrayed as personifying a distinctly American set of values. The results indicate that, in media coverage of the 2007 home run chase, Bonds was depicted as a player of many contradictions. Most commonly, Bonds' athletic ability and career achievements were contrasted with unflattering descriptions of his character, including discussions of his alleged use of performance-enhancing substances. However, some coverage portrayed Bonds as embodying baseball myth. The findings contribute to an appreciation of the importance of historical context in examining media representations. This understanding is enhanced by an analysis of a selection of articles on Mark McGwire's record-breaking season in 1998, and careful consideration of, and comparison to, the context under which Bonds performed in 2007. Findings are also shown to support the contemporary existence of a strong American baseball mythology. That Bonds is both condemned for failing to uphold the mythology and praised for personifying it suggests that the values seen as inherent to baseball continue to act as an American cultural benchmark.
Abstract:
The present thesis examines the determinants of the duration of bankruptcy protection for Canadian firms. Using a sample of Canadian firms that filed for bankruptcy protection between the calendar years 1992 and 2009, we find that firm age, the industry-adjusted operating margin, the default spread, the industrial production growth rate and the interest rate are influential factors in determining the length of the protection period. Older firms tend to stay longer under protection from creditors. As older firms have more complicated structures and issues to settle, their risk of exiting protection soon (the hazard rate) is small. We also find that firms that perform better than their benchmark, as measured by the industry they belong to, tend to leave the bankruptcy protection state quickly. We conclude that the fate of relatively successful companies is determined faster. Moreover, we report that it takes less time to achieve a final resolution for firms under bankruptcy protection when the default spread is low or when the appetite for risk is high. Conversely, during periods of high default spreads and flight to quality, it takes a longer time to resolve the bankruptcy issue. This last finding may suggest that troubled firms should place themselves under protection when spreads are low. However, this ignores the endogeneity issue: a high default spread may cause, and incidentally reflect, higher bankruptcy rates in the economy. Indeed, we find that bankruptcy protection lasts longer during economic downturns. We explain this relation by the natural increase in the default rate among firms (and individuals) during economically troubled times. Default spreads are usually larger during these harsh periods as investors become more risk averse while their wealth shrinks. Using a log-logistic hazard model, we also find that firms that file under the Companies' Creditors Arrangement Act (CCAA) spend a longer time restructuring than firms that file under the Bankruptcy and Insolvency Act (BIA). As the BIA is more statutory and less flexible, solutions can be reached faster by court orders.
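A duration model of the type estimated in the thesis can be sketched with the lifelines library's log-logistic accelerated-failure-time fitter. The covariate names and the synthetic data-generating process below are assumptions for illustration, not the thesis's specification or data.

```python
# Hedged sketch: log-logistic AFT fit of protection duration on firm and
# macro covariates. Data is simulated; the generating process is made up
# (longer durations for older firms, high spreads and CCAA filings, in
# the direction the abstract reports).
import numpy as np
import pandas as pd
from lifelines import LogLogisticAFTFitter

rng = np.random.default_rng(0)
n = 200
firm_age = rng.integers(1, 50, n)
default_spread = rng.uniform(0.5, 3.0, n)
filed_ccaa = rng.integers(0, 2, n)           # 1 = CCAA, 0 = BIA filing
scale = np.exp(1.5 + 0.02 * firm_age + 0.3 * default_spread
               + 0.5 * filed_ccaa)
df = pd.DataFrame({
    "duration_months": scale * rng.weibull(1.5, n),
    "exited": 1,                              # 1 = exit observed
    "firm_age": firm_age,
    "default_spread": default_spread,
    "filed_ccaa": filed_ccaa,
})

aft = LogLogisticAFTFitter()
aft.fit(df, duration_col="duration_months", event_col="exited")
aft.print_summary()  # coefficient signs indicate longer/shorter durations
```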
Abstract:
The main focus of this thesis is to evaluate and compare the Hyperball learning algorithm (HBL) to other learning algorithms. In this work HBL is compared to feed-forward artificial neural networks using back-propagation learning, K-nearest neighbour and ID3 algorithms. In order to evaluate the similarity of these algorithms, we carried out three experiments using nine benchmark data sets from the UCI machine learning repository. The first experiment compares HBL to the other algorithms as the sample size of the dataset changes. The second experiment compares HBL to the other algorithms as the dimensionality of the data changes. The last experiment compares HBL to the other algorithms according to their level of agreement with the data target values. In general, our observations showed that, taking classification accuracy as the measure, HBL performs as well as most ANN variants. Additionally, we also found that HBL's classification accuracy outperforms that of ID3 and K-nearest neighbour on the selected data sets.
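The comparison protocol can be approximated with scikit-learn stand-ins, as in the sketch below: an MLP for back-propagation networks, k-NN, and an entropy-based decision tree as an ID3-like learner. HBL itself has no standard library implementation and is omitted, and the dataset choice (a UCI-origin benchmark shipped with scikit-learn) is an assumption.

```python
# Sketch of a cross-validated accuracy comparison between the baseline
# learners named in the abstract. HBL is not included (no standard
# implementation); the dataset is a stand-in for the nine UCI sets.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

X, y = load_breast_cancer(return_X_y=True)
learners = {
    "backprop MLP": make_pipeline(StandardScaler(),
                                  MLPClassifier(max_iter=2000,
                                                random_state=0)),
    "k-NN (k=5)": make_pipeline(StandardScaler(), KNeighborsClassifier()),
    "ID3-like tree": DecisionTreeClassifier(criterion="entropy",
                                            random_state=0),
}
for name, clf in learners.items():
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold accuracy
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```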
Abstract:
Complex networks can arise naturally and spontaneously from all things that act as a part of a larger system. From the patterns of socialization between people to the way biological systems organize themselves, complex networks are ubiquitous, but they are currently poorly understood. A number of algorithms, designed by humans, have been proposed to describe the organizational behaviour of real-world networks, and breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications and the social sciences have recently resulted from them. These algorithms, called graph models, represent significant human effort. Deriving accurate graph models is non-trivial, time-intensive and challenging, and may only yield useful results for very specific phenomena. An automated approach can greatly reduce the human effort required and, if effective, provide a valuable tool for understanding the large decentralized systems of interrelated things around us. To the best of the author's knowledge, this thesis proposes the first method for the automatic inference of graph models for complex networks with varied properties, with and without community structure; it is also the first application of genetic programming to the automatic inference of graph models. The system and methodology were tested against benchmark data and shown to be capable of reproducing close approximations to well-known algorithms designed by humans. Furthermore, when used to infer a model for real biological data, the resulting model was more representative than models currently used in the literature.
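One building block of such a system, a fitness measure scoring how closely a candidate graph model's output matches a target network, can be sketched as below. The chosen properties and the example generators are assumptions, and the genetic-programming loop that would evolve the model programs is omitted.

```python
# Illustrative sketch: score a candidate graph model's output against a
# target network over a few global properties. In a full GP system this
# fitness would drive selection over evolved model programs.
import networkx as nx

def graph_fitness(candidate: nx.Graph, target: nx.Graph) -> float:
    """Lower is better: summed relative error over three network metrics."""
    def props(g):
        return (g.number_of_edges(),
                nx.average_clustering(g),
                nx.degree_pearson_correlation_coefficient(g))
    return sum(abs(c - t) / (abs(t) + 1e-9)
               for c, t in zip(props(candidate), props(target)))

# Stand-ins: a "real" scale-free target and one candidate model's output.
target = nx.barabasi_albert_graph(500, 3, seed=1)
candidate = nx.erdos_renyi_graph(500, 0.012, seed=1)
print(graph_fitness(candidate, target))
```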