913 results for Data-driven analysis


Relevance: 80.00%

Abstract:

The aim of this Master's thesis was to design and implement a system for analysing the steering effects of efficiency measurement in the electricity network business. Network operation is a monopoly business in which there is no competition-driven pressure to keep operations efficient and prices low. For this reason, the pricing and operational efficiency of the network business must be supervised by the regulator. The method chosen for efficiency measurement is Data Envelopment Analysis (DEA). This thesis presents the theoretical foundations of the DEA method and the problems observed in efficiency measurement of the network business. On this basis, the properties required of the analysis system were specified and the system was developed. The most important features of the system turned out to be sensitivity analysis, in particular the calculation of the cost of supply interruptions carried out through it, and the options for restricting the weights. The final part of the thesis presents concrete results obtained from the system, used to illustrate its possible applications.
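The DEA method named in this abstract is, at its core, a linear program solved once per decision-making unit. As a rough illustration only (the thesis system adds weight restrictions and interruption-cost calculations not shown here, and all names below are mine), a minimal input-oriented CCR model can be sketched as:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores.
    X: (n, m) inputs, Y: (n, s) outputs for n decision-making units."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision vector z = [theta, lambda_1 .. lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0                               # minimise theta
        # input constraints: sum_j lambda_j x_j <= theta * x_o
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # output constraints: sum_j lambda_j y_j >= y_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.concatenate([np.zeros(m), -Y[o]])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)
```

For example, a unit producing the same output as another from twice the inputs receives a score of 0.5, flagging it as only half as efficient as its peer.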

Relevance: 80.00%

Abstract:

Purpose The purpose of this paper is (1) to measure school technical efficiency and (2) to identify the determinants of primary school performance. Design/methodology/approach A two-stage Data Envelopment Analysis (DEA) of school efficiency is conducted. At the first stage, DEA is employed to calculate an individual efficiency score for each school. At the second stage, efficiency is regressed on school characteristics and environmental variables. Findings The mean technical efficiency of schools in the State of Geneva is equal to 93%. By improving the operation of schools, 7% (100 - 93) of inputs could be saved, representing 17'744'656.2 Swiss francs in 2010. School efficiency is negatively influenced by (1) operations being held on multiple sites, (2) the proportion of disadvantaged pupils enrolled at the school and (3) the provision of special education, but positively influenced by school size (captured by the number of pupils). Practical implications Technically, the determinants of school efficiency are outside the control of the headteachers. However, it is still possible to either boost the positive impact or curb the negative impact. Potential actions are discussed. Originality/value Unlike most similar studies, the model in this study is tested for multicollinearity, heteroskedasticity and endogeneity. It is therefore robust. Moreover, one explanatory variable of school efficiency (operations being held on multiple sites) is a truly original variable, as it has never been tested before.
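The second stage described above regresses the DEA scores on school characteristics. The sketch below illustrates the idea on synthetic data with plain OLS; the paper's actual specification, variables and robustness checks (multicollinearity, heteroskedasticity, endogeneity) are not reproduced, and all numbers here are invented:

```python
import numpy as np

# Hypothetical second-stage data: one efficiency score per school
# plus school characteristics (all values are synthetic).
rng = np.random.default_rng(0)
n = 50
multi_site = rng.integers(0, 2, n)            # operations on multiple sites
disadvantaged = rng.uniform(0, 0.5, n)        # share of disadvantaged pupils
size = rng.uniform(100, 600, n)               # number of pupils
eff = (0.93 - 0.03 * multi_site - 0.10 * disadvantaged
       + 0.0001 * size + rng.normal(0, 0.01, n))

# Second stage: efficiency regressed on the school characteristics.
Xmat = np.column_stack([np.ones(n), multi_site, disadvantaged, size])
beta, *_ = np.linalg.lstsq(Xmat, eff, rcond=None)
```

With these synthetic coefficients, the fitted signs mirror the paper's findings: multi-site operation and the share of disadvantaged pupils depress efficiency, while school size raises it.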

Relevance: 80.00%

Abstract:

Interactions between stimuli's acoustic features and experience-based internal models of the environment enable listeners to compensate for the disruptions in auditory streams that are regularly encountered in noisy environments. However, whether auditory gaps are filled in predictively or restored a posteriori remains unclear. The current lack of positive statistical evidence that internal models can actually shape brain activity as real sounds do precludes accepting predictive accounts of the filling-in phenomenon. We investigated the neurophysiological effects of internal models by testing whether single-trial electrophysiological responses to omitted sounds in a rule-based sequence of tones with varying pitch could be decoded from the responses to real sounds, and by analyzing the ERPs to the omissions with data-driven electrical neuroimaging methods. Decoding of the brain responses to the different expected, but omitted, tones was above chance in both passive and active listening conditions when based on the responses to real sounds in active listening conditions. Topographic ERP analyses and electrical source estimations revealed that, in the absence of any stimulation, experience-based internal models elicit electrophysiological activity different from noise and that the temporal dynamics of this activity depend on attention. We further found that the expected change in pitch direction of omitted tones modulated the activity of left posterior temporal areas 140-200 msec after the onset of the omissions. Collectively, our results indicate that, even in the absence of any stimulation, internal models modulate brain activity as real sounds do, indicating that auditory filling-in can be accounted for by predictive activity.
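The single-trial decoding described above (classifying responses to omitted tones using a model trained on responses to real sounds) can be illustrated, in a deliberately simplified form, with a nearest-centroid decoder on synthetic data; this is not the authors' actual pipeline, and all signal parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_features = 40, 20        # trials x (electrodes * timepoints)
# Hypothetical response templates for two tone pitches
templates = rng.normal(0, 1, (2, n_features))

def make_trials(label, n):
    """Noisy single-trial responses around a pitch template."""
    return templates[label] + rng.normal(0, 0.5, (n, n_features))

# "Real sound" trials used to learn one centroid per pitch class
train = np.vstack([make_trials(0, n_trials), make_trials(1, n_trials)])
train_y = np.repeat([0, 1], n_trials)
centroids = np.array([train[train_y == k].mean(axis=0) for k in (0, 1)])

# "Omission" trials decoded against the real-sound centroids
test = np.vstack([make_trials(0, 10), make_trials(1, 10)])
test_y = np.repeat([0, 1], 10)
pred = np.argmin(((test[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == test_y).mean()
```

Above-chance accuracy on the held-out "omission" trials is the kind of positive evidence the paper reports for internal models shaping brain activity.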

Relevance: 80.00%

Abstract:

The human auditory cortex comprises the supratemporal plane and large parts of the temporal and parietal convexities. We have investigated the relevant intrahemispheric cortico-cortical connections using in vivo DSI tractography combined with landmark-based registration, automatic cortical parcellation and whole-brain structural connection matrices in 20 right-handed male subjects. On the supratemporal plane, the pattern of connectivity was related to the architectonically defined early-stage auditory areas. It revealed a three-tier architecture characterized by a cascade of connections from the primary auditory cortex to six adjacent non-primary areas and from there to the superior temporal gyrus. Graph theory-driven analysis confirmed the cascade-like connectivity pattern and demonstrated a strong degree of segregation and hierarchy within early-stage auditory areas. Putative higher-order areas on the temporal and parietal convexities had more widely spread local connectivity and long-range connections with the prefrontal cortex; analysis of optimal community structure revealed five distinct modules in each hemisphere. The pattern of temporo-parieto-frontal connectivity was partially asymmetrical. In conclusion, the human early-stage auditory cortical connectivity, as revealed by in vivo DSI tractography, has strong similarities with that of non-human primates. The modular architecture and hemispheric asymmetry in higher-order regions are compatible with segregated processing streams and lateralization of cognitive functions.
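The "analysis of optimal community structure" mentioned above typically optimizes Newman's modularity Q over candidate partitions of the connection matrix. A minimal sketch of the modularity computation itself (not the optimization, and not the authors' connectome pipeline) on a toy graph:

```python
import numpy as np

def modularity(A, communities):
    """Newman modularity Q for an undirected adjacency matrix A
    and one community label per node."""
    k = A.sum(axis=1)                       # node degrees
    two_m = k.sum()                         # twice the edge count
    same = np.equal.outer(communities, communities)
    return ((A - np.outer(k, k) / two_m) * same).sum() / two_m

# Two 3-cliques joined by a single edge: a clearly modular toy graph
A = np.zeros((6, 6))
A[:3, :3] = 1
A[3:, 3:] = 1
np.fill_diagonal(A, 0)
A[2, 3] = A[3, 2] = 1
labels = np.array([0, 0, 0, 1, 1, 1])      # the "correct" partition
```

Partitions that cut across the cliques score lower Q than the natural two-module split, which is what a community-structure search exploits.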

Relevance: 80.00%

Abstract:

A multivariate curve resolution method, the generalized rank annihilation method (GRAM), is discussed and tested with simulated and experimental data. The analysis of simulated data provides general guidelines concerning the conditions for uniqueness of a solution to a given problem. The second-order emission-excitation spectra of human and animal dental calculus deposits were used as experimental data to estimate the performance of the method. Three porphyrinic spectral profiles, for both human and cat, were obtained by the use of GRAM.
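GRAM resolves two bilinear (second-order) data matrices that share the same spectral profiles by projecting them onto a common factor space and solving a generalized eigenvalue problem, whose eigenvalues are the concentration ratios between the two samples. A minimal sketch on simulated emission-excitation data (profile shapes and concentrations are invented):

```python
import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(2)
# Hypothetical excitation (X) and emission (Y) profiles of two fluorophores
X = np.abs(rng.normal(size=(30, 2)))
Y = np.abs(rng.normal(size=(25, 2)))
c1 = np.array([1.0, 1.0])          # concentrations in calibration sample
c2 = np.array([2.0, 0.5])          # concentrations in unknown sample
M1 = X @ np.diag(c1) @ Y.T         # bilinear second-order data matrices
M2 = X @ np.diag(c2) @ Y.T

# Joint truncated SVD defines the common two-component factor space
U, s, Vt = np.linalg.svd(M1 + M2)
U, V = U[:, :2], Vt[:2].T
T1, T2 = U.T @ M1 @ V, U.T @ M2 @ V

# Generalized eigenproblem T2 w = lambda T1 w:
# the eigenvalues are the concentration ratios c2 / c1 per component
ratios = np.sort(np.real(eig(T2, T1)[0]))
```

On this noiseless simulation the recovered ratios are exactly 0.5 and 2.0, i.e. the planted c2/c1 values; with real spectra, noise and rank choice govern how well the uniqueness conditions discussed in the paper are met.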

Relevance: 80.00%

Abstract:

We report a structural study on polycrystalline La0.86Sr0.14Mn1-yCuyO3+δ samples (y = 0, 0.05, 0.10, 0.15, 0.20) using refinement of powder X-ray diffraction data and analysis of scanning electron microscopy images. It is found that the structure remains rhombohedral throughout the whole series, with a decrease in the average Mn-Mn bond distances, slight variations in the Mn-O-Mn angle and a reduction in the unit cell volume with increasing Cu content. The values of the Mn-Mn distances suggest compact structures with d within ±1%. Scanning electron microscopy images reveal a homogeneous microstructure in all samples, as well as a trend toward smaller grains and larger porosity with increasing Cu content.

Relevance: 80.00%

Abstract:

In a continuing investigation of potentially bioactive natural products, flavonoids were isolated from Lonchocarpus araripensis (Leguminosae) and identified as 3-methoxy-6-O-prenyl-6'',6''-dimethylchromene-[7,8,2'',3'']-flavone (1), 3,6-dimethoxy-6'',6''-dimethylchromene-[7,8,2'',3'']-flavone (2) and 3,5,8-trimethoxy-[6,7,2'',3'']-furanoflavone (3). This is the first time compound 3 has been described. Compound 2 has been previously isolated from the roots, while this is the first time compound 1 is reported in this species. Complete NMR assignments are given for 1, 2 and 3, together with the determination of the conformation of 1.

Relevance: 80.00%

Abstract:

Robotic grasping has been studied increasingly over the past few decades. While progress has been made in this field, robotic hands are still nowhere near the capability of human hands. However, in the past few years, the increase in computational power and the availability of commercial tactile sensors have made it easier to develop techniques that exploit the feedback from the hand itself, the sense of touch. The focus of this thesis lies in the use of this sense. The work described in this thesis approaches robotic grasping from two different viewpoints: robotic systems and data-driven grasping. The robotic systems viewpoint describes a complete architecture for the act of grasping and, to a lesser extent, more general manipulation. Two central claims that the architecture was designed for are hardware independence and the use of sensors during grasping. These properties enable the use of multiple different robotic platforms within the architecture. Secondly, new data-driven methods are proposed that can be incorporated into the grasping process. The first of these methods is a novel way of learning grasp stability from the tactile and haptic feedback of the hand instead of analytically solving the stability from a set of known contacts between the hand and the object. By learning from the data directly, there is no need to know the properties of the hand, such as its kinematics, which enables the method to be utilized with complex hands. The second novel method, probabilistic grasping, combines the fields of tactile exploration and grasp planning. By employing well-known statistical methods and pre-existing knowledge of an object, object properties, such as pose, can be inferred with an associated uncertainty. This uncertainty is utilized by a grasp planning process which plans for stable grasps under the inferred uncertainty.
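Learning grasp stability directly from labelled tactile feedback, as proposed above, amounts to fitting a classifier on feedback features without any hand model. A deliberately simple sketch with logistic regression on synthetic data (the thesis' actual features, sensors and learning method are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical tactile features per grasp attempt (e.g. mean pressure,
# contact area, finger-torque summaries); label 1 = the grasp held.
n, d = 200, 4
Xf = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 1.0, 0.5])          # invented ground truth
y = (Xf @ w_true + rng.normal(0, 0.5, n) > 0).astype(float)

# Learn stability directly from labelled feedback: no kinematics needed.
w = np.zeros(d)
for _ in range(500):                              # plain gradient ascent
    p = 1.0 / (1.0 + np.exp(-(Xf @ w)))           # predicted P(stable)
    w += 0.1 * Xf.T @ (y - p) / n                 # log-likelihood gradient
acc = (((Xf @ w) > 0).astype(float) == y).mean()
```

Because the classifier sees only sensor features and outcomes, the same procedure transfers to hands whose kinematics are too complex to model analytically, which is the point made in the abstract.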

Relevance: 80.00%

Abstract:

This Master's thesis examined alliance activity in the global telecommunications sector in 2000-2010. The aim of the study was to examine, with quantitative methods, the effect of firm-specific and macroeconomic uncertainty on the structure and form of the alliances formed and on the geographical location of the parties. In addition, the aim was to study how the annual number of alliances and the number of participating firms change with variations in uncertainty. The empirical body of the thesis consisted of secondary data from the SDC Platinum and Thomson Datastream databases. The final dataset included the 50 largest telecommunications companies in the world, from several different countries. The statistical analysis was carried out with logistic and panel data regression. Of the five hypotheses of the thesis, only two were partially confirmed. These hypotheses assumed a negative effect of increased uncertainty on the popularity of vertical and domestic alliances in the eyes of the firm. The other regression models produced contradictory and statistically non-significant results.

Relevance: 80.00%

Abstract:

The main objective of this Master's thesis is to examine whether an optimal size class can be found for municipalities in Finland. Further objectives are to examine whether a corresponding optimal size can also be found for two municipal functions, here chosen as social and health services and basic education. In addition, the thesis examines how large the efficiency differences between regions are. The research methods used are calculations, case analysis, financial statement analysis and statistical methods. Efficiency was measured in four areas: efficiency with respect to the operating margin, efficiency with respect to operating costs, the efficiency of social and health services, and the efficiency of basic education. The efficiency figures obtained in determining the optimal size were combined with municipality-specific operating environment factors such as unemployment and the age distribution of the population. The conclusion of the thesis was that the optimal municipality size in Finland would be about 35 thousand inhabitants, although the differences within the range of 25-40 thousand inhabitants were small. Based on cost data, the most inefficient size class was municipalities with fewer than 3 thousand inhabitants, but after taking the operating environment factors into account, the largest municipalities with more than 120 thousand inhabitants proved to be the most inefficient.

Relevance: 80.00%

Abstract:

The aim of this thesis is to examine whether pricing anomalies exist in the Finnish stock markets by comparing the performance of quantile portfolios that are formed on the basis of either individual valuation ratios, composite value measures or combined value and momentum indicators. All the research papers included in the thesis show evidence of value anomalies in the Finnish stock markets. In the first paper, the sample of stocks over the 1991-2006 period is divided into quintile portfolios based on four individual valuation ratios (i.e., E/P, EBITDA/EV, B/P, and S/P) and three hybrids of them (i.e. composite value measures). The results show the superiority of composite value measures as a selection criterion for value stocks, particularly when EBITDA/EV is employed as the earnings multiple. The main focus of the second paper is on the impact of the holding period length on the performance of value strategies. As an extension to the first paper, two more individual ratios (i.e. CF/P and D/P) are included in the comparative analysis. The sample of stocks over the 1993-2008 period is divided into tercile portfolios based on six individual valuation ratios and three hybrids of them. The use of either the dividend yield criterion or one of the three composite value measures examined results in the best value portfolio performance according to all performance metrics used. Parallel to the findings of many international studies, our results from performance comparisons indicate that for the sample data employed, the yearly reformation of portfolios is not necessarily optimal in order to gain maximally from the value premium. Instead, the value investor may extend his holding period up to 5 years without any decrease in long-term portfolio performance. The same holds also for the results of the third paper, which examines the applicability of the data envelopment analysis (DEA) method in discriminating undervalued stocks from overvalued ones.
The fourth paper examines the added value of combining price momentum with various value strategies. Taking account of price momentum improves the performance of value portfolios in most cases. The performance improvement is greatest for value portfolios that are formed on the basis of the 3-composite value measure, which consists of the D/P, B/P and EBITDA/EV ratios. The risk-adjusted performance can be enhanced further by following a 130/30 long-short strategy in which the long position in value winner stocks is leveraged by 30 percentage points while simultaneously selling short glamour loser stocks by the same amount. The average return of the long-short position proved to be more than double the stock market average, coupled with a decrease in volatility. The fifth paper offers a new approach to combining value and momentum indicators into a single portfolio-formation criterion using different variants of DEA models. The results throughout the 1994-2010 sample period show that the top-tercile portfolios outperform both the market portfolio and the corresponding bottom-tercile portfolios. In addition, the middle-tercile portfolios also outperform the comparable bottom-tercile portfolios when DEA models are used as a basis for stock classification criteria. To my knowledge, such strong performance differences have not been reported in earlier peer-reviewed studies that have employed the comparable quantile approach of dividing stocks into portfolios. Consistently with the previous literature, the division of the full sample period into bullish and bearish periods reveals that the top-quantile DEA portfolios lose far less of their value during bearish conditions than do the corresponding bottom portfolios. The sixth paper extends the sample period employed in the fourth paper by one year (i.e. 1993-2009), covering also the first years of the recent financial crisis. It contributes to the fourth paper by examining the impact of stock market conditions on the main results.
Consistently with the fifth paper, value portfolios lose much less of their value during bearish conditions than do stocks on average. The inclusion of a momentum criterion somewhat adds value to an investor during bullish conditions, but this added value turns negative during bearish conditions. During bear market periods some of the value loser portfolios perform even better than their value winner counterparts. Furthermore, the results show that the recent financial crisis has reduced the added value of using combinations of momentum and value indicators as portfolio formation criteria. However, since the stock markets have historically been bullish more often than bearish, the combination of the value and momentum criteria has paid off for the investor despite the fact that its added value during bearish periods is negative on average.
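The quantile portfolio formation used throughout the thesis can be illustrated with a toy cross-section: rank stocks on each valuation ratio, average the ranks into a composite value measure, and split the ordering into terciles. The sketch below uses invented data and is not the thesis' actual methodology or stock universe:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 30                                  # hypothetical cross-section of stocks
# Valuation ratios (higher = cheaper stock): columns for D/P, B/P, EBITDA/EV
ratios = rng.uniform(0.01, 0.3, (n, 3))

# Composite value measure: average of the cross-sectional ranks
ranks = ratios.argsort(axis=0).argsort(axis=0)
composite = ranks.mean(axis=1)

# Tercile portfolios: highest composite = value, lowest = glamour
order = np.argsort(composite)
glamour, middle, value = np.array_split(order, 3)
```

The thesis then compares the realized returns of the `value` and `glamour` terciles; rank averaging is one common way to build a composite, chosen here for simplicity.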

Relevance: 80.00%

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 80.00%

Abstract:

As technology has developed, the amount of data produced and collected from the business environment has increased. Over 80% of that data includes some sort of reference to geographical location. Individuals have used that information by utilizing Google Maps or different GPS devices, but such information has remained largely unexploited in business. This thesis studies the use and utilization of geographically referenced data in capital-intensive business, first providing theoretical insight into how data and data-driven management enable and enhance business, and how geographically referenced data in particular adds value to a company, and then examining empirical case evidence of how geographical information can truly be exploited in capital-intensive business and what the value-adding elements of geographical information are. The study contains semi-structured interviews that are used to survey the attitudes and beliefs of an organization towards geographic information and to discover fields of application for a geographic information system within the case company. Additionally, geographical data is tested in order to illustrate how the data could be used in practice. Finally, the outcome of the thesis provides an understanding of the elements of which the added value of geographical information in business consists, and of how such data can be utilized in the case company and in capital-intensive business more generally.

Relevance: 80.00%

Abstract:

Human activity recognition in everyday environments is a critical but challenging task in Ambient Intelligence applications to achieve proper Ambient Assisted Living, and key challenges still remain to be dealt with to realize robust methods. One of the major limitations of Ambient Intelligence systems today is the lack of semantic models of the activities in the environment, so that the system can recognize the specific activity being performed by the user(s) and act accordingly. In this context, this thesis addresses the general problem of knowledge representation in Smart Spaces. The main objective is to develop knowledge-based models, equipped with semantics, to learn, infer and monitor human behaviours in Smart Spaces. Moreover, it is easy to recognize that some aspects of this problem have a high degree of uncertainty, and therefore the developed models must be equipped with mechanisms to manage this type of information. A fuzzy ontology and a semantic hybrid system are presented to allow modelling and recognition of a set of complex real-life scenarios where vagueness and uncertainty are inherent to the human nature of the users that perform them. The handling of uncertain, incomplete and vague data (i.e., missing sensor readings and activity execution variations, since human behaviour is non-deterministic) is approached for the first time through a fuzzy ontology validated in real-time settings within a hybrid data-driven and knowledge-based architecture. The semantics of activities, sub-activities and real-time object interaction are taken into consideration. The proposed framework consists of two main modules: the low-level sub-activity recognizer and the high-level activity recognizer. The first module detects sub-activities (i.e., actions or basic activities) that take input data directly from a depth sensor (Kinect).
The main contribution of this thesis tackles the second component of the hybrid system, which lies on top of the previous one at a higher level of abstraction, acquires its input from the first module's output, and executes ontological inference to provide users, activities and their influence on the environment with semantics. This component is thus knowledge-based, and a fuzzy ontology was designed to model the high-level activities. Since activity recognition requires context-awareness and the ability to discriminate among activities in different environments, the semantic framework allows for modelling common-sense knowledge in the form of a rule-based system that supports expressions close to natural language in the form of fuzzy linguistic labels. The framework's advantages have been evaluated with a challenging new public dataset, CAD-120, achieving an accuracy of 90.1% and 91.1% for low- and high-level activities, respectively. This entails an improvement over both entirely data-driven approaches and merely ontology-based approaches. As an added value, for the system to be sufficiently simple and flexible to be managed by non-expert users, and thus facilitate the transfer of research to industry, a development framework composed of a programming toolbox, a hybrid crisp and fuzzy architecture, and graphical models to represent and configure human behaviour in Smart Spaces was developed in order to give the framework more usability in the final application. As a result, human behaviour recognition can help assist people with special needs, for example in healthcare, independent elderly living, remote rehabilitation monitoring, industrial process guideline control, and many other cases. This thesis shows use cases in these areas.
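The fuzzy linguistic labels mentioned above map crisp sensor values to graded memberships in terms close to natural language. A minimal sketch with triangular membership functions (the label names and breakpoints below are invented, not taken from the thesis' ontology):

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership with support (a, c) and peak at b.
    Values at or outside the support get membership 0."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic labels for 'duration of an object interaction',
# in seconds: each label is (a, b, c) for its triangular membership.
labels = {
    "short":  (0, 5, 12),
    "medium": (8, 15, 22),
    "long":   (18, 25, 40),
}

def fuzzify(x):
    """Membership degree of a crisp value x in every linguistic label."""
    return {name: triangular(x, *abc) for name, abc in labels.items()}
```

A 10-second interaction, for instance, is partly "short" and partly "medium" at the same time, which is exactly the vagueness a crisp rule system cannot express and a fuzzy rule base can reason over.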

Relevance: 80.00%

Abstract:

Over the last 30 years, new technologies and globalization have radically changed the way in which marketing is conducted. However, whereas their effects on business in general have been widely discussed, their effects on marketing remain without clear recognition. Global research has been conducted to shed light on the issue, but it has largely concentrated on the views of executives as well as on consumer markets. In addition, a research gap exists in applying the concept of marketing change to a specific business-to-business (B2B) industry. Therefore, the main research question this study seeks to answer is: "How is contemporary marketing conducted in the high-technology industry?" In this research, the researcher considers the specific industry of high technology. However, as the industry is comprised of differing markets, the focus will be given to one of the industry's prime sectors: the information technology (IT) markets, where companies offer other firms products or services manufactured with advanced technology. The growing IT market is considered of critical importance in the economies of technologically ready countries such as Finland, where this research is also conducted. Through multiple case studies the researcher aims to describe how changes in technology, customer engagement and future trends have shaped the way in which successful high-tech marketing is conducted in today's marketplace. Then, results derived from the empirical research are presented to the reader with links to existing literature. As a conclusion, a generalized framework is constructed to depict an ideal marketer-customer relationship, with emphasis on dynamic, two-way communication and its supporting elements of customer analytics, change adaptation, strategic customer communication and organizational support.
From a managerial point of view, the research may provide beneficial information, as contemporary marketing can yield profitable outcomes if managed correctly. As a new way to grasp competitive advantage, strategic marketing is much more data-driven and customer-focused than ever before. The study can also prove relevant for academic communities, as its results may inspire a new focus on the education of future marketers. This study was limited to the internal activities within the high-tech industry, leaving out considerations of co-marketing, marketing via business partners and marketing in other B2B industries.