37 results for Sharable Content Object Resource Model (SCORM)


Relevance:

30.00%

Publisher:

Abstract:

In this thesis, a model for managing the product data in a product transfer project was created for ABB Machines. This model was then applied to an ongoing product transfer project during its planning phase. Detailed information about the demands and challenges in product transfer projects was acquired by analyzing previous product transfer projects in the participating organizations. This analysis and the ABB Gate Model were then used as the basis for the creation of the model for managing the product data in a product transfer project. The created model shows the main tasks during each phase of the project, their sub-tasks and their relations at a general level. Furthermore, the model emphasizes the need for a detailed analysis of the situation during the project planning phase. The created model was applied to the ongoing project in two main areas: manufacturing instructions and production item data. The results showed that the greatest challenge for the product transfer project in these areas is the current state of the product data. Based on the findings, process and resource proposals were given both for the ongoing product transfer project and for BU Machines. For manufacturing instructions, it is necessary to create detailed process instructions in the receiving organization's own language for each department, so that the manufacturing instructions can be used as training material during the training in the sending organization. For production item data, the English version of the bill of materials needs to be entirely in English. In addition, it needs to be ensured that the bill of materials is updated and these changes are implemented before the training in the sending organization begins.

Relevance:

30.00%

Publisher:

Abstract:

In this study we used market settlement prices of European call options on stock index futures to extract the implied probability density function (PDF). The method produces a PDF of the returns of the underlying asset at the expiration date from the implied volatility smile. With this method, the lognormal distribution assumption of the Black-Scholes model is tested. The market view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach to implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (the S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A subjective view of the PDF that deviates from the market's can be used to form a trading strategy, as discussed in the last section.
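
The Breeden and Litzenberger (1978) result used above states that the risk-neutral density is the discounted second derivative of the call price with respect to the strike, f(K) = e^{rT} d²C/dK². Purely as an illustrative sketch of that extraction step (not the thesis code; the spot, rate, maturity and the toy quadratic smile below are hypothetical inputs):

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical inputs: spot, risk-free rate, time to expiry and a smoothed volatility smile.
S0, r, T = 100.0, 0.02, 0.5
K = np.linspace(60.0, 140.0, 401)
dK = K[1] - K[0]
smile = 0.20 + 0.2 * (K / S0 - 1.0) ** 2   # toy quadratic smile standing in for the fitted one

# Price calls across strikes from the smile, then apply Breeden-Litzenberger:
# f(K) = exp(r*T) * d^2 C / dK^2, here via a numerical second derivative.
C = bs_call(S0, K, T, r, smile)
density = np.exp(r * T) * np.gradient(np.gradient(C, dK), dK)

print("density integrates to ~", float(np.sum(density) * dK))  # should be close to 1
```

In the smoothing approach, the call prices would be generated from a smile fitted to standardized market quotes rather than the toy curve used here.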

Relevance:

30.00%

Publisher:

Abstract:

Current-day web search engines (e.g., Google) do not crawl and index a significant portion of the Web and, hence, web users relying on search engines alone are unable to discover and access a large amount of information in the non-indexable part of the Web. Specifically, dynamic pages generated from parameters provided by a user via web search forms (or search interfaces) are not indexed by search engines and cannot be found in searchers' results. Such search interfaces provide web users with online access to myriads of databases on the Web. In order to obtain information from a web database of interest, a user issues a query by specifying query terms in a search form and receives the query results, a set of dynamic pages that embed the required information from a database. At the same time, issuing a query via an arbitrary search interface is an extremely complex task for any kind of automatic agent, including web crawlers, which, at least up to the present day, do not even attempt to pass through web forms on a large scale. In this thesis, our primary object of study is a huge portion of the Web (hereafter referred to as the deep Web) hidden behind web search interfaces. We concentrate on three classes of problems around the deep Web: characterization of the deep Web, finding and classifying deep web resources, and querying web databases. Characterizing the deep Web: Though the term deep Web was coined in 2000, which is sufficiently long ago for any web-related concept or technology, we still do not know many important characteristics of the deep Web. Another matter of concern is that the surveys of the deep Web conducted so far are predominantly based on studies of deep web sites in English. One can therefore expect that the findings of these surveys may be biased, especially owing to the steady increase in non-English web content. In this way, surveying national segments of the deep Web is of interest not only to national communities but to the whole web community as well. In this thesis, we propose two new methods for estimating the main parameters of the deep Web. We use the suggested methods to estimate the scale of one specific national segment of the Web and report our findings. We also build and make publicly available a dataset describing more than 200 web databases from this national segment of the Web. Finding deep web resources: The deep Web has been growing at a very fast pace. It has been estimated that there are hundreds of thousands of deep web sites. Due to the huge volume of information in the deep Web, there has been significant interest in approaches that allow users and computer applications to leverage this information. Most approaches have assumed that search interfaces to the web databases of interest are already discovered and known to query systems. However, such assumptions do not hold true, mostly because of the large scale of the deep Web: indeed, for any given domain of interest there are too many web databases with relevant content. Thus, the ability to locate search interfaces to web databases becomes a key requirement for any application accessing the deep Web. In this thesis, we describe the architecture of the I-Crawler, a system for finding and classifying search interfaces. Specifically, the I-Crawler is intentionally designed to be used in deep web characterization studies and for constructing directories of deep web resources.
Unlike almost all other approaches to the deep Web so far, the I-Crawler is able to recognize and analyze JavaScript-rich and non-HTML searchable forms. Querying web databases: Retrieving information by filling out web search forms is a typical task for a web user. This is all the more so as the interfaces of conventional search engines are also web forms. At present, a user needs to manually provide input values to search interfaces and then extract the required data from the result pages. Manually filling out forms is cumbersome and not feasible for complex queries, but such queries are essential for many web searches, especially in the area of e-commerce. Thus, the automation of querying and retrieving data behind search interfaces is desirable and essential for tasks such as building domain-independent deep web crawlers and automated web agents, searching for domain-specific information (vertical search engines), and extracting and integrating information from various deep web resources. We present a data model for representing search interfaces and discuss techniques for extracting field labels, client-side scripts and structured data from HTML pages. We also describe a representation of result pages and discuss how to extract and store the results of form queries. In addition, we present a user-friendly and expressive form query language that allows one to retrieve information behind search interfaces and extract useful data from the result pages based on specified conditions. We implement a prototype system for querying web databases and describe its architecture and component design.
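
As a purely illustrative sketch of the kind of form automation described above — submitting a query through a web search interface and extracting data from the dynamic result page — the snippet below uses the requests and BeautifulSoup libraries; the URL, form field name and result-page markup are hypothetical placeholders, not the thesis prototype or its query language:

```python
import requests
from bs4 import BeautifulSoup

SEARCH_URL = "https://example.org/search"   # hypothetical search interface
QUERY_FIELD = "q"                           # hypothetical form field name

def query_web_database(term: str) -> list[str]:
    """Submit a query through a search form and extract result titles."""
    session = requests.Session()
    # Fetch the form page first (some interfaces require cookies or hidden fields).
    session.get(SEARCH_URL, timeout=10)

    # Submit the query as the form would.
    response = session.get(SEARCH_URL, params={QUERY_FIELD: term}, timeout=10)
    response.raise_for_status()

    # Extract result entries from the dynamic result page.
    soup = BeautifulSoup(response.text, "html.parser")
    return [a.get_text(strip=True) for a in soup.select("div.result a")]

if __name__ == "__main__":
    for title in query_web_database("circulating fluidized bed"):
        print(title)
```

A real deep web crawler would additionally have to discover such forms, extract field labels, and handle hidden fields and JavaScript, which is what motivates the I-Crawler described above.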

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to provide a business model framework that connects customer value to firm resources and explains the change logic of the business model. With strategic supply management, and especially dynamic value network management, as its scope, the dissertation is based on basic economic theories, transaction cost economics and the resource-based view. The main research question is how changing customer values should be taken into account when planning business in a networked environment. The main question is divided into sub-questions that form the basic research problems for the separate case studies presented in the five Publications. This research adopts the case study strategy, and the constructive research approach within it. The material consists of data from several Delphi panels and expert workshops, software pilot documents, company financial statements and investor relations information on the companies' web sites. The cases used in this study are a mobile multi-player game value network, smart phone and "Skype mobile" services, the business models of AOL, eBay, Google, Amazon and a telecom operator, a virtual city portal business system, and a multi-play offering. The main contribution of this dissertation is bridging the gap between firm resources and customer value. This has been done by theorizing the business model concept and connecting it to both the resource-based view and customer value. The thesis thereby contributes to the resource-based view, which deals with customer value and the firm resources needed to deliver that value but has a gap in explaining how changes in customer value should be connected to changes in key resources. The dissertation also provides tools and processes for analyzing the customer value preferences of ICT services, for constructing and analyzing business models and business concept innovation, and for conducting resource analysis.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to develop co-operation between the business units of a company operating in the graphic industry. The development was done by searching for synergy opportunities between these business units. The final aim was to form a business model based on the co-operation of these business units. The literature review of this thesis examines synergies and especially the process of searching for and implementing synergies. The concept of the business model and its components is also examined. The research was done using a qualitative research method. The main data acquisition method for the empirical part was theme interviews. The data were analyzed using thematisation and content analysis. The results of the study include seven identified possible synergies and a business model based on the co-operation of the business units. The synergy opportunities are evaluated and an implementation order for the synergies is suggested. The presented synergies create the basis for the proposed business model.

Relevance:

30.00%

Publisher:

Abstract:

Local features are used in many computer vision tasks, including visual object categorization, content-based image retrieval and object recognition, to mention a few. Local features are points, blobs or regions in images that are extracted using a local feature detector. To make use of the extracted local features, the localized interest points are described using a local feature descriptor. A descriptor histogram vector is a compact representation of an image and can be used for searching and matching images in databases. In this thesis, the performance of local feature detectors and descriptors is evaluated for the object class detection task. Features are extracted from image samples belonging to several object classes. Matching features are then searched for using random image pairs of the same class. The goal of this thesis is to find out which detector and descriptor methods are best for such a task in terms of detector repeatability and descriptor matching rate.
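
As a minimal sketch of the detect-describe-match pipeline evaluated in the thesis (illustrative only: the thesis compares several detector and descriptor methods, whereas this uses OpenCV's ORB with a brute-force matcher, and the image paths are placeholders):

```python
import cv2

# Two images assumed to contain objects of the same class (paths are placeholders).
img1 = cv2.imread("class_sample_1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("class_sample_2.png", cv2.IMREAD_GRAYSCALE)

# Detect interest points and compute descriptors (ORB here; other methods are drop-in alternatives).
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors with a brute-force Hamming matcher and keep matches passing the ratio test.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
good = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

# A simple matching rate: share of keypoints in the first image with a plausible correspondence.
rate = len(good) / max(len(kp1), 1)
print(f"{len(good)} good matches out of {len(kp1)} keypoints (matching rate {rate:.2%})")
```

Detector repeatability, in contrast, is typically measured on image pairs with a known geometric relation, which a random same-class pair does not provide.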

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises the extended character of representation. The human mind is not a passive receiver of external information, but actively constructs intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible in the Cartesian Subject-Object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not only as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which, classically understood, is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into an entire and coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse. Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism. Consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God.
The reaction against epistemological Cartesianism, metaphysics of substance and deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Relevance:

30.00%

Publisher:

Abstract:

Comprehensive understanding of the heat transfer processes that take place during circulating fluidized bed (CFB) combustion is one of the most important issues in CFB technology development. It enables the prediction, evaluation and proper design of combustion and heat transfer mechanisms. The aim of this thesis is to develop a model of circulating fluidized bed boiler operation. Empirical correlations are used for determining the heat transfer coefficients in each part of the furnace. The proposed model is used in both design and off-design conditions. In the off-design simulations, the effects of fuel moisture content and boiler load on boiler operation have been investigated. In the theoretical part of the thesis, the fuel properties of the most typical classes of biomass are reviewed extensively. Various schemes of biomass utilization are presented, especially those concerning circulating fluidized bed boilers. In addition, possible negative effects of biomass usage in boilers are briefly discussed.
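
The furnace-side heat transfer coefficient in such models is commonly taken as the sum of a convective and a radiative contribution, each obtained from an empirical correlation. Purely as a hypothetical illustration of that approach (the power-law correlation form and its coefficients below are placeholders, not the correlations used in the thesis model):

```python
STEFAN_BOLTZMANN = 5.670e-8  # W/(m^2 K^4)

def radiative_htc(t_bed: float, t_wall: float, emissivity: float = 0.85) -> float:
    """Radiative heat transfer coefficient between bed suspension and wall (temperatures in K)."""
    return emissivity * STEFAN_BOLTZMANN * (t_bed**4 - t_wall**4) / (t_bed - t_wall)

def convective_htc(suspension_density: float, a: float = 40.0, b: float = 0.5) -> float:
    """Hypothetical power-law correlation h_conv = a * rho_susp**b; coefficients are placeholders."""
    return a * suspension_density**b

def furnace_wall_htc(suspension_density: float, t_bed: float, t_wall: float) -> float:
    """Total furnace-wall heat transfer coefficient as the sum of convective and radiative parts."""
    return convective_htc(suspension_density) + radiative_htc(t_bed, t_wall)

# Example: a dilute upper-furnace zone at typical CFB combustion and wall temperatures.
print(f"h_total = {furnace_wall_htc(5.0, 1123.0, 623.0):.0f} W/(m2 K)")
```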

Relevance:

30.00%

Publisher:

Abstract:

This Master's thesis investigates the performance of the Olkiluoto 1 and 2 APROS model in the case of fast transients. The thesis includes a general description of the Olkiluoto 1 and 2 nuclear power plants and of the most important safety systems. The theoretical background of the APROS code as well as the scope and content of the Olkiluoto 1 and 2 APROS model are also described. The event sequences of the anticipated operational transients considered in the thesis are presented in detail, as they form the basis for the analysis of the APROS calculation results. The calculated fast operational transients comprise loss-of-load cases and two cases related to an inadvertent closure of one main steam isolation valve. As part of the thesis work, the inaccurate initial data values found in the original 1-D reactor core model were corrected. The input data needed for the creation of a more accurate 3-D core model were defined. The analysis of the APROS calculation results showed that while the main results were in good accordance with the measured plant data, differences were also detected. These differences were found to be caused by deficiencies and uncertainties in the calculation model. According to the results, the reactor core and the feedwater systems cause most of the differences between the calculated and measured values. Based on these findings, it will be possible to develop the APROS model further to make it a reliable and accurate tool for the analysis of operational transients and possible plant modifications.

Relevance:

30.00%

Publisher:

Abstract:

The capabilities, and thus the design complexity, of VLSI-based embedded systems have increased tremendously in recent years, riding the wave of Moore's law. Time-to-market requirements are also shrinking, imposing challenges on designers, who in turn seek to adopt new design methods to increase their productivity. In answer to these new pressures, modern-day systems have moved towards on-chip multiprocessing technologies. New on-chip multiprocessing architectures have emerged in order to utilize the tremendous advances in fabrication technology. Platform-based design is a possible solution to these challenges. The principle behind the approach is to separate the functionality of an application from the organization and communication architecture of the hardware platform at several levels of abstraction. The existing design methodologies pertaining to the platform-based design approach do not provide full automation at every level of the design process, and sometimes the co-design of platform-based systems leads to sub-optimal systems. In addition, the design productivity gap in multiprocessor systems remains a key challenge with existing design methodologies. This thesis addresses the aforementioned challenges and discusses the creation of a development framework for platform-based system design in the context of the SegBus platform, a distributed communication architecture. This research aims to provide automated procedures for platform design and application mapping. Structural verification support is also featured, thus ensuring correct-by-design platforms. The solution is based on a model-based process. Both the platform and the application are modeled using the Unified Modeling Language. The thesis develops a Domain Specific Language to support platform modeling based on a corresponding UML profile. Object Constraint Language constraints are used to support structurally correct platform construction. An emulator is introduced to allow performance estimation of the solution that is as accurate as possible at high abstraction levels. VHDL code is automatically generated, in the form of "snippets" to be employed in the arbiter modules of the platform, as required by the application. The resulting framework is applied to building an actual design solution for an MP3 stereo audio decoder application.
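
The structural verification relies on Object Constraint Language constraints attached to the UML platform model. Purely as an illustration of the kind of check such a constraint expresses (this is not the SegBus profile or its actual constraints; all names and limits are hypothetical), the same idea written as an executable check might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    """A bus segment of a hypothetical segmented-bus platform model."""
    name: str
    processing_elements: list[str] = field(default_factory=list)
    max_elements: int = 4  # placeholder capacity limit

@dataclass
class Platform:
    segments: list[Segment]
    bridges: list[tuple[str, str]]  # pairs of segment names joined by a bridge

def check_platform(platform: Platform) -> list[str]:
    """Return violations of simple OCL-like structural constraints (illustrative only)."""
    errors = []
    known = {seg.name for seg in platform.segments}
    for seg in platform.segments:
        if len(seg.processing_elements) > seg.max_elements:
            errors.append(f"{seg.name}: too many processing elements")
    for a, b in platform.bridges:
        if a not in known or b not in known:
            errors.append(f"bridge ({a}, {b}) references an unknown segment")
    return errors

platform = Platform(
    segments=[Segment("seg0", ["cpu0", "dsp0"]), Segment("seg1", ["cpu1"])],
    bridges=[("seg0", "seg1")],
)
print(check_platform(platform) or "platform is structurally consistent")
```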

Relevance:

30.00%

Publisher:

Abstract:

The usage of digital content, such as video clips and images, has increased dramatically during the last decade. Local image features have been applied increasingly in various image and video retrieval applications. This thesis evaluates local features and applies them to image and video processing tasks. The results of the study show that 1) the performance of different local feature detector and descriptor methods varies significantly in object class matching, 2) local features can be applied in image alignment with results superior to the state of the art, 3) the local feature based shot boundary detection method produces promising results, and 4) the local feature based hierarchical video summarization method points to a promising new research direction. In conclusion, this thesis presents local features as a powerful tool for many applications, and the immediate future work should concentrate on improving the quality of the local features.
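
As an illustrative sketch of a local-feature-based shot boundary detector of the kind evaluated in point 3 above (not the thesis implementation; the video path, ORB parameters and match threshold are placeholders), a boundary can be flagged when the number of feature matches between consecutive frames collapses:

```python
import cv2

VIDEO_PATH = "input.mp4"        # placeholder
MIN_MATCHES_WITHIN_SHOT = 20    # placeholder threshold

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

cap = cv2.VideoCapture(VIDEO_PATH)
ok, prev = cap.read()
if not ok:
    raise SystemExit(f"could not read {VIDEO_PATH}")
_, prev_des = orb.detectAndCompute(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), None)

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1
    _, des = orb.detectAndCompute(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), None)
    # Few surviving matches between consecutive frames suggests an abrupt content change.
    matches = matcher.match(prev_des, des) if prev_des is not None and des is not None else []
    if len(matches) < MIN_MATCHES_WITHIN_SHOT:
        print(f"possible shot boundary at frame {frame_idx}")
    prev_des = des
cap.release()
```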

Relevance:

30.00%

Publisher:

Abstract:

This applied linguistics study in the field of second language acquisition investigated the assessment practices of class teachers as well as the challenges and visions of language assessment in bilingual content instruction (CLIL) at primary level in Finnish basic education. Furthermore, pupils' and their parents' perceptions of language assessment and of LangPerform computer simulations as an alternative, modern assessment method in CLIL contexts were examined. The study was conducted for descriptive and developmental purposes in three phases: 1) a CLIL assessment survey; 2) simulation 1; and 3) simulation 2. Each phase had a varying number of participants. The population of this mixed methods study comprised CLIL class teachers, their pupils and the pupils' parents. The sampling was multi-staged and based on probability and random sampling. The data were triangulated. Altogether 42 CLIL class teachers nationwide, 109 pupils from the 3rd, 4th and 5th grades and 99 parents from two research schools in South-Western Finland participated in the CLIL assessment survey, followed by audio-recorded theme interviews of volunteers (10 teachers, 20 pupils and 7 parents). The simulation experimentations 1 and 2 produced 146 pupil and 39 parental questionnaires as well as video interviews of volunteering pupils. The data were analysed both quantitatively, using percentages and numerical frequencies, and qualitatively, employing thematic content analysis. Based on the data, language assessment in primary CLIL is not an established practice. It largely appears to be infrequent, incidental, implicit and based on impressions rather than evidence or the curriculum. The most used assessment methods were teacher observation, bilingual tests and dialogic interaction, and the least used were portfolios, simulations and peer assessment. Although language assessment was generally perceived as important by teachers, a fifth of them did not gather assessment information systematically, and 38% scarcely gave linguistic feedback to pupils. Both pupils and parents wished to receive more information on CLIL language issues; 91% of pupils claimed to receive feedback rarely or occasionally, and 63% of them wished to get more information on their linguistic coping in CLIL subjects. Of the parents, 76% wished to receive more information on the English proficiency of their children and their linguistic development. This may be a response to the indirect feedback practices identified in this study. There are several challenges related to assessment; the most notable is the lack of a CLIL curriculum, language objectives and common principles of assessment. Three distinct approaches to language in CLIL that appear to affect teachers' views on language assessment were identified: instrumental (language as a tool), dual (language as both a tool and an object of learning) and eclectic (miscellaneous views, e.g. affective factors prioritised). LangPerform computer simulations seem to be perceived as an appropriate alternative assessment method in CLIL. It is strongly recommended that the fundamentals of assessment (curricula and language objectives) and a mutual assessment scheme be determined and stakeholders' knowledge base of CLIL strengthened. The principles of adequate assessment in primary CLIL are identified, and several appropriate assessment methods are suggested.

Relevance:

30.00%

Publisher:

Abstract:

An augmented reality (AR) device must know the observer's location and orientation, i.e. the observer's pose, to be able to correctly register the virtual content to the observer's view. One possible way to determine and continuously follow the pose is model-based visual tracking. It assumes that a 3D model of the surroundings is known and that a video camera is fixed to the device. The pose is tracked by comparing the video camera image to the model. Each new pose estimate is usually based on the previous estimate. However, the first estimate must be found without a prior estimate, i.e. the tracking must be initialized, which in practice means that some features must be identified in the image and matched to model features. This is known in the literature as the model-to-image registration problem or the simultaneous pose and correspondence problem. This report reviews visual tracking initialization methods that are suitable for visual tracking in a shipbuilding environment where the ship CAD model is available. The environment is complex, which makes initialization non-trivial. The report has been done as part of the MARIN project.
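
Once initialization has produced a set of model-to-image correspondences, the pose itself is typically recovered with a PnP solver. As a minimal, purely illustrative sketch (the model points, camera intrinsics and the synthetic "true" pose below are placeholders, not part of the report), using OpenCV:

```python
import numpy as np
import cv2

# Placeholder 3D points from the model (e.g., distinctive corners of the CAD model), in metres.
model_points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0],
                         [0.0, 1.0, 0.0], [0.5, 0.5, 0.5], [0.2, 0.8, 0.3]])

# Placeholder pinhole camera intrinsics; no lens distortion assumed.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# Synthesize 2D observations by projecting the model points with a known "true" pose,
# standing in for the correspondences that an initialization method would produce.
true_rvec = np.array([0.1, -0.2, 0.05])
true_tvec = np.array([0.1, -0.1, 5.0])
image_points, _ = cv2.projectPoints(model_points, true_rvec, true_tvec, camera_matrix, dist_coeffs)
image_points = image_points.reshape(-1, 2)

# Recover the camera pose robustly from the 2D-3D correspondences.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(model_points, image_points, camera_matrix, dist_coeffs)
if ok:
    print("estimated rotation vector:", rvec.ravel())
    print("estimated translation:   ", tvec.ravel())
```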

Relevance:

30.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

From the beginning of 2016, a landfill ban on organic waste enters into force in Finland, strongly restricting the disposal of biodegradable and other organic material in landfills. The aim of the ban is to direct organic waste streams to recovery treatment and to reduce the environmental impacts of landfilling. In preparation for the landfill ban on organic waste, the objective of this Master's thesis was to create an operating model for the reception, inspection and treatment of waste loads at the Keltakangas waste centre. In the operating model to be introduced in 2016, only industrial waste proven acceptable for landfilling, asbestos waste, inorganic fractions, and unrecoverable rejects from the sorting or mechanical treatment of mixed waste will be landfilled. Several waste types currently landfilled will be directed to alternative treatment. According to the operating model, all mixed waste loads received at the waste centre are directed to inspection, pre-treatment and, when necessary, to a mechanical sorting line instead of landfilling. A problematic fraction is unrecoverable PVC plastic, which increases the organic matter content of the rejects. As long as no recovery use exists for PVC, it will probably be landfilled with the rejects from waste treatment under an exemption permit. Previously, 70–80 % of mixed waste has been landfilled, but the new sorting line makes it possible to greatly increase the amount of waste treated. In the testing phase of the operating model, 11 % of the received mixed loads were classified as household-like waste suitable for energy recovery, 69 % as waste to be treated on the sorting line, and 20 % as unrecoverable landfill waste. With the new operating model, the total amount of waste sent to landfill can be halved as the treatment of mixed waste is increased and several other waste types are directed to alternative treatment.