899 results for information bottleneck method
Abstract:
The main objective of the study was to examine how Fortum's tax reporting system could be developed so that it collects the required information in a form that is easily transferable to the financial statements. This included examining the disclosure requirements for income taxes under IFRS and US GAAP. By benchmarking selected Finnish, European and US companies, the aim was to gain perspective on the extent to which they present tax information in their financial statements. The existence of material weaknesses was also examined. The research method was qualitative, descriptive and normative. The research material comprised articles and literature on tax reporting and the standards related to it; the interviews conducted were of notable significance. The study showed that Fortum's tax reporting is in good shape and does not require major changes. The biggest renewal of the tax reporting system is that there is now a single model for all of Fortum's companies. The system is also more automated, quicker and more efficient, and its format more closely resembles the notes to the financial statements. In addition, it has more internal controls to improve the quality and efficiency of the reporting process.
Abstract:
In this paper, a colour texture segmentation method which unifies region and boundary information is proposed. The algorithm uses a coarse detection of the perceptual (colour and texture) edges of the image to adequately place and initialise a set of active regions. The colour texture of regions is modelled by combining non-parametric kernel density estimation (which allows the colour behaviour to be estimated) with classical co-occurrence matrix based texture features. Region information is thereby defined, and accurate boundary information can be extracted to guide the segmentation process. Regions then compete concurrently for the image pixels in order to segment the whole image, taking both information sources into account. Finally, experimental results are presented which demonstrate the performance of the proposed method.
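The two modelling ingredients named above, kernel density estimation for the colour distribution and co-occurrence matrices for texture, can be sketched roughly as follows. This is a minimal illustration under assumed parameters (Gaussian kernel, bandwidth 0.1, 8 grey levels, horizontal pixel pairs), not the authors' implementation:

```python
import numpy as np

def gaussian_kde_1d(samples, x, bandwidth=0.1):
    """Non-parametric estimate of a colour channel's density at points x."""
    diffs = (x[:, None] - samples[None, :]) / bandwidth
    kernels = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kernels.sum(axis=1) / (len(samples) * bandwidth)

def cooccurrence_contrast(patch, levels=8):
    """Contrast feature of a horizontal co-occurrence matrix for a patch in [0, 1]."""
    q = (patch * (levels - 1)).astype(int)   # quantize to grey levels
    C = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        C[i, j] += 1                          # count horizontal neighbour pairs
    C /= C.sum()
    idx = np.arange(levels)
    return float((C * (idx[:, None] - idx[None, :]) ** 2).sum())
```

A region model in the spirit of the paper would combine such a colour density with a vector of co-occurrence features (contrast, energy, entropy, and so on) and use both to drive the competition between active regions.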
Abstract:
There is an intense debate on the convenience of moving from historical cost (HC) toward the fair value (FV) principle. The debate and academic research are usually concerned with financial instruments, but the IAS 41 requirement of fair valuation for biological assets brings the debate into the agricultural domain. This paper performs an empirical study with a sample of Spanish farms valuing biological assets at HC and a sample applying FV, finding no significant differences between the two valuation methods in assessing future cash flows. However, most tests reveal more predictive power for future earnings under fair valuation of biological assets, which is not explained by differences in the volatility of earnings and profitability. The study also evidences the existence of flawed HC accounting practices for biological assets in agriculture, which suggests scarce information content of this valuation method in the predominantly small business units of the agricultural sector in advanced Western countries.
Abstract:
The purpose of this thesis was to study the present state of Business Intelligence in the company unit, that is, how efficiently the unit uses the possibilities of modern information management systems. The aim was to determine how the operative information management of the unit's tender process could be improved with modern information technology applications, making tender processes faster and more efficient. At the beginning, it was essential to become acquainted with the written literature on Business Intelligence. Based on Business Intelligence theory, it was relatively straightforward, although challenging, to identify how the tender business could be improved by Business Intelligence methods. The empirical phase of this study was executed as qualitative research and included thematic and informal interviews at the company. The problems and challenges of the tender process were clarified as part of the empirical phase, and a set of challenges was identified when studying the information management of the company unit. Based on the theory and the interviews, a list of improvements was compiled that the company could implement in the future when developing its operative processes.
Abstract:
Fluent health information flow is critical for clinical decision-making. However, a considerable part of this information is free-form text, and the inability to utilize it creates risks to patient safety and cost-effective hospital administration. Methods for the automated processing of clinical text are emerging. The aim of this doctoral dissertation is to study machine learning and clinical text in order to support health information flow. First, by analyzing the content of authentic patient records, the aim is to specify clinical needs in order to guide the development of machine learning applications. The contributions are a model of the ideal information flow, a model of the problems and challenges in reality, and a road map for the technology development. Second, by developing applications for practical cases, the aim is to concretize ways to support health information flow. Altogether five machine learning applications for three practical cases are described. The first two applications are binary classification and regression for the practical case of topic labeling and relevance ranking. The third and fourth applications are supervised and unsupervised multi-class classification for the practical case of topic segmentation and labeling. These four applications are tested with Finnish intensive care patient records. The fifth application is multi-label classification for the practical task of diagnosis coding; it is tested with English radiology reports. The performance of all these applications is promising. Third, the aim is to study how the quality of machine learning applications can be reliably evaluated. The associations between performance evaluation measures and methods are addressed, and a new hold-out method is introduced. This method contributes not only to processing time but also to evaluation diversity and quality. The main conclusion is that developing machine learning applications for text requires interdisciplinary, international collaboration.
Practical cases are very different, and hence development must begin from genuine user needs and domain expertise. The technological expertise must cover linguistics, machine learning, and information systems. Finally, the methods must be evaluated both statistically and through authentic user feedback.
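The abstract does not specify the dissertation's new hold-out variant, so as a generic illustration of the standard repeated hold-out scheme it builds on, here is a sketch in which a majority-class baseline stands in for the real text classifier (the split fraction, repeat count, and baseline are assumptions):

```python
import random

def holdout_eval(examples, labels, train_fraction=0.8, repeats=5, seed=0):
    """Repeated hold-out: average test accuracy of a majority-class baseline.

    A generic illustration only; a real evaluation would train an actual
    classifier on the train split and score it on the held-out split.
    """
    rng = random.Random(seed)
    n_train = int(len(examples) * train_fraction)
    accuracies = []
    for _ in range(repeats):
        idx = list(range(len(examples)))
        rng.shuffle(idx)                      # fresh random split each repeat
        train, test = idx[:n_train], idx[n_train:]
        majority = max(set(labels[i] for i in train),
                       key=lambda c: sum(labels[i] == c for i in train))
        accuracies.append(sum(labels[i] == majority for i in test) / len(test))
    return sum(accuracies) / len(accuracies)
```

In practice one would also report measures beyond accuracy (precision, recall, F-measure), since the association between evaluation measures and methods is exactly what the dissertation examines.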
Abstract:
The objective of this master's thesis was to find means and measures by which an industrial manufacturing company could find cost-competitive solutions in a price-driven market. Initially, it was essential to identify individual spots of high customer value within the offering. The study addressed this in an innovative way by providing the desired information for the entire range of the offering. The research was carried out using the constructive research approach. Firstly, the project and solution marketing literature was reviewed in order to establish an overview of the processes and strategies involved. This information was then used in conjunction with the company's specific offering data to develop a construct. This construct can be used in various functions within the target company to streamline and optimize specifications into so-called "preferred offers". The study also presents channels and methods with which to exploit the construct in practice in the target company. The study aimed to bring concrete improvements in competitiveness and profitability. One result of this study was the creation of training material for internal use. This material is now used in several countries to inform staff of the cost-competitive aspects of the target company's offering.
Abstract:
Due to the lack of maximum rainfall equations for most locations in Mato Grosso do Sul State, hydraulic engineering projects have had to rely on information from the meteorological stations closest to the project site. Alternative methods, such as the 24-hour rainfall disaggregation method, can work with daily rainfall data, which offer greater station availability and longer observation records. Based on this approach, the objective of this study was to estimate maximum rainfall equations for Mato Grosso do Sul State by adjusting the 24-hour rainfall disaggregation method using data obtained from the rain gauge stations of Dourado and Campo Grande. For this purpose, data from 105 rainfall stations were used, which are available in the database of ANA (the National Water Resources Management Agency). Based on the results, we concluded that the intense rainfall equations obtained by pluviogram analysis showed coefficients of determination above 99%, and that the performance of the 24-hour rainfall disaggregation method was classified as excellent, based on the relative mean error and Willmott's (1982) index of agreement.
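The Willmott (1982) index of agreement used here as a performance criterion is d = 1 − Σ(Pᵢ − Oᵢ)² / Σ(|Pᵢ − Ō| + |Oᵢ − Ō|)², where P are the estimated values, O the observed values, and Ō the observed mean. A direct implementation (the sample data below are invented):

```python
def willmott_d(predicted, observed):
    """Willmott (1982) index of agreement: 1 is perfect, values near 0 are poor."""
    obar = sum(observed) / len(observed)
    num = sum((p - o) ** 2 for p, o in zip(predicted, observed))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2 for p, o in zip(predicted, observed))
    return 1.0 - num / den
```

For example, `willmott_d([1.0, 2.0, 2.0], [1.0, 2.0, 3.0])` evaluates to 0.8, while a perfect prediction gives exactly 1.0.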
Abstract:
This study is dedicated to search engine marketing (SEM). It aims to develop a business model for SEM firms and to provide explicit research on the trustworthy practices of virtual marketing companies. Optimization is a general term covering a variety of techniques and methods for promoting web pages. The research addresses optimization as a business activity and explains its role in online marketing. Additionally, it highlights the use of unethical techniques by marketers, which has created a relatively negative attitude toward them in the Internet environment. The literature review combines technical and economic scientific findings in one place in order to highlight the technological and business attributes incorporated in SEM activities. Empirical data regarding search marketers was collected via e-mail questionnaires. Four representatives of SEM companies were engaged in this study to accomplish the business model design. Additionally, a fifth respondent, a representative of a search engine portal, provided insight into the relations between search engines and marketers. The information obtained from the respondents was processed qualitatively. The movement of commercial organizations to the online market increases the demand for promotional programs. SEM is the largest part of online marketing, and it is a prerogative of search engine portals. However, skilled users, or marketers, are able to implement long-term marketing programs by utilizing web page optimization techniques, keyword consultancy or content optimization to increase a web site's visibility to search engines and, therefore, users' attention to the customer's pages. SEM firms are small knowledge-intensive businesses. On the basis of the data analysis, the business model was constructed. The SEM model includes generalized constructs, although they represent a wider range of operational aspects.
The building blocks of the model include the fundamental parts of SEM commercial activity: the value creation, customer, infrastructure and financial segments. Approaches were also provided for evaluating a company's differentiation and competitive advantages. It is assumed that search marketers should make further attempts to differentiate their own business from the large number of similar service-providing companies. The findings indicate that SEM companies are interested in increasing their trustworthiness and building their reputation. The future of search marketing depends directly on the development of search engines.
Abstract:
Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high performance designs. Network-on-Chip (NoC) has emerged as a viable solution to the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many of the on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes make it possible to design a high performance system in a limited chip area. The major advantages of 3D NoCs are considerable reductions in average latency and power consumption. Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support, and we address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption, and we therefore propose three different approaches for alleviating congestion in the network. The first approach is based on measuring the congestion information in different regions of the network, distributing this information over the network, and utilizing it when making routing decisions. The second approach employs a learning method to dynamically find less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique that makes better routing decisions when traffic information for different routes is available. Faults degrade performance significantly, since packets must take longer paths to be routed around them, which in turn increases congestion around the faulty regions.
We propose four methods to tolerate faults at the link and switch level, using only shortest paths as long as such paths exist. The unique characteristic of these methods is that they tolerate faults while maintaining the performance of NoCs. To the best of our knowledge, these algorithms are the first to bypass faults before reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication cause a significant performance loss for unicast traffic, because the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be routed efficiently within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, in order to reduce the overall path length of multicast packets, we present several partitioning methods along with analytical models for latency measurement. This approach is discussed in the context of 3D mesh networks.
Abstract:
In this work, we present the solution of a class of linear inverse heat conduction problems for the estimation of unknown heat source terms, with no prior information on the functional forms of the timewise and spatial dependence of the source strength, using the conjugate gradient method with an adjoint problem. After describing the mathematical formulation of a general direct problem and the procedure for the solution of the inverse problem, we show applications to three transient heat transfer problems: a one-dimensional cylindrical problem; a two-dimensional cylindrical problem; and a one-dimensional problem with two plates.
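The iterative core of the conjugate gradient method can be illustrated on a plain symmetric positive-definite system Ax = b; in the paper the gradient of the objective functional is supplied by the adjoint problem, a step this sketch omits:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive-definite A by conjugate gradients."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x          # residual
    p = r.copy()           # initial search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # converged
            break
        p = r + (rs_new / rs) * p  # conjugate new direction to previous ones
        rs = rs_new
    return x
```

In the inverse-problem setting, the "matrix-vector product" is replaced by solving the direct problem, and the residual by the mismatch between computed and measured temperatures, but the update structure is the same.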
Abstract:
Long-term independent budget travel to faraway countries has become increasingly common over the last few decades, and backpacking has now entered the tourism mainstream. Nowadays, backpackers are a very important segment of the global travel market. Backpacking is a type of tourism that involves a lot of information search activity. The Internet has become a major source of information as well as a platform for tourism business transactions. It allows travelers to gain information very effortlessly and to learn about tourist destinations and products directly from other travelers in the form of electronic word-of-mouth (eWOM). Social media has penetrated and changed the backpacker market, as modern travelers can now stay connected to people at home, read online recommendations, and organize and book their trips very independently. In order to create a wider understanding of modern-day backpackers and their information search and share behavior in the Web 2.0 era, this thesis examined contemporary backpackers and their use of social media as an information and communication platform. To achieve this goal, three sub-objectives were identified: (1) to describe contemporary backpacker tourism; (2) to examine contemporary backpackers' travel information search and share behavior; and (3) to explore the impacts of new information and communications technologies and Web 2.0 on backpacker tourism. The empirical data was gathered with an online survey, so the method of analysis was mainly quantitative, while a qualitative method was used for a brief analysis of the open questions. The research included both descriptive and analytical approaches, as the goal was to describe modern-day backpackers and to examine possible interdependencies between information search and share behavior and background variables. The interdependencies were tested for statistical significance with the help of five research hypotheses.
The results suggested that backpackers no longer fit the original backpacker definitions described some decades ago. Now, they are mainly short-term travelers whose trips more closely resemble those of mainstream tourists. They use communication technologies very actively, particularly social media. Traditional information sources, mainly guidebooks and recommendations from friends, are of great importance to them, but eWOM sources are also widely used in travel decision making. The use of each source varies according to the stage of the trip. All in all, Web 2.0 and new ICTs have transformed the backpacker tourism industry in many ways. Although the experience has become less authentic in some travelers' eyes, the backpacker culture is still recognizable.
Abstract:
Concentration-response curves from isometric tension studies on isolated blood vessels are traditionally obtained point by point. Although parameters such as Imax, EC50 and pA2 may be readily calculated, this method provides no information on the temporal profile of the responses or the actual nature of the reaction curves. Computerized data acquisition systems can be used to obtain averaged data that represent a new source of otherwise inaccessible information, since early and late responses may be observed separately in detail.
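As a sketch of how EC50 relates to such curves, the Hill equation below generates a concentration-response curve and recovers EC50 as the half-maximal concentration; the interpolation-based estimator is a generic illustration, not the study's analysis method:

```python
import numpy as np

def hill(conc, emax, ec50, n=1.0):
    """Hill equation: response as a function of agonist concentration."""
    return emax * conc**n / (ec50**n + conc**n)

def estimate_ec50(conc, response):
    """Concentration producing half-maximal response, by linear interpolation.

    Assumes response increases monotonically with conc (as for a simple agonist).
    """
    half_max = response.max() / 2.0
    return float(np.interp(half_max, response, conc))
```

With computerized acquisition, the same fitted curve is available at every time point, which is what makes the temporal profile of the response accessible.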
Abstract:
In the present study, histopathological analysis of rat mesentery was used to quantify the effect of two anti-inflammatory agents, dexamethasone (Dex) and pertussis toxin (Ptx), on leukocyte migration. The intravenous injection of Dex (1 mg/kg) and Ptx (1,200 ng) 1 h prior to the intraperitoneal injection of the inflammatory stimuli lipopolysaccharide (LPS) or formyl-methionyl-leucyl-phenylalanine (fMLP) significantly reduced the neutrophil diapedesis (LPS: Ptx = 0.86 ± 0.19 and Dex = 0.35 ± 0.13 vs saline (S) = 2.85 ± 0.59; fMLP: Ptx = 0.43 ± 0.09 and Dex = 0.01 ± 0.01 vs S = 1.08 ± 0.15 neutrophil diapedesis/field) and infiltration (LPS: Ptx = 6.29 ± 1.4 and Dex = 3.06 ± 0.76 vs S = 15.94 ± 3.97; fMLP: Ptx = 3.85 ± 0.56 and Dex = 0.40 ± 0.16 vs S = 7.15 ± 1.17 neutrophils/field) induced by the two agonists in the rat mesentery. The inhibitory effect of Dex and Ptx was clearly visible in the fields nearest the venule (up to 200 µm), demonstrating that these anti-inflammatory agents act preferentially on the transmigration of neutrophils from the vascular lumen into the interstitial space, but not on cell movement in response to a haptotactic gradient. The mesentery of rats pretreated with Dex showed a decreased number of neutrophils within the venules (LPS: Dex = 1.50 ± 0.38 vs S = 4.20 ± 1.01; fMLP: Dex = 0.25 ± 0.11 vs S = 2.20 ± 0.34 neutrophils in the lumen/field), suggesting that this inhibitor may act at a step that precedes neutrophil arrival in the inflamed tissue. In contrast to what was observed with Dex treatment, the number of neutrophils found in the mesenteric venules was significantly elevated in animals pretreated with Ptx (LPS: Ptx = 9.85 ± 2.25 vs S = 4.20 ± 1.01; fMLP: Ptx = 4.66 ± 1.24 vs S = 2.20 ± 0.34 neutrophils in the lumen/field). This discrepancy shows that Ptx and Dex act via different mechanisms and suggests that Ptx prevents locomotion of neutrophils from the vascular lumen to the interstitial space.
In conclusion, the method described here is useful for quantifying the inflammatory and anti-inflammatory effects of different substances. The advantage of this histopathological approach is that it provides additional information about the steps involved in leukocyte migration.
Abstract:
Speed, uncertainty and complexity in the business world are increasing all the time. When knowledge and skills quickly become irrelevant, new challenges are set for information technology (IT) education. Meta-learning skills (learning how to learn rapidly) and innovation skills have become more essential than single technologies or other specific issues. The drastic changes in the information and communications technology (ICT) sector have created a need to reconsider how IT Bachelor education in Universities of Applied Sciences should be organized and employed to cope with the change. The objective of the study was to evaluate how a new approach to IT Bachelor education, the ICT entrepreneurship study path (ICT-ESP), fits IT Bachelor education in a Finnish University of Applied Sciences. This kind of educational arrangement has not been employed elsewhere in the context of IT Bachelor education. The study presents the results of a four-year period during which IT Bachelor education was renewed in a Finnish University of Applied Sciences. The learning environment was organized into an ICT-ESP based on Nonaka's knowledge theory and Kolb's experiential learning. The IT students who studied in the ICT-ESP established a cooperative and learned ICT by running their cooperative at the University of Applied Sciences. The students (called team entrepreneurs) studied by reading theory in books and other sources of explicit information, doing projects for their customers, and reflecting in training sessions on what was learnt by doing and by studying the literature. Action research was used as the research strategy in this study. Empirical data was collected via theme-based interviews, direct observation, and participative observation. The grounded theory method was utilized in the data analysis, and theoretical sampling was used to guide the data collection. The context of the University of Applied Sciences provided a good basis for fostering team entrepreneurship.
However, the results showed that the ICT-ESP did not fit into IT Bachelor education well enough. The ICT-ESP was cognitively too demanding for the team entrepreneurs because they had two different sets of rules to follow in their studies. The conventional courses consumed a lot of energy that should have been spent on professional development in the ICT-ESP, and the amount of competence needed in the ICT-ESP for professional development was greater than for any other way of studying. The team entrepreneurs needed to develop skills in ICT, leadership and self-leadership, team development, and entrepreneurship. The entrepreneurship skills included marketing and sales, brand development, productization, and business administration. Considering the three years the team entrepreneurs spent in the ICT-ESP, the challenges were remarkable. Changes to the organization of IT Bachelor education are also suggested in the study. First, it should be acknowledged that the ICT-ESP produces IT Bachelors with a different set of competencies compared to the conventional way of educating IT Bachelors. Second, the number of courses on general topics in mathematics, physics, and languages for team entrepreneurs studying in the ICT-ESP should be reconsidered, and the conventional course-based teaching of these topics should be reorganized to support the team coaching process of the team entrepreneurs with their practice-oriented projects. Third, upcoming team entrepreneurs should be equipped with relevant information about the ICT-ESP and what studying as a team entrepreneur would require in practice. Finally, upcoming team entrepreneurs should be carefully selected before they start in the ICT-ESP, in order to screen out solo players and those with too romantic a view of being a team entrepreneur.
The results gained in the study answered the original research questions, and the objectives of the study were met. Even though the IT degree programme was terminated during the research process, the amount of qualitative data gathered made it possible to justify the interpretations made.
Abstract:
Interest in working capital management increased among practitioners and researchers when the financial crisis of 2008 caused a deterioration of the general financial situation. The importance of managing working capital effectively increased dramatically during the financial crisis. On one hand, companies highlighted the importance of working capital management as part of short-term financial management to overcome funding difficulties. On the other hand, in academia, the need has been highlighted to analyze working capital management from a wider perspective, namely the value chain perspective; previously, academic articles mostly discussed working capital management from a company-centered perspective. The objective of this thesis was to place working capital management in a wider and more academic perspective, presenting case studies of the value chains of industries as instrumental in the theoretical contributions, with practical contributions complementary to the theoretical contributions and conclusions. The principal assumption of this thesis is that the self-financing of value chains can be established through effective working capital management. Thus, the thesis introduces the financial value chain analysis method, which is employed in the empirical studies. The effectiveness of the working capital management of the value chains is studied through the cycle time of working capital. The financial value chain analysis method employed in this study is designed for considering value chain level phenomena; it provides a holistic picture of the value chain through financial figures and extends value chain analysis to the industry level. Working capital management is studied through the cash conversion cycle, which measures the length of time (in days) a company has funds tied up in working capital, starting from the payment of purchases to the supplier and ending when remittance of sales is received from the customers.
The working capital management practices of the automotive, pulp and paper, and information and communication technology (ICT) industries were studied in this research project. Additionally, the Finnish pharmaceutical industry was studied to obtain a deeper understanding of the working capital management of the value chain. The results indicate that the cycle time of working capital is constant in the value chain context over time. The cash conversion cycles of the automotive, pulp and paper, and ICT industries are on average 70, 60 and 40 days, respectively; the difference is mainly a consequence of different inventory cycle times. The financial crisis of 2008 affected the working capital management of the industries similarly: both the cycle time of accounts receivable and that of accounts payable increased between 2008 and 2009. The results suggest that the companies of the automotive, pulp and paper, and ICT value chains were not able to self-finance, nor do the results indicate an improvement in the value chains' position with regard to working capital management. The findings suggest that companies operating in the Finnish pharmaceutical industry are interested in developing their own working capital management, but collaboration with value chain partners is not considered interesting. Competition no longer occurs between individual companies but between value chains; therefore, the financial value chain analysis method introduced in this thesis has the potential to support value chains in improving their competitiveness.
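The cash conversion cycle described above combines three component cycle times: CCC = DIO + DSO − DPO (days inventory outstanding plus days sales outstanding minus days payables outstanding). A minimal sketch, assuming a 365-day year; the example figures in the comments are invented:

```python
def days_outstanding(balance, annual_flow, days_per_year=365):
    """Generic cycle time in days, e.g. inventory / cost of goods sold * 365."""
    return balance / annual_flow * days_per_year

def cash_conversion_cycle(dio, dso, dpo):
    """Days from paying suppliers until cash is received from customers."""
    return dio + dso - dpo

# Example: DIO = 60, DSO = 45, DPO = 35 gives a CCC of 70 days,
# the order of magnitude reported above for the automotive value chain.
```

Applied at the value chain level, each company's cycle times are computed from its financial statements and compared along the chain, which is how the financial value chain analysis yields its industry-level picture.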