933 results for Software Package Data Exchange (SPDX)
Abstract:
This thesis examines the suitability of Value-at-Risk (VaR) in foreign exchange rate risk management from the perspective of a European investor. Four VaR models are evaluated to assess whether VaR is a valuable tool for managing foreign exchange rate risk: the historical method, the historical bootstrap method, the variance-covariance method and Monte Carlo simulation. The data are divided into emerging and developed market currency portfolios to allow a richer analysis. The foreign exchange rate data cover the period from 31 January 2000 to 30 April 2014. The results show that none of the evaluated VaR models should be relied upon as the sole tool in foreign exchange rate risk management. The variance-covariance method and Monte Carlo simulation perform poorest in both currency portfolios. Both historical methods perform better, but should still be regarded as complementary tools alongside other, more sophisticated analysis methods. A comparative study of VaR estimates and forward prices is also included in the thesis. It reveals that, despite the expensive hedging cost of emerging market currencies, the risk captured by VaR is more costly still, and thus FX forward hedging is recommended.
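For readers unfamiliar with the simplest of the four models, the sketch below illustrates one-day historical-simulation and historical-bootstrap VaR for an FX position. It is only a minimal sketch: the return series, position size and confidence level are invented for the example and are not data or code from the thesis.

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day historical-simulation VaR: the loss quantile of past returns,
    reported as a positive loss figure."""
    return -np.percentile(returns, 100 * (1 - confidence))

def bootstrap_var(returns, confidence=0.99, n_draws=10_000, seed=0):
    """Historical bootstrap VaR: resample past returns with replacement
    before taking the loss quantile."""
    rng = np.random.default_rng(seed)
    sample = rng.choice(returns, size=n_draws, replace=True)
    return -np.percentile(sample, 100 * (1 - confidence))

# Hypothetical daily log-returns of a EUR-based FX exposure (illustrative only).
rng = np.random.default_rng(42)
fx_returns = rng.normal(0.0, 0.007, size=2500)

position = 1_000_000  # EUR exposure, invented for the example
print("99% 1-day historical VaR:", position * historical_var(fx_returns))
print("99% 1-day bootstrap VaR: ", position * bootstrap_var(fx_returns))
```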
Abstract:
The aim of the present study was to determine the ventilation/perfusion ratio that contributes to hypoxemia in pulmonary embolism by analyzing blood gases and volumetric capnography in a model of experimental acute pulmonary embolism. Pulmonary embolization with autologous blood clots was induced in seven pigs weighing 24.00 ± 0.6 kg, anesthetized and mechanically ventilated. Significant changes occurred from baseline to 20 min after embolization, such as a reduction in oxygen partial pressures in arterial blood (from 87.71 ± 8.64 to 39.14 ± 6.77 mmHg) and alveolar air (from 92.97 ± 2.14 to 63.91 ± 8.27 mmHg). The effective alveolar ventilation exhibited a significant reduction (from 199.62 ± 42.01 to 84.34 ± 44.13), consistent with the fall in alveolar gas volume that effectively participated in gas exchange. The ratio between the alveolar ventilation that effectively participated in gas exchange and cardiac output (VAeff/Q ratio) also presented a significant reduction after embolization (from 0.96 ± 0.34 to 0.33 ± 0.17 fraction). The carbon dioxide partial pressure increased significantly in arterial blood (from 37.51 ± 1.71 to 60.76 ± 6.62 mmHg), but decreased significantly in exhaled air at the end of the respiratory cycle (from 35.57 ± 1.22 to 23.15 ± 8.24 mmHg). Exhaled air at the end of the respiratory cycle returned to baseline values 40 min after embolism. The arterial to alveolar carbon dioxide gradient increased significantly (from 1.94 ± 1.36 to 37.61 ± 12.79 mmHg), as did the calculated alveolar (from 56.38 ± 22.47 to 178.09 ± 37.46 mL) and physiological (from 0.37 ± 0.05 to 0.75 ± 0.10 fraction) dead spaces. Based on our data, we conclude that the severe arterial hypoxemia observed in this experimental model may be attributed to the reduction of the VAeff/Q ratio. We were also able to demonstrate that VAeff/Q progressively improves after embolization, a fact attributed to the alveolar ventilation redistribution induced by hypocapnic bronchoconstriction.
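For reference, the physiological dead space fraction reported above is conventionally derived from arterial and mixed expired CO2 tensions; the standard Bohr-Enghoff relation and the definition of the VAeff/Q index are sketched below. The abstract does not state the exact formulation used in the study, so this is only the usual textbook form.

```latex
% Standard Bohr-Enghoff physiological dead space fraction and the
% effective-ventilation/perfusion index (illustrative; the study's exact
% formulations are not given in the abstract).
\[
  \frac{V_D}{V_T} \;=\; \frac{P_{a\mathrm{CO_2}} - P_{\bar{E}\mathrm{CO_2}}}{P_{a\mathrm{CO_2}}},
  \qquad
  \frac{\dot{V}_{A\,\mathrm{eff}}}{\dot{Q}}
  \;=\; \frac{\text{effective alveolar ventilation}}{\text{cardiac output}} .
\]
```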
Abstract:
Experimental models of sepsis-induced pulmonary alterations are important for the study of pathogenesis and for potential intervention therapies. The objective of the present study was to characterize lung dysfunction (low PaO2 and high PaCO2, and increased cellular infiltration, protein extravasation, and malondialdehyde (MDA) production assessed in bronchoalveolar lavage) in a sepsis model consisting of intraperitoneal (ip) injection of Escherichia coli and the protective effects of pentoxifylline (PTX). Male Wistar rats (weighing between 270 and 350 g) were injected ip with 10⁷ or 10⁹ CFU/100 g body weight or saline and samples were collected 2, 6, 12, and 24 h later (N = 5 each group). PaO2, PaCO2 and pH were measured in blood, and cellular influx, protein extravasation and MDA concentration were measured in bronchoalveolar lavage. In a second set of experiments either PTX or saline was administered 1 h prior to E. coli ip injection (N = 5 each group) and the animals were observed for 6 h. Injection of 10⁷ or 10⁹ CFU/100 g body weight of E. coli induced acidosis, hypoxemia, and hypercapnia. An increased (P < 0.05) cell influx was observed in bronchoalveolar lavage, with a predominance of neutrophils. Total protein and MDA concentrations were also higher (P < 0.05) in the septic groups compared to control. A higher tumor necrosis factor-alpha (P < 0.05) concentration was also found in these animals. Changes in all parameters were more pronounced with the higher bacterial inoculum. PTX administered prior to sepsis reduced (P < 0.05) most functional alterations. These data show that an E. coli ip inoculum is a good model for the induction of lung dysfunction in sepsis, and suitable for studies of therapeutic interventions.
Abstract:
The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and works as a pipeline with five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or in a local version of the web service. All scripts in PMmA were developed in the Perl programming language and the statistical analysis functions were implemented in the R statistical language. Consequently, our package is platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing superior performance compared to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 expressed sequence tags from sugarcane exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had variable expression and the other fourteen were down-regulated in the treatments. These new findings were certainly a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA.
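The closing remark about intensity-dependent variability can be illustrated with a small sketch in which genes are binned by average signal intensity and differential expression is judged against the variability within each bin rather than a single global cutoff. This is an illustrative reconstruction, not PMmA's actual code; the data, bin count and threshold are invented.

```python
import numpy as np

def intensity_dependent_calls(mean_intensity, log_ratio, n_bins=10, z_cut=2.5):
    """Flag genes whose log-ratio is extreme relative to other genes of similar
    average intensity (illustrative only, not the PMmA implementation)."""
    edges = np.quantile(mean_intensity, np.linspace(0, 1, n_bins + 1))
    idx = np.clip(np.digitize(mean_intensity, edges[1:-1]), 0, n_bins - 1)
    flagged = np.zeros(log_ratio.shape, dtype=bool)
    for b in range(n_bins):
        in_bin = idx == b
        mu, sd = log_ratio[in_bin].mean(), log_ratio[in_bin].std(ddof=1)
        flagged[in_bin] = np.abs(log_ratio[in_bin] - mu) > z_cut * sd
    return flagged

# Hypothetical macroarray data: 1536 spots whose noise shrinks as intensity grows.
rng = np.random.default_rng(1)
intensity = rng.uniform(5, 15, 1536)
ratios = rng.normal(0.0, 0.2 + 1.5 / intensity, 1536)
print("candidate differentially expressed genes:",
      intensity_dependent_calls(intensity, ratios).sum())
```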
Abstract:
The importance of package design as a marketing tool is growing as competition in the retail environment increases. However, there is a lack of studies on how each element of package design affects consumer decisions in different countries. The objective of this thesis is to study the role of package design for Japanese consumers. The research was conducted through an experiment with a sample of 37 Japanese female participants. They were divided into two groups and given different tasks: one group had to choose a chocolate for themselves, and the other for a group of friends. The participants were presented with 15 different Finnish chocolate boxes to choose from. The qualitative data was gathered through observation and semi-structured interviews. In addition, data from questionnaires was quantified and all the data was triangulated. The empirical results suggest that visual elements strongly affect the decision making of Japanese consumers. The image was the most important element, acting as both a visual and an informational element in the experiment. Informational elements, on the other hand, had little effect, especially when the content was written in a foreign language. However, informational elements did affect participants who were choosing chocolates for a group of friends. A unique finding was the importance of kawaii (cuteness) to Japanese consumers.
Abstract:
In vivo proton magnetic resonance spectroscopy (¹H-MRS) is a technique capable of assessing biochemical content and pathways in normal and pathological tissue. In the brain, ¹H-MRS complements the information given by magnetic resonance images. The main goal of the present study was to assess the accuracy of ¹H-MRS for the classification of brain tumors in a pilot study comparing results obtained by manual and semi-automatic quantification of metabolites. In vivo single-voxel ¹H-MRS was performed in 24 control subjects and 26 patients with brain neoplasms that included meningiomas, high-grade neuroglial tumors and pilocytic astrocytomas. Seven metabolite groups (lactate, lipids, N-acetyl-aspartate, glutamate and glutamine group, total creatine, total choline, myo-inositol) were evaluated in all spectra by two methods: a manual one consisting of integration of manually defined peak areas, and the advanced method for accurate, robust and efficient spectral fitting (AMARES), a semi-automatic quantification method implemented in the jMRUI software. Statistical methods included discriminant analysis and the leave-one-out cross-validation method. Both manual and semi-automatic analyses detected differences in metabolite content between tumor groups and controls (P < 0.005). The classification accuracy obtained with the manual method was 75% for high-grade neuroglial tumors, 55% for meningiomas and 56% for pilocytic astrocytomas, while for the semi-automatic method it was 78, 70, and 98%, respectively. Both methods classified all control subjects correctly. The study demonstrated that ¹H-MRS accurately differentiated normal from tumoral brain tissue and confirmed the superiority of the semi-automatic quantification method.
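The classification step described above, discriminant analysis validated with the leave-one-out method, can be sketched as follows. The metabolite feature matrix and class labels are placeholders rather than the study's data, and the study may have used a different discriminant variant; this is only a minimal sketch of the validation scheme.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Placeholder data: rows are spectra, columns are the seven metabolite groups
# (lactate, lipids, NAA, Glx, total creatine, total choline, myo-inositol).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 7))            # 24 controls + 26 tumors, as in the study
y = np.array([0] * 24 + [1] * 26)       # 0 = control, 1 = tumor (illustrative labels)

# Leave-one-out cross-validation: each spectrum is classified by a model
# trained on all the others; the mean score is the classification accuracy.
clf = LinearDiscriminantAnalysis()
accuracy = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out classification accuracy: {accuracy:.2f}")
```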
Abstract:
Software quality has become an important research subject, not only in the Information and Communication Technology spheres, but also in other industries at large where software is applied. Software quality is not a happenstance; it is defined, planned and built into the software product throughout the Software Development Life Cycle. The research objective of this study is to investigate the roles of human and organizational factors that influence software quality construction. The study employs Straussian grounded theory. The empirical data were collected from 13 software companies and include 40 interviews. The results of the study suggest that tools, infrastructure and other resources have a positive impact on software quality, but that the human factors involved in the software development processes determine the quality of the products developed. Development methods, on the other hand, were found to have little effect on software quality. The research suggests that software quality construction is an information-intensive process in which organizational structures, mode of operation, and information flow within the company variably affect software quality. The results also suggest that software development managers influence the productivity of developers and the quality of the software products. Several challenges of software testing that affect software quality are also brought to light. The findings of this research are expected to benefit the academic community and software practitioners by providing insight into the issues pertaining to software quality construction undertakings.
Abstract:
This study discusses the formation phase of Chinese-Finnish joint ventures in China. The purpose of this thesis is to identify best practices for Finnish software companies in forming a joint venture with a local Chinese company in China. The main research question, what are the best practices for forming Sino-Finnish joint ventures in China for Finnish software firms, is therefore examined through four themes within the joint venture formation phase: the motives, the partner selection, the choice of joint venture type and the joint venture negotiations. The theoretical background of the study consists of literature on the establishment process of Sino-Western joint ventures in China. The empirical research conducted for this study is based on expert interviews. The empirical data were gathered via nine semi-structured interviews with both Chinese and Finnish experts in the software and technology industry who have experience or knowledge of establishing Sino-Finnish joint ventures in China. Thematic analysis was used to categorize and interpret the interview data, and a thematic network was built to act as the basis of the analysis. According to the main findings, the main motives for Finnish software companies to establish a joint venture in China are a lack of skills or experience, limited resources to enter the market on their own, and China's large market. The main motives for Chinese companies are to gain new technology or managerial skills and to expand internationally. Intellectual property rights (IPR) protection has recently improved considerably in China, but the Finnish companies' knowledge of IPR is inadequate. Finnish software companies should conduct market and industry research in order to understand their position in the market and to find a suitable location and potential joint venture partners. It is essential to define partner selection criteria and partner attributes. In addition, it is important to build the joint venture around complementary motives and a win-win situation between the joint venture partners. Finnish companies should be prepared for the joint venture negotiations to be challenging and lengthy. These challenges can be overcome by gaining an understanding of the Chinese culture and business environment. The findings of this study enhance understanding of the joint venture formation phase in China and provide guidelines for Finnish software companies establishing a joint venture in China. In addition, this study brings new insights to the Sino-Western joint venture literature through its software industry context. Future research is, however, necessary in order to understand the advantages and disadvantages of a joint venture as an entry mode into China for Finnish software companies.
Abstract:
This thesis presents an overview of the Open Data research area and the quantity of available evidence, and establishes the research evidence base through a Systematic Mapping Study (SMS). A total of 621 publications published between 2005 and 2014 were identified, of which 243 were selected in the review process. The thesis highlights the implications of the proliferation of Open Data principles in the emerging era of accessibility, reusability and sustainability of data transparency. The findings of the mapping study are described with quantitative and qualitative measures based on organization affiliation, country, year of publication, research method, star rating and the units of analysis identified. Furthermore, the units of analysis were categorized into development lifecycle, linked open data, type of data, technical platforms, organizations, ontology and semantics, adoption and awareness, intermediaries, security and privacy, and supply of data, which are important components for providing quality open data applications and services. The results of the mapping study help organizations (such as academia, government and industry), researchers and software developers to understand the existing trends in open data, the latest research developments and the demand for future research. In addition, the proposed conceptual framework of Open Data research can be adopted and expanded to strengthen and improve current open data applications.
Abstract:
Many-core systems provide great potential in application performance with their massively parallel structure. Such systems are currently being integrated into most parts of daily life, from high-end server farms to desktop systems, laptops and mobile devices. Yet these systems face increasing challenges: high temperature causing physical damage, high electricity bills for both servers and individual users, unpleasant noise levels due to active cooling and unrealistic battery drainage in mobile devices; factors caused directly by poor energy efficiency. Power management has traditionally been an area of research providing hardware solutions or runtime power management in the operating system in the form of frequency governors. Energy awareness in application software is currently non-existent. This means that applications are not involved in the power management decisions, nor does any interface exist between the applications and the runtime system to provide such facilities. Power management in the operating system is therefore performed purely based on indirect implications of software execution, usually referred to as the workload. This often results in over-allocation of resources and hence wasted power. This thesis discusses power management strategies in many-core systems in the form of increasing application software awareness of energy efficiency. The presented approach allows meta-data descriptions in the applications and is manifested in two design recommendations: 1) energy-aware mapping and 2) energy-aware execution, which allow the applications to directly influence the power management decisions. The recommendations eliminate over-allocation of resources and increase the energy efficiency of the computing system. Both recommendations are fully supported by a provided interface in combination with a novel power management runtime system called Bricktop. The work presented in this thesis allows both new and legacy software to execute with the most energy-efficient mapping on a many-core CPU and at the most energy-efficient performance level. A set of case study examples demonstrates real-world energy savings in a wide range of applications without performance degradation.
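The idea of applications attaching energy-related meta-data for the runtime to act on can be illustrated with a small hypothetical sketch. The class, function and parameter names and the toy selection policy below are invented for illustration; they are not the actual interface of the Bricktop runtime described in the thesis.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EnergyHint:
    """Hypothetical meta-data an application could attach to a task so a
    power management runtime can pick a core mapping and frequency level."""
    parallelism: int               # how many cores the task can usefully occupy
    deadline_ms: Optional[float]   # soft deadline; None means best-effort
    memory_bound: bool             # memory-bound tasks gain little from high clocks

def choose_operating_point(hint: EnergyHint, max_cores: int,
                           freq_levels_mhz: list) -> tuple:
    """Toy policy: never allocate more cores than the task can use, and run
    memory-bound or best-effort work at the lowest frequency level."""
    cores = min(hint.parallelism, max_cores)
    if hint.memory_bound or hint.deadline_ms is None:
        return cores, min(freq_levels_mhz)
    return cores, max(freq_levels_mhz)

# A best-effort, memory-bound task gets 4 cores at the lowest clock.
print(choose_operating_point(EnergyHint(4, None, True),
                             max_cores=16, freq_levels_mhz=[800, 1600, 2400]))
```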
Abstract:
Cross listing by Finnish companies in the United States is an exceptional phenomenon. This study examines the cross listing decision, the cross listing choice and the cross listing process together with the associated challenges and critical factors. The aim is to create an in-depth understanding of the cross listing process and the required financial information, and on that basis to establish the process phases together with the challenges and critical factors that ought to be considered before establishing the process, and re-evaluated and further considered at points in time during the process. The empirical part of this study is conducted as a qualitative study. The research data were collected through two approaches, an interview approach and a textual data approach. The interviews were conducted with Finnish practitioners in the field of accounting and finance. The textual data came from publicly available publications on this phenomenon by two of the BIG5 accounting companies worldwide. The results of this study demonstrate that the benefits of cross listing in the U.S. are better growth opportunities, a reduction in the cost of capital and the production of higher quality financial information. In the decision-making process companies should assess whether the benefits exceed the increased costs, the pressure for performance, the uncertainty of market recognition and the requirements placed on management. The exchange listing is seen as the most favourable cross listing choice for Finnish companies. The establishment of the processes for producing reliable, transparent and timely financial information was seen as both highly critical and very challenging. The critical success factors relating to the cross listing phases are assessment and planning as well as the right mix of experience and expertise. Timing plays an important role in the process. The results mainly corroborate the literature concerning the cross listing decision and choice. This study contributes to the literature on the cross listing process by offering a useful model for the phases of the cross listing process.
Abstract:
An exchange traded fund (ETF) is a financial instrument that tracks some predetermined index. Since their initial establishment in 1993, ETFs have grown in importance in the field of passive investing. The main reason for the growth of the ETF industry is that ETFs combine the benefits of stock investing and mutual fund investing. Although ETFs resemble mutual funds in many ways, there are also many differences, and ETFs not only differ from mutual funds but also differ among each other. ETFs can be divided into two categories, market capitalisation ETFs and fundamental (or strategic) ETFs, and further into subcategories depending on their fundament basis. ETFs are a useful tool for diversification, especially for a long-term investor. Although the economic importance of ETFs has risen drastically during the past 25 years, the differences and risk-return characteristics of fundamental ETFs have so far remained a rather unstudied area. In fact, no previous research comparing market capitalisation and fundamental ETFs was found during the research process. For its part, this thesis seeks to fill this research gap. The studied data consist of 50 market capitalisation ETFs and 50 fundamental ETFs. The fundaments on which the indices tracked by the fundamental ETFs are based were neither limited nor segregated into subsections; the two types of ETFs were studied at an aggregate level as two research groups. The dataset ranges from June 2006 to December 2014, with 103 monthly observations, and was gathered using the Bloomberg Terminal. The analysis was conducted as an econometric performance analysis. In addition to other econometric measures, the methods used in the performance analysis included modified Value-at-Risk, the modified Sharpe ratio and the Treynor ratio. The results supported the hypothesis that passive market capitalisation ETFs outperform active fundamental ETFs in terms of risk-adjusted returns, though the difference is rather small. Nevertheless, when taking into account the higher overall trading costs of the fundamental ETFs, the underperformance gap widens. According to the research results, market capitalisation ETFs are a recommendable diversification instrument for a long-term investor. In addition to better risk-adjusted returns, passive ETFs are more transparent and the bases of their underlying indices are simpler than those of fundamental ETFs. ETFs are still a young financial innovation and hence data are scarcely available. In future research, it would be valuable to examine the differences in risk-adjusted returns also between the subsections of fundamental ETFs.
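The performance measures named above can be sketched with the common Cornish-Fisher form of modified VaR. The return series, risk-free rate and beta below are invented, and the exact formulations used in the thesis may differ; this is a sketch of the standard definitions, not the thesis's code.

```python
import numpy as np
from scipy.stats import skew, kurtosis, norm

def modified_var(returns, confidence=0.95):
    """Cornish-Fisher modified VaR: adjusts the Gaussian quantile for the
    skewness and excess kurtosis of the return distribution."""
    z = norm.ppf(1 - confidence)
    s, k = skew(returns), kurtosis(returns)   # kurtosis() returns excess kurtosis
    z_cf = (z + (z**2 - 1) * s / 6
              + (z**3 - 3 * z) * k / 24
              - (2 * z**3 - 5 * z) * s**2 / 36)
    return -(returns.mean() + z_cf * returns.std(ddof=1))

def modified_sharpe(returns, rf=0.0, confidence=0.95):
    """Excess return per unit of modified VaR."""
    return (returns.mean() - rf) / modified_var(returns, confidence)

def treynor(returns, beta, rf=0.0):
    """Excess return per unit of systematic risk (beta)."""
    return (returns.mean() - rf) / beta

# Hypothetical monthly ETF returns (103 observations, matching the sample length).
rng = np.random.default_rng(7)
r = rng.normal(0.006, 0.04, 103)
print("modified Sharpe:", round(modified_sharpe(r, rf=0.001), 3))
print("Treynor ratio:  ", round(treynor(r, beta=0.95, rf=0.001), 4))
```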
Abstract:
The present article aims to analyze the recent behavior of the real exchange rate in Brazil and its effects on investment per worker in the Brazilian manufacturing and extractive industries. Preliminary estimates presented in the article show an over-valuation of 48% in the real exchange rate in Brazil. The relation between the level (and volatility) of the real exchange rate and investment (per worker) in Brazil is analyzed by means of a panel data econometric model for 30 sectors of the Brazilian manufacturing and extractive industries. The empirical results show that the level and volatility of the real exchange rate have a strong effect on investment per worker in Brazilian industry. Finally, we conclude the article by presenting a proposal for a new macroeconomic regime that aims to produce an acceleration of economic growth in the Brazilian economy and, thereby, a catching-up process with developed countries.
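A generic fixed-effects specification of the kind described, investment per worker regressed on the level and volatility of the real exchange rate across sectors, might be written as below. The article's exact variables, controls and estimator are not given in the abstract, so this is only an illustrative form.

```latex
% Illustrative fixed-effects panel specification; the article's exact model
% (variables, controls, estimator) is not stated in the abstract.
\[
  \frac{I_{it}}{L_{it}}
  \;=\; \alpha_i
  \;+\; \beta_1\,\theta_{t}
  \;+\; \beta_2\,\sigma_{\theta,t}
  \;+\; \gamma' X_{it}
  \;+\; \varepsilon_{it},
\]
% where $i$ indexes the 30 industrial sectors, $t$ indexes time, $\theta_t$ is the
% level of the real exchange rate, $\sigma_{\theta,t}$ its volatility, and $X_{it}$
% collects sector-level controls.
```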
Abstract:
The traditional business models and the traditionally successful development methods that were distinctive of the industrial era do not satisfy the needs of modern IT companies. Due to the rapid nature of IT markets, the uncertainty of new innovations' success and the overwhelming competition with established companies, startups need to make quick decisions and eliminate wasted resources more effectively than ever before. There is a need for an empirical basis on which to build business models, as well as to evaluate the presumptions regarding value and profit. Less than ten years ago, the Lean software development principles and practices became widely known in academic circles. Those practices help startup entrepreneurs to validate their learning, test their assumptions and become more dynamic and flexible. What is special about today's software startups is that they are increasingly individual. Quantitative research studies are available regarding the details of Lean startups. Broad research with hundreds of companies presented in a few charts is informative, but a detailed study of fewer examples gives an insight into the way software entrepreneurs see the Lean startup philosophy and how they describe it in their own words. This thesis focuses on Lean software startups' early phases, namely Customer Discovery (discovering a valuable solution to a real problem) and Customer Validation (being in a good market with a product which satisfies that market). The thesis first offers a sufficiently compact insight into the Lean software startup concept for a reader who is not previously familiar with the term. The Lean startup philosophy is then put to a real-life test, based on interviews with four Finnish Lean software startup entrepreneurs. The interviews reveal 1) whether the Lean startup philosophy is actually valuable for them, 2) how the theory can be practically implemented in real life and 3) whether theoretical Lean startup knowledge compensates for a lack of entrepreneurship experience. The reader becomes familiar with the key elements and tools of Lean startups, as well as their mutual connections. The thesis explains why Lean startups waste less time and money than many other startups. The thesis, especially its research sections, aims at providing data and analysis simultaneously.