905 results for Negative dimensional integration method (NDIM)
Abstract:
Frontier and emerging economies have implemented policies aimed at liberalizing their equity markets. Equity market liberalization opens the domestic equity market to foreign investors and also paves the way for domestic investors to invest in foreign equity securities. Among other things, equity market liberalization yields diversification benefits. It also lowers the cost of equity capital, because investors require a lower rate of return, and it allows foreign and local investors to share potential risks. Liberalized equity markets become more liquid as well, since more investors are available to trade. Equity market liberalization results in financial integration, which explains the co-movement of markets. In crisis periods, increased volatility and co-movement between two markets may result in what is termed contagion effects. In Africa, major moves toward financial liberalization generally started in the late 1980s, with South Africa as the pioneer. Over the years, researchers have studied the impact of financial liberalization on Africa's economic development with diverse results; some positive, others negative and still others mixed. The objective of this study is to establish whether African stock markets are integrated with the United States (US) and world markets, and to examine the international linkages between African, US and world markets. A bivariate VAR-GARCH-BEKK model is employed. The effect of thin trading is removed through a series of econometric data-purification steps, because thin trading, also known as non-trading or inconsistent trading, is a prominent feature of African markets and may produce inconsistent and biased results. The study confirms the widely established result that the South African and Egyptian stock markets are highly integrated with the US and world markets. Interestingly, the study adds to knowledge in this research area by establishing that the Kenyan market is also highly integrated with the US and world markets and that it receives and exports past innovations as well as shocks to and from those markets.
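As a rough illustration of the estimation machinery behind such spillover studies, the sketch below implements the bivariate BEKK(1,1) conditional-covariance recursion in Python with NumPy. It is a minimal sketch, not the authors' code: the parameter matrices and the synthetic return residuals are assumptions chosen only to show the mechanics of how off-diagonal terms carry shock and volatility spillovers between two markets.

```python
import numpy as np

def bekk_covariances(residuals, C, A, B):
    """Bivariate BEKK(1,1) recursion: H_t = C'C + A' e_{t-1} e_{t-1}' A + B' H_{t-1} B.

    residuals : (T, 2) array of mean-equation (VAR) residuals
    C         : (2, 2) upper-triangular intercept parameter matrix
    A, B      : (2, 2) ARCH and GARCH parameter matrices
    Returns the (T, 2, 2) series of conditional covariance matrices.
    """
    T = residuals.shape[0]
    H = np.zeros((T, 2, 2))
    H[0] = np.cov(residuals, rowvar=False)          # start from the sample covariance
    intercept = C.T @ C
    for t in range(1, T):
        e = residuals[t - 1].reshape(2, 1)
        H[t] = intercept + A.T @ (e @ e.T) @ A + B.T @ H[t - 1] @ B
    return H

# Illustrative use with made-up parameter values (not estimates from the study):
rng = np.random.default_rng(0)
resid = rng.standard_normal((500, 2)) * 0.01        # e.g. an African index vs. a US/world index
C = np.array([[0.003, 0.001], [0.0, 0.002]])
A = np.array([[0.25, 0.05], [0.03, 0.20]])          # off-diagonals carry shock spillovers
B = np.array([[0.90, 0.02], [0.01, 0.92]])          # off-diagonals carry volatility spillovers
H = bekk_covariances(resid, C, A, B)
print(H[-1])                                        # last conditional covariance matrix
```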
Abstract:
Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information and offering better information visibility to business ecosystem actors. The product, component and raw-material flows in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the parallel service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows increase the speed of processes and thus provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBEs in which information logistics integration has a significant role as a value driver. However, traditional economic and computing theories do not treat digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks for exploring digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not yet part of companies' strategic processes. In this thesis, we have developed and tested a framework to explore the digital business ecosystems that were developed and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted numerous interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which estimated cost savings, and on Real Option Valuation, which sought an optimal investment program at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in understanding information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we designed the core information model for B2B integration.
We built this quantitative analysis using the Monte Carlo-based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, in which the current literature needs to be improved. The research was carried out with high-level experts and managers responsible for global business network B2B integration. However, it was dominated by one industry domain, and therefore a more comprehensive exploration should be undertaken to cover a larger population of business sectors. Building on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to collaboration issues in integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
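A minimal sketch of the kind of Monte Carlo cost-savings analysis described above is given below. Every distribution and figure (transaction volumes, per-transaction handling costs, automation share) is an assumption chosen for illustration and is not a value from the study; the point is only to show how uncertain savings drivers are sampled and summarized.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000                                           # number of simulated scenarios

# Assumed distributions for the drivers of B2B-integration savings (illustrative only):
transactions_per_year = rng.normal(120_000, 15_000, N).clip(min=0)
manual_cost_per_tx    = rng.triangular(2.0, 3.5, 6.0, N)   # EUR, manual handling
automated_cost_per_tx = rng.triangular(0.2, 0.4, 0.9, N)   # EUR, after integration
automation_share      = rng.uniform(0.5, 0.9, N)           # share of flows automated

annual_savings = (transactions_per_year * automation_share
                  * (manual_cost_per_tx - automated_cost_per_tx))

print(f"mean annual savings : {annual_savings.mean():,.0f} EUR")
print(f"5th-95th percentile : {np.percentile(annual_savings, [5, 95]).round(0)}")
```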
Abstract:
Although echocardiography has been used in rats, few studies have determined its efficacy for estimating myocardial infarct size. Our objective was to estimate myocardial infarct size and to evaluate anatomic and functional variables of the left ventricle. Myocardial infarction was produced in 43 female Wistar rats by ligature of the left coronary artery. Echocardiography was performed 5 weeks later to measure left ventricular diameter and transverse area (mean of 3 transverse planes), infarct size (percentage of the arc with infarct on 3 transverse planes), systolic function by fractional area change, and diastolic function by mitral inflow parameters. The histologic measurement of myocardial infarct size was similar to the echocardiographic measurement. Myocardial infarct size ranged from 4.8 to 66.6% when determined by histology and from 5 to 69.8% when determined by echocardiography, with good correlation (r = 0.88; P < 0.05; Pearson correlation coefficient). Left ventricular diameter and mean diastolic transverse area correlated with myocardial infarct size by histology (r = 0.57 and r = 0.78; P < 0.0005). The fractional area change ranged from 28.5 ± 5.6% (large-size myocardial infarction) to 53.1 ± 1.5% (control) and correlated with myocardial infarct size by echocardiography (r = -0.87; P < 0.00001) and histology (r = -0.78; P < 0.00001). The E/A wave ratio of mitral inflow velocity for animals with large-size myocardial infarction (5.6 ± 2.7) was significantly higher than for all others (control: 1.9 ± 0.1; small-size myocardial infarction: 1.9 ± 0.4; moderate-size myocardial infarction: 2.8 ± 2.3). There was good agreement between echocardiographic and histologic estimates of myocardial infarct size in rats.
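The agreement between the two measurements was quantified with the Pearson correlation coefficient; a minimal sketch of that computation is shown below using hypothetical paired values, not the study's data (the real analysis covered 43 animals with infarct sizes of roughly 5-70%).

```python
from scipy.stats import pearsonr

# Hypothetical paired infarct-size measurements (% of left ventricle), echo vs. histology:
echo      = [5.0, 12.4, 23.1, 34.8, 45.2, 55.9, 69.8]
histology = [4.8, 11.9, 25.0, 33.2, 47.1, 52.4, 66.6]

r, p = pearsonr(echo, histology)          # correlation between the two methods
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```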
Abstract:
The reduction of greenhouse gas emissions in the European Union promotes the combustion of biomass rather than fossil fuels in energy production. Circulating fluidized bed (CFB) combustion offers a simple, flexible and efficient way to utilize untreated biomass on a large scale. CFB furnaces are modeled in order to understand their operation better and to help in the design of new furnaces. Physically accurate models are therefore needed to describe the heavily coupled multiphase flow, reactions and heat transfer inside the furnace. This thesis presents a new model for the fuel flow inside the CFB furnace, which accounts for the physical properties of the fuel and the multiphase flow phenomena inside the furnace. The model is applied with special interest in the firing of untreated biomass. An experimental method is used to characterize gas-fuel drag force relations. This characteristic drag force approach is developed into a gas-fuel drag force model suitable for irregular, non-spherical biomass particles and applied, together with the new fuel flow model, in the modeling of a large-scale CFB furnace. The model results are physically valid and correspond very well with measurements from a large-scale CFB furnace firing biomass. With the methods and models presented in this work, the fuel flow field inside a circulating fluidized bed furnace can be modeled more accurately and more efficiently than in previous studies with a three-dimensional holistic model frame.
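For orientation only, the sketch below computes the gas-particle drag force on a single particle using the standard Schiller-Naumann sphere correlation. This is not the characteristic drag model developed in the thesis for irregular, non-spherical biomass particles; that model would replace the drag coefficient used here with the experimentally characterized one.

```python
import numpy as np

def drag_force(u_gas, u_particle, d_p, rho_g=1.2, mu_g=1.8e-5):
    """Gas-particle drag on a single (assumed spherical) particle.

    Uses the standard Schiller-Naumann correlation,
    C_D = 24/Re * (1 + 0.15 Re^0.687) for Re < 1000 and C_D = 0.44 above.
    A shape-corrected (characteristic) drag law would be needed for real biomass.
    """
    u_rel = u_gas - u_particle
    Re = rho_g * abs(u_rel) * d_p / mu_g
    if Re < 1e-12:
        return 0.0
    Cd = 24.0 / Re * (1.0 + 0.15 * Re**0.687) if Re < 1000 else 0.44
    area = np.pi * d_p**2 / 4.0                     # projected frontal area
    return 0.5 * rho_g * Cd * area * abs(u_rel) * u_rel

# Example: a 5 mm particle lagging a 6 m/s gas flow (illustrative numbers)
print(f"{drag_force(6.0, 1.0, 5e-3):.4e} N")
```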
Abstract:
The present report describes the development of a technique for automatic wheezing recognition in digitally recorded lung sounds. The method is based on extracting and processing spectral information from the respiratory cycle and using these data for user feedback and automatic recognition. The respiratory cycle is first pre-processed to normalize its spectral information, and its spectrogram is then computed. The spectrogram image is then processed by a two-dimensional convolution filter and a half-threshold in order to increase the contrast and isolate its highest-amplitude components, respectively. To generate more compressed data for automatic recognition, the spectral projection of the processed spectrogram is computed and stored as an array. The highest-magnitude values of the array and their respective spectral values are then located and used as inputs to a multi-layer perceptron artificial neural network, which produces an automatic indication of the presence of wheezes. For validation of the methodology, lung sounds recorded from three different repositories were used. The results show that the proposed technique achieves 84.82% accuracy in the detection of wheezing for an isolated respiratory cycle and 92.86% accuracy when detection is carried out using groups of respiratory cycles obtained from the same person. The system also presents the original recorded sound and the post-processed spectrogram image for users to draw their own conclusions from the data.
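A minimal sketch of the described pipeline (spectrogram, 2-D convolution filter, half-threshold, spectral projection, peak selection, MLP classifier) is given below. The window sizes, smoothing kernel, feature count and the synthetic training data are assumptions for illustration, not the parameters or recordings used in the study.

```python
import numpy as np
from scipy.signal import spectrogram, convolve2d
from sklearn.neural_network import MLPClassifier

def wheeze_features(cycle, fs=8000, n_peaks=4):
    """Feature extraction loosely following the described pipeline:
    spectrogram -> 2-D convolution -> half-max threshold ->
    frequency-axis projection -> strongest peak magnitudes and frequencies."""
    cycle = cycle / (np.max(np.abs(cycle)) + 1e-12)           # amplitude normalization (simplified)
    f, t, S = spectrogram(cycle, fs=fs, nperseg=256, noverlap=128)
    S = convolve2d(S, np.ones((3, 3)) / 9.0, mode="same")     # 2-D smoothing filter
    S[S < 0.5 * S.max()] = 0.0                                # half-threshold
    projection = S.sum(axis=1)                                # collapse the time axis
    idx = np.argsort(projection)[-n_peaks:]                   # strongest spectral components
    return np.concatenate([projection[idx], f[idx]])

# Illustrative training on synthetic cycles (the real work used recorded lung sounds):
rng = np.random.default_rng(0)
X = np.array([wheeze_features(rng.standard_normal(8000)) for _ in range(40)])
y = rng.integers(0, 2, 40)                                    # 1 = wheeze present
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
print(clf.predict(X[:5]))
```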
Abstract:
Solar and wind power produce electricity irregularly. This irregular production is problematic because it does not always match demand; at times production exceeds the need, so sufficient energy storage solutions are required. Current storage options, such as flywheels, are mostly short-term. Power-to-Gas (P2G) offers a solution for storing energy as synthetic natural gas, and it also improves a nation's energy self-sufficiency. Power-to-Gas can be integrated into an industrial or a municipal facility to reduce production costs. This master's thesis studies the integration of Power-to-Gas technologies into wastewater treatment as part of VTT's Neo-Carbon Energy project. Power-to-Gas produces synthetic natural gas (SNG) from water and carbon dioxide using electricity; this SNG can be regarded as stored energy. Basic wastewater treatment technologies and the production of biogas in the treatment plant are reviewed, as is the utilisation of biogas and SNG in heat and power production and in transportation. The integration of P2G into a wastewater treatment plant (WWTP) is examined mainly from an economic point of view. First the mass flows of the material streams are calculated, and then the economic impact is evaluated based on these mass flows. The economic efficiency is assessed with the Net Present Value method. The thesis also studies the overall profitability of the integration and the key economic factors.
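The Net Present Value evaluation mentioned above reduces to discounting the yearly net cash flows of the integrated plant against the initial investment. A minimal sketch follows; all monetary figures are assumed for illustration and are not taken from the thesis.

```python
def npv(rate, cashflows):
    """Net present value of a cashflow series; cashflows[0] is the year-0 investment (negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Illustrative P2G-WWTP integration case (all figures assumed, not from the thesis):
investment     = -2_500_000      # EUR, electrolyser + methanation unit
annual_netflow = 220_000         # EUR/a, SNG sales and avoided costs minus O&M and electricity
lifetime_years = 20
discount_rate  = 0.06

cashflows = [investment] + [annual_netflow] * lifetime_years
print(f"NPV = {npv(discount_rate, cashflows):,.0f} EUR")   # positive NPV => profitable integration
```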
Integration of marketing research data in new product development. Case study: Food industry company
Abstract:
The aim of this master's thesis is to provide a real-life example of how marketing research data is used by different functions in the NPD process. In order to achieve this goal, a case study was conducted in a company, examining the gathering, analysis, distribution and synthesis of marketing research data in NPD. The main research question was formulated as follows: How is marketing research data integrated and used by different company functions in the NPD process? The theoretical part of the thesis focuses on the role of the marketing function in NPD, the use of marketing research particularly in the food industry, and issues related to the marketing/R&D interface during the NPD process. The empirical part is based on qualitative explanatory case study research. Individual in-depth interviews with company representatives, company documents and online research were used for data collection and analysed through triangulation. The empirical findings suggest that the most important marketing data sources at the concept generation stage of NPD are global trend monitoring, retail audits and consumer insights. These data sets are crucial for establishing the market potential of the product and defining the desired features of the new product to be developed. The findings also illustrate an example of successful cross-functional communication during the NPD process, with both formal and informal communication patterns. General managerial recommendations are given on integrating strategy, process, continuous improvement and motivated cross-functional product development teams in NPD.
Abstract:
An accurate, reliable and fast multianalyte/multiclass ultra-performance liquid chromatography–tandem mass spectrometry (UPLC–MS/MS) method was developed and validated for the simultaneous analysis of 23 pharmaceuticals belonging to different classes (amphenicols, sulfonamides, tetracyclines) in honey samples. The method consists of ultrasonic extraction followed by UPLC–ESI–MS/MS with electrospray ionization in both positive and negative mode. The influence of the extraction solvents and mobile phase composition on the sensitivity of the method, and the optimum sample weight and extraction temperature in terms of analyte recovery, were studied extensively. The antibiotics are identified by chromatographic separation on an Acquity BEH C18 (100 mm x 2.1 mm, 1.7 µm) analytical column with gradient elution of the mobile phases, combined with tandem mass spectrometry using electrospray ionization. Finally, the method was applied to the determination of the target analytes in honey samples obtained from local markets and several beekeepers in Muğla, Turkey. While ultrasonic extraction of pharmaceuticals from honey followed by UPLC–ESI–MS/MS is a well-established technique, the uniqueness of this study lies in the simultaneous determination of a remarkable number of compounds, 23 drugs, at the sub-nanogram-per-kilogram level.
Abstract:
Subshifts are sets of configurations over an infinite grid defined by a set of forbidden patterns. In this thesis, we study two-dimensional subshifts of finite type (2D SFTs), where the underlying grid is Z² and the set of forbidden patterns is finite. We are mainly interested in the interplay between the computational power of 2D SFTs and their geometry, examined through the concept of expansive subdynamics. 2D SFTs with expansive directions form an interesting and natural class of subshifts that lie between dimensions 1 and 2. An SFT that has only one non-expansive direction is called extremely expansive. We prove that in many aspects, extremely expansive 2D SFTs display the totality of behaviours of general 2D SFTs. For example, we construct an aperiodic extremely expansive 2D SFT and we prove that the emptiness problem is undecidable even when restricted to the class of extremely expansive 2D SFTs. We also prove that every Medvedev class contains an extremely expansive 2D SFT and we provide a characterization of the sets of directions that can be the set of non-expansive directions of a 2D SFT. Finally, we prove that for every computable sequence of 2D SFTs with an expansive direction, there exists a universal object that simulates all of the elements of the sequence. We use the so-called hierarchical, self-simulating or fixed-point method for constructing 2D SFTs, which has been previously used by Gács, Durand, Romashchenko and Shen.
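For readers unfamiliar with the terminology, the standard definitions of a 2D SFT and of an expansive direction are sketched below in the usual notation, which may differ in detail from the conventions of the thesis.

```latex
% Standard definitions (notation may differ slightly from the thesis).
% A 2D subshift of finite type over a finite alphabet $A$ and a finite set $F$ of forbidden patterns:
\[
X_F \;=\; \{\, x \in A^{\mathbb{Z}^2} \;:\; \text{no pattern of } F \text{ occurs in } x \,\} .
\]
% Expansiveness in a direction (Boyle--Lind expansive subdynamics):
A direction $\ell$ (a line through the origin of $\mathbb{R}^2$) is \emph{expansive}
for $X_F$ if there is a radius $r$ such that for all $x, y \in X_F$,
\[
x|_{\{v \,:\, d(v,\ell) \le r\}} = y|_{\{v \,:\, d(v,\ell) \le r\}}
\;\Longrightarrow\; x = y ,
\]
i.e.\ the contents of a sufficiently thick strip around $\ell$ determine the whole
configuration. An SFT with exactly one non-expansive direction is what the thesis
calls extremely expansive.
```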
Abstract:
The goal of this study was to explore how customers' life-related negative emotions affect the real estate business. This was divided into three research questions: 1. What life-related negative emotions can be recognised in real estate customer encounters? 2. How do the recognised emotions affect customer encounters and the realtor's work? 3. How can the realtor take the emotions into account in customer service? The theoretical background consists of two main lines of study: emotions and customer encounters. A wide literature review of emotions research was conducted from a cognitive psychology point of view, focusing on negative emotions; emotions research was then combined with the field of customer encounters. A qualitative study was chosen as the methodological basis. The empirical material was collected through in-depth interviews with 13 successful Finnish real estate agents, and narrative research was used as the method of analysis. Four life-related emotion categories were recognized in real estate customer encounters: sadness, anger, anxiety and shame. These emotions arose from issues ranging from the death of a close one to divorce, and from major changes in life stages to deep emotional attachment to an old home. The study also found that these incidental negative emotions do affect customer encounters and realtors' work. The emotions affected customers' decision making and sometimes overshadowed reason. Some emotions made the customer passive and slow to make any decisions, while others made their decision making fast and hasty. Even though the incidental emotions might have had nothing to do with the real estate deal itself, they could affect the outcome of the customer encounter and the whole deal. Interestingly, the study found that not all successful real estate agents knowingly serve customers on an emotional level. The study does, however, suggest that it may in fact be an ethical decision of the customer server to take the emotional state of the customer into account. Attending to the emotional side of customers not only increases the pleasantness of the customer encounter, but may also improve and balance customer decision making and prevent hasty decisions, possibly leading to improved customer satisfaction. The study also gives practical managerial implications for customer service providers on how negative incidental emotions can be attended to in a customer encounter. This study could be useful not only to real estate agents, but also in other types of customer service, especially with vulnerable populations or other types of home-related business.
Abstract:
Adenoviral vectors are currently the most widely used gene therapy vectors, but their inability to integrate into host chromosomal DNA shortens their transgene expression and limits their use in clinical trials. In this project, we initially planned to develop a technique to test the effect of the early region 1 (E1) on adenovirus integration by comparing the integration efficiencies of an E1-deleted adenoviral vector (SubE1) and an E1-containing vector (SubE3). However, we did not harvest any SubE3 virus, even though we repeated the transfection and successfully rescued the SubE1 virus (2/4 transfections generated viruses) and the positive control virus (6/6). The failure to rescue SubE3 could be caused by the instability of the genomic plasmid pFG173, as it showed frequent internal deletions during purification. We therefore developed techniques to test the effect of E1 on homologous recombination (HR), since the literature suggests that adenovirus integration is initiated by HR. We attempted to silence E1 in 293 cells by transfecting E1A/B-specific small interfering RNA (siRNA). However, no silenced phenotype was observed, even when we varied the concentration of E1A/B siRNA (from 30 nM to 270 nM) and checked the silencing effects at different time points (48, 72, 96 h). One possible explanation is that the E1A/B siRNA sequences are not potent enough to induce the silenced phenotype. For evaluating HR efficiencies, an HR assay system based on bacterial transformation was designed. We constructed two plasmids (designated pUC19-dl1 and pUC19-dl2) containing different defective lacZa cassettes (forming white colonies after transformation) that can generate a functional lacZa cassette (forming blue colonies) through HR after transfection into 293 cells. HR efficiency would be expressed as the percentage of blue colonies among all colonies. Unfortunately, after transformation of plasmid isolated from 293 cells, no colony was found, even at a transformation efficiency of 1.8x10^ colonies/pg pUC19, suggesting that the sensitivity of this system was low. To enhance the sensitivity, PCR was used. We designed a set of primers that amplify only the recombinant plasmid formed through HR. The HR efficiencies among different treatments can therefore be evaluated from the amplification results, and this system could be used to test the effect of the E1 region on adenovirus integration. In addition, to our knowledge there were no previous studies using PCR/real-time PCR to evaluate HR efficiency, so this system also provides a PCR-based method for carrying out HR assays.
Abstract:
Part I - Fluorinated Compounds. A method has been developed for the extraction, concentration, and determination of two unique fluorinated compounds from the sediments of Lake Ontario. These compounds originated from a common industrial landfill and were carried to Lake Ontario by the Niagara River. Sediment samples from the Mississauga basin of Lake Ontario were evaluated for these compounds and a depositional trend was established. The sediments were extracted by accelerated solvent extraction (ASE), then underwent clean-up, fractionation and solvent exchange, and were concentrated by reduction under nitrogen gas. The concentrated extracts were analyzed by gas chromatography - electron capture negative ionization - mass spectrometry. The depositional profile determined here reflects the operation of the landfill and shows that these compounds are still found at concentrations well above background levels. These increased levels have been attributed to physical disturbance of previously deposited contaminated sediments and to probable continued leaching from the dumpsite. Part II - Polycyclic Aromatic Hydrocarbons. Gas chromatography/mass spectrometry is the most common method for the determination of polycyclic aromatic hydrocarbons (PAHs) from various matrices. Mass discrimination of high-boiling compounds in gas chromatographic methods is well known, and the use of high-boiling injection solvents shows a substantial increase in the response of late-eluting peaks; these solvents transfer solutes from the injector to the analytical column more efficiently. The effect of 1-butanol, 1-pentanol, cyclopentanol, 1-hexanol, toluene and n-octane as injection solvents was studied. Higher-boiling solvents yield increased response for all PAHs. 1-Hexanol is the best solvent in terms of PAH response, but in this solvent PAHs were more susceptible to chromatographic problems such as peak splitting and tailing. Toluene was found to be the most forgiving solvent in terms of peak symmetry and response: it offered the smallest discrepancies in response and symmetry over a wide range of initial column temperatures.
Abstract:
This thesis applies x-ray diffraction to measure the membrane structure of lipopolysaccharides and to develop a better model of an LPS bacterial membrane that can be used for biophysical research on antibiotics that attack cell membranes. We have modified the Physics department x-ray machine for use as a thin-film diffractometer, and have designed a new temperature- and relative-humidity-controlled sample cell. We tested the sample cell by measuring the one-dimensional electron density profiles of bilayers of POPE with 0%, 1%, 10%, and 100% by weight lipopolysaccharide from Pseudomonas aeruginosa. Background: We now know that traditional antibiotics are losing their effectiveness against ever-evolving bacteria. This is because traditional antibiotics work against specific targets within the bacterial cell, and with genetic mutations over time, the antibiotic no longer works. One possible solution is antimicrobial peptides. These are short proteins that are part of the immune systems of many animals, and some of them attack bacteria directly at the membrane of the cell, causing the bacterium to rupture and die. Since the membranes of most bacteria share common structural features, and these features are unlikely to evolve very much, these peptides should effectively kill many types of bacteria without much evolved resistance. But why do these peptides kill bacterial cells, but not the cells of the host animal? For gram-negative bacteria, the most likely reason is that their outer membrane is made of lipopolysaccharides (LPS), which is very different from an animal cell membrane. Up to now, what we know about how these peptides work was likely learned with phospholipid models of animal cell membranes, and not with the more complex lipopolysaccharides. If we want to make better peptides, ones that we can use to fight all types of infection, we need a more accurate molecular picture of how they work. This will hopefully be one step forward to the design of better treatments for bacterial infections.
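The one-dimensional electron density profiles mentioned above are typically reconstructed from the lamellar diffraction orders by a Fourier sum; the standard form is sketched below, with conventions (scaling, phase determination) that may differ from those used in the thesis.

```latex
% One-dimensional Fourier reconstruction of the bilayer electron density profile
% (standard form for lamellar diffraction from oriented multilayers).
\[
\rho(z) \;\propto\; \sum_{h=1}^{h_{\max}} \nu_h \, |F_h| \,
\cos\!\left(\frac{2\pi h z}{d}\right),
\qquad \nu_h \in \{+1,-1\},
\]
where $d$ is the lamellar repeat spacing, $|F_h|$ is the form-factor amplitude
obtained from the Lorentz-corrected integrated intensity of the $h$-th diffraction
order, and the phases $\nu_h$ (a centrosymmetric bilayer gives real form factors)
are assigned, for example, from swelling experiments at several relative humidities.
```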
Abstract:
An abundant literature has demonstrated the benefits of empathy for intergroup relations (e.g., Batson, Chang, Orr, & Rowland, 2002). In addition, empathy has been identified as the mechanism by which various successful prejudice-reduction procedures impact attitudes and behaviour (e.g., Costello & Hodson, 2010). However, standard explicit techniques used in empathy-prejudice research have a number of potential limitations (e.g., resistance; McGregor, 1993). The present project explored an alternative technique: subliminally priming (i.e., outside of awareness) empathy-relevant terms (Study 1), or empathy itself (Study 2). Study 1 compared the effects of exposure to subliminal empathy-relevant primes (e.g., compassion) versus no priming and priming the opposite of empathy (e.g., indifference) on prejudice (i.e., negative attitudes), discrimination (i.e., resource allocation), and helping behaviour (i.e., willingness to empower, directly assist, or expect group change) towards immigrants. Relative to priming the opposite of empathy, participants exposed to primes of empathy-relevant constructs expressed less prejudice and were more willing to empower immigrants. In addition, the effects were not moderated by individual differences in prejudice-relevant variables (i.e., Disgust Sensitivity, Intergroup Disgust-Sensitivity, Intergroup Anxiety, Social Dominance Orientation, Right-wing Authoritarianism). Study 2 considered a different target category (i.e., Blacks) and attempted to strengthen the effects by comparing the impact of subliminal empathy primes (relative to no prime or subliminal primes of empathy paired with Blacks) on explicit prejudice towards marginalized groups and Blacks, willingness to help marginalized groups and Blacks, and implicit prejudice towards Blacks. In addition, Study 2 considered potential mechanisms for the predicted effects; specifically, general empathy, affective empathy towards Blacks, cognitive empathy towards Blacks, positive mood, and negative mood. Unfortunately, using subliminal empathy primes "backfired", such that exposure to subliminal empathy primes (relative to no prime) heightened prejudice towards marginalized groups and Blacks, and led to stronger expectations that marginalized groups and Blacks improve their own situation. However, exposure to subliminal primes pairing empathy with Blacks (relative to subliminal empathy primes alone) resulted in less prejudice towards marginalized groups and more willingness to directly assist Blacks, as expected. Interestingly, exposure to subliminal primes of empathy paired with Blacks (vs. empathy alone) resulted in more pro-White bias on the implicit prejudice measure. Study 2 did not find that the potential mediators measured explained the effects found. Overall, the results of the present project do not provide strong support for the use of subliminal empathy primes for improving intergroup relations. In fact, the results of Study 2 suggest that the use of subliminal empathy primes may even backfire. The implications for intergroup research on empathy and priming procedures generally are discussed.
Abstract:
Since the adoption of the special education policy in 1999, the ministère de l'Éducation, du Loisir et du Sport has put in place an action plan to enable the integration of students with disabilities or with adaptation or learning difficulties. The integration of students with disabilities has been the subject of several studies; however, few have focused on the integration of students with a hearing impairment who use Cued Speech (langage parlé complété). The purpose of this study is to shed light on the perceptions of students with a hearing impairment regarding the use of Cued Speech in a school integration context. The concepts used in this thesis relate to deafness, to students with a hearing impairment, to the communication modes used with these students, in particular Cued Speech, and to the Quebec school integration context. The research is exploratory and the method used is qualitative. Five students with a hearing impairment, aged between 12 and 17, took part in a semi-structured interview. The results of these interviews show that these students hold both positive and negative perceptions of the use of Cued Speech in an integration context. Although in general this does not hinder their academic integration, the negative perceptions relate more to social integration than to academic integration.