934 results for Embedded derivative
Abstract:
Globalization and interconnectedness in the worldwide sphere have changed the prevailing modus operandi of organizations around the globe and have challenged existing practices along with the business-as-usual mindset. There are no rules for creating a competitive advantage and positioning within an unstable, constantly changing and volatile globalized business environment. The financial industry, the locomotive or flagship industry of the global economy, especially in the aftermath of the financial crisis, has reached a certain point trying to recover and redefine its strategic orientation and positioning within the global business arena. Innovation has always been a trend and a buzzword, and has been considered by many as the ultimate answer to any kind of problem. The mantra "Innovate or Die" has prevailed in organizational entities in a sometimes ruthless endeavour to develop cutting-edge products and services and capture a landmark position in the market. The emerging shift from a closed to an open innovation paradigm has been considered a new operational mechanism within the management and leadership of the company of the future. In this respect, open innovation research has experienced tremendous growth by putting forward a new way of exchanging and using surplus knowledge in order to sustain innovation within organizations and at the level of the industry. In this reality, there seems to be something missing: the human element. This research, by going beyond the traditional narratives of open innovation, aims at making an innovative theoretical and managerial contribution developed and grounded in the ongoing discussion regarding the individual and organizational barriers to open innovation within the financial industry.
By working across disciplines and drawing on primary data, it debunks the myth that open innovation is solely a knowledge inflow and outflow mechanism and sheds light on why and how organizational open innovation works by illuminating the broader dynamics and underlying principles of this fascinating paradigm. Little attention has been given to the role of the human element; the foundational prerequisite of trust encapsulated within the precise and fundamental nature of organizing for open innovation; the organizational capabilities; the individual profiles of open innovation leaders; the definition of open innovation in the realm of the financial industry; the strategic intent of the financial industry; and the need for nurturing a societal impact for human development. In this respect, this research introduces the trust-embedded approach to open innovation as a new, insightful way of organizing for open innovation. It unveils the peculiarities of the corporate and individual spheres that act as a catalyst towards the creation of productive open innovation activities. The incentive for this research captures the fundamental question revolving around the need for financial institutions to recognise the importance of organizing for open innovation. The overarching question is why and how to create a corporate culture of openness in the financial industry, an organizational environment that can help open innovation excel. This research shares novel, cutting-edge outcomes and propositions through the prism of both theory and practice. The trust-embedded open innovation paradigm captures the norms and narratives around leading open innovation in the 21st century by cultivating a human-centricity mindset that leads to the creation of human organizations, leaving behind the dehumanization mindset currently prevailing within the financial industry.
Abstract:
Many, if not all, aspects of our everyday lives are related to computers and control. Microprocessors and wireless communications are involved in our daily lives. Embedded systems are an attractive field because they combine three key factors: small size, low power consumption and high computing capability. The aim of this thesis is to study how Linux communicates with the hardware, to answer the question of whether it is possible to use an operating system like Debian for embedded systems and, finally, to build a mechatronic real-time application. The thesis presents Linux and the Xenomai real-time patch, and analyzes the bootloader and communication with the hardware. The BeagleBone evaluation board is presented along with the application project, consisting of a robot cart with a driver circuit, a line sensor reading a black line and two XBee antennas. The application makes use of Xenomai threads in the real-time kernel. According to the obtained results, Linux is able to operate as a real-time operating system. Directions for future research in the area of embedded Linux are also discussed.
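The real-time question evaluated in such a thesis, namely whether the OS wakes a periodic task on time, can be illustrated with a simple latency measurement. The sketch below is a plain-Python analogue (hypothetical; it is not the Xenomai/C benchmark the thesis uses): it runs a periodic loop and records how late each wake-up is relative to its ideal release point.

```python
import time

def measure_jitter(period_s=0.01, cycles=50):
    """Run a periodic loop and record how far each wake-up drifts from
    its ideal deadline (a simplified analogue of a real-time latency
    test, not the thesis's actual Xenomai measurement)."""
    deadline = time.monotonic()
    jitter = []
    for _ in range(cycles):
        deadline += period_s
        # Sleep until the next ideal release point.
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        # Positive values mean the loop woke up late.
        jitter.append(time.monotonic() - deadline)
    return jitter

samples = measure_jitter()
print(f"worst-case wake-up latency: {max(samples) * 1e6:.0f} us")
```

On a stock desktop kernel the worst-case latency is unbounded, which is precisely the gap a co-kernel patch such as Xenomai is meant to close.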
Abstract:
Investing in mutual funds has become more popular than ever, and the amount of money invested in mutual funds registered in Finland has hit an all-time high. Mutual funds provide a relatively low-cost way for private investors to invest in the stock market and achieve diversified portfolios. In finance there is always a tradeoff between risk and return, where higher expected returns can usually be achieved only by taking higher risks. Diversifying the portfolio removes some of the risk, but systematic risk cannot be diversified away. These risks can be managed by hedging the investments with derivatives. The use of derivatives should improve the performance of the portfolios using them compared to the funds that do not. However, previous studies have shown that the risk exposure and return performance of derivative users do not differ considerably from those of nonusers. The purpose of this study is to examine how the use of derivatives affects the performance of equity funds. The funds studied were 155 equity funds registered in Finland in 2013. The empirical research studied the derivative use of the funds during the six-year period 2008–2013. The performance of the funds was studied quantitatively using several performance measures common in the mutual fund industry: the Sharpe ratio, Treynor ratio, Jensen's alpha, Sortino ratio, M2 and Omega ratio. The effect of derivative use on funds' performance was studied by using a dummy variable and comparing the performance measures of derivative users and nonusers. The differences in performance measures between the two groups were analyzed with statistical tests. The hypothesis was that funds' derivative use should improve their performance relative to the funds that do not use derivatives. The results of this study are in line with previous studies stating that the use of derivatives does not improve mutual funds' performance.
When performance was measured with Jensen's alpha, funds that did not use derivatives performed better than those that did. When measured with the other performance measures, the results did not differ between the two groups.
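Two of the measures named above can be sketched as follows. This is illustrative Python only: no annualization is applied, the sample standard deviation is used, and the CAPM beta is assumed to be given rather than estimated, all of which may differ from the thesis's exact computation.

```python
import statistics

def sharpe_ratio(returns, rf=0.0):
    """Mean excess return per unit of its (sample) standard deviation."""
    excess = [r - rf for r in returns]
    return statistics.mean(excess) / statistics.stdev(excess)

def jensens_alpha(fund, market, beta, rf=0.0):
    """CAPM alpha: fund excess return minus beta times market excess return."""
    return (statistics.mean(fund) - rf) - beta * (statistics.mean(market) - rf)

# Hypothetical monthly returns, purely for illustration.
fund_r = [0.02, 0.01, 0.03]
mkt_r = [0.01, 0.00, 0.02]
print(sharpe_ratio(fund_r))               # mean 0.02 over stdev 0.01
print(jensens_alpha(fund_r, mkt_r, beta=1.2))
```

A dummy-variable comparison such as the one in the study would then regress (or rank-test) these measures across the derivative-user and nonuser groups.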
Abstract:
The objective of the study is to extend the existing hedging literature on commodity price risks by investigating what kind of hedging strategies can be used by companies that use bitumen as a raw material in their production. Five alternative swap hedging strategies in the bitumen markets are empirically tested: a full hedge strategy; simple, conservative and aggressive term-structure strategies; and an implied-volatility strategy. The effectiveness of the alternative strategies is measured by excess returns compared to a no-hedge strategy. In addition, the downside risk of each strategy is measured with target absolute semi-deviation. Results indicate that none of the tested strategies outperforms the no-hedge strategy in terms of excess returns across all maturities. The best-performing aggressive term-structure strategy succeeds in creating positive excess returns only at short maturities. However, risk seems to increase hand in hand with the excess returns, so that the best-performing strategies also receive the highest risk metrics. This implies that a company willing to gain from favorable price movements must be ready to bear greater risk. Thus, no hedging strategy superior to the others is found.
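The two evaluation metrics above can be sketched as follows. This is illustrative only: the semi-deviation here averages shortfalls below the target over all periods, which is one common convention and may differ in detail from the thesis's definition.

```python
def excess_return(hedged, unhedged):
    """Per-period excess of a hedged strategy over the no-hedge benchmark."""
    return [h - u for h, u in zip(hedged, unhedged)]

def target_semideviation(returns, target=0.0):
    """Target absolute semi-deviation: average shortfall below the
    target, with non-shortfall periods contributing zero."""
    shortfalls = [target - r for r in returns if r < target]
    return sum(shortfalls) / len(returns)

# Hypothetical per-period strategy returns, purely for illustration.
strategy = [0.05, -0.02, 0.01, -0.03]
print(target_semideviation(strategy))  # average shortfall below 0
```

A higher semi-deviation for the strategies with the best excess returns is exactly the risk/return trade-off the study reports.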
Abstract:
By employing the embedded-atom potentials of Mei et al. [1], we have calculated the dynamical matrices and phonon dispersion curves for six fcc metals (Cu, Ag, Au, Ni, Pd and Pt). We have also investigated, within the quasiharmonic approximation, some other thermal properties of these metals which depend on the phonon density of states, such as the temperature dependence of the lattice constant, the coefficient of linear thermal expansion, the isothermal and adiabatic bulk moduli, the heat capacities at constant volume and constant pressure, the Grüneisen parameter and the Debye temperature. The computed results are compared with experimental findings wherever possible. The comparison shows a generally good agreement between the theoretical values and experimental data for all properties, except for discrepancies in the phonon frequencies and Debye temperature of Pd, Pt and Au. Further, we modify the parameters of this model for Pd and Pt and obtain phonon dispersion curves which are in good agreement with experimental data.
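The heat capacities above are computed from the full phonon density of states; as a simplified stand-in, the closely related Debye model (to which the quoted Debye temperature refers) can be evaluated numerically. The sketch below uses the trapezoidal rule and is illustrative only, not the embedded-atom calculation.

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def debye_cv(T, theta_D, n=2000):
    """Molar heat capacity at constant volume in the Debye model:
    Cv = 9R (T/theta)^3 * integral_0^{theta/T} x^4 e^x / (e^x - 1)^2 dx,
    integrated with the trapezoidal rule."""
    if T <= 0:
        return 0.0
    upper = theta_D / T

    def f(x):
        if x == 0.0:
            return 0.0  # integrand -> 0 as x -> 0 (behaves like x^2)
        ex = math.exp(x)
        return x**4 * ex / (ex - 1.0) ** 2

    h = upper / n
    s = 0.5 * (f(0.0) + f(upper)) + sum(f(i * h) for i in range(1, n))
    return 9.0 * R * (T / theta_D) ** 3 * h * s
```

At high temperature Cv approaches the Dulong-Petit limit of 3R; the value theta_D ≈ 343 K used in checking this is a textbook Debye temperature for Cu, assumed here for illustration.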
Abstract:
Now, more than ever, sponsors of athletic events demand to see evidence of a commercial return, such as enhanced brand awareness, for their investment of cash or non-cash resources (Lough et al., 2000). The most common way to measure the impact of perimeter signage (i.e., any billboard or sign that displays a company's brand name and/or logo and which surrounds the playing area) on spectators' awareness of event sponsors has been through the use of brand name recall and recognition tests (Shilbury & Berriman, 1996). Recall testing requires spectators to list all of the sponsors they can remember seeing at, for example, an athletic event, strictly from memory and without any help (Cuneen & Hannan, 1993). With recognition testing, spectators are required to identify sponsors from a prepared list which includes "dummy" brand names (i.e., brands that are present in the list but which do not actually sponsor the event). In order to determine whether sponsors' brand awareness objectives are being met, it is important for sport and recreation marketers to understand what influences a spectator's ability to remember (i.e., recall and/or recognize) the brand names of companies who advertise on perimeter signage. The purpose of this study was to examine the factors that influence spectators' recall and recognition of embedded sponsorship stimuli (i.e., company brand names on perimeter signage surrounding the play area) at a Canadian university's men's basketball game and football game. These factors included the number of games spectators attended over the course of the season (i.e., repeated exposure to sponsorship stimuli), spectators' level of involvement with the event, and spectators' level of involvement with the advertisements (i.e., perimeter signage).
This study also examined the differences between recall and recognition as a means of measuring spectators' awareness of sponsors, and attempted to determine if there are sport differences in spectators' recall and recognition of perimeter signage. Upon leaving the football stadium or gymnasium, spectators were approached, at random, by trained research assistants located at each exit and asked to complete a brief survey questionnaire. Respondents completed the survey on-site. A total of 358 completed surveys were collected from spectators who attended the football (N = 277) and basketball (N = 81) games. The data suggest that football and basketball respondents recognized more sponsors' brand names than they recalled. In addition, football respondents who were highly involved with the event (i.e., those individuals who viewed attending the events as fun, interesting and exciting) attended more games over the course of the season and had significantly higher brand name recognition of sponsors who advertised on perimeter signage than those individuals with low involvement with the athletic event. Football respondents who were highly involved with the sponsors' advertisements (i.e., those individuals who viewed sponsors' perimeter signage as appealing, valuable and important) had significantly higher brand name recall of event sponsors than those individuals with low involvement with these sponsors' advertisements. Repeated exposure to perimeter signage did not have a significant influence on football or basketball respondents' recall or recognition of sponsors. Finally, the data revealed that football respondents had significantly higher recall of sponsors' brand names than basketball respondents. Conversely, basketball respondents had significantly higher recognition of sponsors' brand names than did football respondents.
Abstract:
The potential of formative assessment (FA) for informing learning in classroom-based nursing courses is clearly established in the literature; however, research on FA in clinical courses remains scarce. This inquiry explored the lived experience of nursing students using transcendental phenomenology and described the phenomenon of being assessed in clinical courses. The research question guiding the study was: How is the phenomenon of assessment experienced by nursing students when FA is formally embedded in clinical courses? Inherent in this question were the following issues: (a) the meaning of clinical experiences for nursing students, (b) the meaning of being assessed through FA, and (c) what it is like to be assessed when FA is formally embedded within clinical experiences. The noematic themes that illuminated the whatness of the participants' experience were (a) enabled cognitive activity, (b) useful feedback, (c) freedom to be, (d) enhanced focus, (e) stress moderator, and (f) respectful mentorship. The noetic themes associated with how the phenomenon was experienced were related to bodyhood, temporality, spatiality, and relationship to others. The results suggest a fundamental paradigm shift from traditional nursing education to a more pervasive integration of FA in clinical courses so that students have time to learn before being graded on their practice. Furthermore, this inquiry and the literature consulted provide evidence that using cognitive science theory to inform and reform clinical nursing education is a timely option to address the repeated calls from nursing leaders to modernize nursing education. This inquiry contributes to reducing our reliance on assumptions derived from research on FA in nursing classrooms and provides evidence based on the reality of using formative assessment in clinical courses. Recommendations for future research are presented.
Abstract:
One way to approach the question of whether non-derivative partial reasons of any kind exist is to explain what partial reasons are and then to ask whether there are reasons of that kind. If such reasons exist, then it is at least possible that there are partial reasons of friendship. That is the approach I adopt here, and it yields interesting results. The first concerns the structure of partial reasons. It is at least a necessary condition for a reason to be partial that it have an explicit relational component. This component, technically speaking, is a relatum in the reason relation, which is itself a relation between the person to whom the reason applies and the person concerned by the action for which there is a reason. The second conclusion of this text is that this relational component is also required in many kinds of reasons that are accepted as impartial. In order to avoid trivializing the distinction between partial and impartial reasons, we must apply an additional sufficient condition. Finally, although it may prove possible to distinguish impartial reasons that have a relational component from partial reasons, this approach suggests that the question of whether ethics is partial or impartial will have to be settled at the level of normative ethics, or at the very least, that it cannot be settled at the level of discourse about the nature of reasons for action.
Abstract:
Thesis digitized by the Records Management and Archives Division of the Université de Montréal.
Abstract:
A new PVC membrane ion-selective electrode which is highly selective towards Ni(II) ions was constructed using a Schiff base containing a binaphthyl moiety as the ionophore. The sensor exhibited a good Nernstian response for nickel ions over the concentration range 1.0 × 10⁻¹ – 5.0 × 10⁻⁶ M with a lower limit of detection of 1.3 × 10⁻⁶ M. It has a fast response time and can be used for a period of 4 months with good reproducibility. The sensor is suitable for use in aqueous solutions over a wide pH range of 3.6 – 7.4 and works satisfactorily in the presence of 25% (v/v) methanol or ethanol. The sensor shows high selectivity for nickel ions over a wide variety of cations. It has been successfully used as an indicator electrode in the potentiometric titration of nickel ions against EDTA and also for the direct determination of nickel content in real samples: effluent samples, chocolates and hydrogenated oils.
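The "Nernstian response" above has a concrete numeric meaning: for a divalent ion such as Ni²⁺ the ideal slope is about 59.16/2 ≈ 29.6 mV per decade of activity at 25 °C. A minimal sketch, assuming ideal electrode behavior and activity ≈ concentration (both simplifying assumptions):

```python
import math

R = 8.314462618      # gas constant, J/(mol*K)
F = 96485.33212      # Faraday constant, C/mol

def nernst_slope_mV(z, T=298.15):
    """Theoretical Nernstian slope in mV per decade: 1000*RT*ln(10)/(zF)."""
    return 1000.0 * R * T * math.log(10) / (z * F)

def concentration_from_emf(E_mV, E0_mV, z=2, T=298.15):
    """Invert E = E0 + s*log10(C) for an ideal electrode.
    E0_mV is a calibration constant; activity coefficients are ignored."""
    s = nernst_slope_mV(z, T)
    return 10 ** ((E_mV - E0_mV) / s)

print(nernst_slope_mV(2))  # ~29.6 mV/decade for Ni(2+)
```

In practice the slope is fitted from a calibration curve over the electrode's linear range rather than taken as the theoretical value.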
Abstract:
Low-power optical phase conjugation in polyvinyl alcohol films embedded with saturable dyes is reported. The phase-conjugate reflectivity achieved is higher than that obtained in similar gelatin films.
Abstract:
Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly augment software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization and optimization technique assisting the embedded system designer in software debugging, making it more effective at early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm to assist the compiler in eliminating redundant bank-switching code, and in deciding on optimum data allocation to banked memory so as to minimize the number of bank-switching instructions in embedded system software, is proposed.
A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank-selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture-oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
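The redundant bank-switching elimination can be illustrated on a single straight-line execution path: track the active bank as a small state machine and drop any bank-select that re-selects the current bank. The `BSEL n` mnemonic below is hypothetical (real PIC16F87X code manipulates the RP bits of the STATUS register), and a full implementation would also have to handle branches and joins in the control flow graph:

```python
def strip_redundant_bank_selects(instructions):
    """Remove bank-select instructions that re-select the already
    active bank. Works on a single linear path; the active bank is
    unknown at entry, so the first select is always kept."""
    active = None      # currently selected memory bank
    optimized = []
    for ins in instructions:
        if ins.startswith("BSEL"):
            bank = int(ins.split()[1])
            if bank == active:
                continue        # redundant: this bank is already active
            active = bank
        optimized.append(ins)
    return optimized

# Hypothetical assembly-like listing, purely for illustration.
code = ["BSEL 1", "MOV A, X", "BSEL 1", "ADD A, Y", "BSEL 0", "MOV Z, A"]
print(strip_redundant_bank_selects(code))
```

At a control-flow join the active bank is the meet of the predecessors' states (unknown if they disagree), which is where the relation matrix described above comes in.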
Abstract:
In this paper, we have evolved a generic software architecture for a domain-specific distributed embedded system. The system under consideration belongs to the Command, Control and Communication systems domain. Systems in this domain have very long operational lifetimes, and their quality attributes are as important as their functional requirements. The main guiding principle followed in this paper for evolving the software architecture has been the functional independence of the modules. The quality attributes considered most important for the system are maintainability and modifiability. Architectural styles best suited to the functionally independent modules are proposed, with a focus on these quality attributes. The software architecture for the system is envisioned as a collection of the architectural styles of the functionally independent modules identified.