41 results for Moduli in modern mapping theory

in Doria (National Library of Finland DSpace Services) - National Library of Finland, Finland


Relevance: 100.00%

Abstract:

Book review

Relevance: 100.00%

Abstract:

Summary: Farmers' exposure to dusts and gases in modern cattle barns

Relevance: 100.00%

Abstract:

Service development ties up a considerable amount of resources and is a long-term activity. By aiming at innovativeness and through systematic product development work, a company improves the continuity of its own business. Platform thinking brings a new dimension to the development of products and services. Developing a platform to support product and service development and to simplify product/service structures gives companies additional potential, for example through shortened development times, better management of complexity, and increased cost-efficiency. More efficient operations at the company level increase the opportunities in current business sectors. Through the modelling of the service structure, service platform development leads to better control of the business and to a more systematic execution of product development. One of the most important benefits of a service platform is that the structure of the service can be described in the platform. In addition, it is important to define responsibilities in the development of the platform and to be able to model the flow of information (the interfaces) in the processes.

Relevance: 100.00%

Abstract:

The thesis examines Norman Fairclough's theory of critical discourse analysis and the criticism directed at it. The aim is to reconcile these different views and to offer solutions to one central problem of critical discourse analysis, namely the inadequacy of emancipation (the identification and resolution of social grievances). The possibilities that emerge from the theoretical part are applied to text analysis. The object of study is the text Rebuilding America's Defenses: Strategy, Forces and Resources For a New Century and, to some extent, the organization that produced it, the Project for the New American Century. These are examined above all as social phenomena and in relation to each other. The greatest problems of Fairclough's model turn out to be a traditional conception of language, according to which the abstract and internal relations of the language system are the most important, and an ideological confrontation as the starting point of critique. The former leads to an unsatisfactory ability of linguistic findings to explain social observations, and the latter to a political or ideological debate that does not allow new insights. The conclusion of the thesis is that by focusing on subject matter instead of linguistic structure, and by understanding the producer of a text as a single, bounded social actor, the analysis can gain openness and precision. Critical discourse analysis needs such a view to support its linguistic analyses and to find a new kind of relevance.

Relevance: 100.00%

Abstract:

In this work, the feasibility of floating-gate technology in analog computing platforms in a scaled-down general-purpose CMOS technology is considered. When the technology is scaled down, the performance of analog circuits tends to get worse because the process parameters are optimized for digital transistors and the scaling involves the reduction of supply voltages. Generally, the challenge in analog circuit design is that all salient design metrics, such as power, area, bandwidth and accuracy, are interrelated. Furthermore, poor flexibility, i.e. lack of reconfigurability, reuse of IP etc., can be considered the most severe weakness of analog hardware. On this account, digital calibration schemes are often required for improved performance or yield enhancement, whereas high flexibility/reconfigurability cannot be easily achieved. Here, it is discussed whether it is possible to work around these obstacles by using floating-gate transistors (FGTs), and the problems associated with the practical implementation are analyzed. FGT technology is attractive because it is electrically programmable and also features a charge-based built-in non-volatile memory. Apart from being ideal for canceling circuit non-idealities due to process variations, FGTs can also be used as computational or adaptive elements in analog circuits. The nominal gate oxide thickness in deep sub-micron (DSM) processes is too thin to support robust charge retention, and consequently the FGT becomes leaky. In principle, non-leaky FGTs can be implemented in a scaled-down process without any special masks by using “double”-oxide transistors intended for providing devices that operate with higher supply voltages than general-purpose devices. However, in practice the technology scaling poses several challenges, which are addressed in this thesis. To provide a sufficiently wide-ranging survey, six prototype chips of varying complexity were implemented in four different DSM process nodes and investigated from this perspective. The focus is on non-leaky FGTs, but the presented autozeroing floating-gate amplifier (AFGA) demonstrates that leaky FGTs may also find a use. The simplest test structures contain only a few transistors, whereas the most complex experimental chip is an implementation of a spiking neural network (SNN) comprising thousands of active and passive devices. More precisely, it is a fully connected (256 FGT synapses) two-layer SNN, in which the adaptive properties of the FGT are taken advantage of. A compact realization of Spike Timing Dependent Plasticity (STDP) within the SNN is one of the key contributions of this thesis. Finally, the considerations in this thesis extend beyond CMOS to emerging nanodevices. To this end, one promising emerging nanoscale circuit element, the memristor, is reviewed and its applicability to analog processing is considered. Furthermore, it is discussed how FGT technology can be used to prototype computation paradigms compatible with these emerging two-terminal nanoscale devices in a mature and widely available CMOS technology.
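
As a hedged illustration of the plasticity rule mentioned above, the following Python sketch implements a generic pair-based STDP weight update; the time constants, amplitudes, and function names are illustrative assumptions, not the circuit-level realization described in the thesis.

import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP: dt = t_post - t_pre in milliseconds.

    Pre-before-post (dt > 0) potentiates the synapse; post-before-pre
    (dt <= 0) depresses it. All constants are illustrative assumptions.
    """
    if dt > 0:
        return a_plus * np.exp(-dt / tau_plus)    # potentiation
    return -a_minus * np.exp(dt / tau_minus)      # depression

# Example: update a 256-synapse weight vector (matching the SNN's synapse
# count) from per-synapse spike-time differences, keeping weights in [0, 1].
rng = np.random.default_rng(0)
weights = rng.uniform(0.0, 1.0, size=256)
dts = rng.uniform(-50.0, 50.0, size=256)          # t_post - t_pre per synapse
weights = np.clip(weights + np.array([stdp_dw(d) for d in dts]), 0.0, 1.0)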

Relevance: 100.00%

Abstract:

This PhD thesis in Mathematics belongs to the field of Geometric Function Theory. The thesis consists of four original papers. The topic studied deals with quasiconformal mappings and their distortion theory in Euclidean n-dimensional spaces. This theory has its roots in the pioneering papers of F. W. Gehring and J. Väisälä published in the early 1960s, and it has been studied by many mathematicians thereafter. In the first paper we refine the known bounds for the so-called Mori constant and also estimate the distortion in the hyperbolic metric. The second paper deals with radial functions, which are simple examples of quasiconformal mappings. These radial functions lead us to the study of the so-called p-angular distance, which has been studied recently, e.g., by L. Maligranda and S. Dragomir. In the third paper we study a class of functions of a real variable considered by P. Lindqvist in an influential paper. This leads one to study parametrized analogues of the classical trigonometric and hyperbolic functions, which for the parameter value p = 2 coincide with the classical functions. Gaussian hypergeometric functions play an important role in the study of these special functions. Several new inequalities and identities involving p-analogues of these functions are also given. In the fourth paper we study the generalized complete elliptic integrals, modular functions and some related functions. We find upper and lower bounds for these functions, and those bounds are given in a simple form. This theory has a long history going back two centuries and includes names such as A. M. Legendre, C. Jacobi and C. F. Gauss. Modular functions also occur in the study of quasiconformal mappings. Conformal invariants, such as the modulus of a curve family, are often applied in quasiconformal mapping theory. These invariants can sometimes be expressed in terms of special conformal mappings, which explains why special functions often occur in this theory.
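
As a brief illustration of the p-analogues mentioned above, Lindqvist's standard definition (stated here as background, not as the thesis's own formulation) starts from the generalized inverse sine

\[
\arcsin_p(x) = \int_0^x \frac{dt}{(1-t^p)^{1/p}}, \qquad 0 \le x \le 1,\ p > 1,
\]

and defines \(\sin_p\) as its inverse on \([0, \pi_p/2]\), where \(\pi_p = 2\arcsin_p(1)\). For p = 2 these reduce to the classical \(\arcsin\), \(\sin\) and \(\pi\).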

Relevance: 100.00%

Abstract:

Decisions taken in modern organizations are often multi-dimensional, involving multiple decision makers and several criteria measured on different scales. Multiple Criteria Decision Making (MCDM) methods are designed to analyze and to give recommendations in these kinds of situations. Among the numerous MCDM methods, two large families are the methods based on multi-attribute utility theory and the outranking methods. Traditionally, both method families require exact values for technical parameters and criteria measurements, as well as for preferences expressed as weights. Often it is hard, if not impossible, to obtain exact values. Stochastic Multicriteria Acceptability Analysis (SMAA) is a family of methods designed to help in situations where exact values are not available. Different variants of SMAA allow handling all types of MCDM problems. They support defining the model through uncertain, imprecise, or completely missing values. The methods are based on simulation, which is applied to obtain descriptive indices characterizing the problem. In this thesis we present new advances in the SMAA methodology. We present and analyze algorithms for the SMAA-2 method and its extension to handle ordinal preferences. We then present an application of SMAA-2 to an area where MCDM models have not been applied before: planning elevator groups for high-rise buildings. Following this, we introduce two new methods to the family: SMAA-TRI, which extends ELECTRE TRI to sorting problems with uncertain parameter values, and SMAA-III, which extends ELECTRE III in a similar way. Efficient software implementing these two methods was developed in conjunction with this work and is briefly presented in this thesis. The thesis closes with a comprehensive survey of the SMAA methodology, including the definition of a unified framework.
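
To make the simulation idea concrete, here is a minimal Python sketch of the core SMAA-2 computation: rank acceptability indices estimated by Monte Carlo over weights drawn uniformly from the simplex. The additive utility model and all names are illustrative assumptions, not the thesis's implementation.

import numpy as np

def smaa2_acceptability(values, n_samples=10000, seed=0):
    """Estimate SMAA-2 rank acceptability indices.

    values: (m, n) matrix of n criteria measurements for m alternatives,
    scaled so that larger is better. Weights are sampled uniformly from
    the n-simplex; utility is additive (an assumption). Returns an
    (m, m) matrix whose entry [i, r] is the share of weight samples in
    which alternative i obtains rank r (0 = best).
    """
    rng = np.random.default_rng(seed)
    m, n = values.shape
    acceptability = np.zeros((m, m))
    for _ in range(n_samples):
        # Uniform sample from the weight simplex (normalized exponentials).
        w = rng.exponential(1.0, size=n)
        w /= w.sum()
        utilities = values @ w
        ranks = np.argsort(np.argsort(-utilities))  # rank of each alternative
        acceptability[np.arange(m), ranks] += 1.0
    return acceptability / n_samples

# Example: three alternatives evaluated on two criteria.
vals = np.array([[0.9, 0.2], [0.5, 0.6], [0.1, 1.0]])
print(smaa2_acceptability(vals))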

Relevance: 100.00%

Abstract:

The study focuses on international diversification from the point of view of a Finnish investor. The second objective of the study is to examine whether new covariance matrix estimators make the optimization of the minimum-variance portfolio more efficient. In addition to the ordinary sample covariance matrix, two shrinkage estimators and a flexible multivariate GARCH(1,1) model are used in the optimization. The data consist of the Dow Jones sector indices and the OMX-H portfolio index. The international diversification strategy is implemented using a sector approach, and the portfolio is optimized using twelve components. The data cover the years 1996-2005, i.e. 120 monthly observations. The performance of the constructed portfolios is measured with the Sharpe ratio. According to the results, there is no statistically significant difference between the risk-adjusted returns of the internationally diversified investments and the domestic portfolio. Nor does the use of the new covariance matrix estimators add statistically significant value compared with portfolio optimization based on the sample covariance matrix.
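
A minimal Python sketch of the kind of comparison described above, assuming scikit-learn's LedoitWolf estimator stands in for a shrinkage estimator; the GARCH(1,1) alternative is omitted and the data are simulated, so everything here is illustrative rather than the study's actual setup.

import numpy as np
from sklearn.covariance import LedoitWolf

def min_variance_weights(cov):
    """Unconstrained minimum-variance weights: w = S^-1 1 / (1' S^-1 1)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

# Example with simulated monthly returns: 120 months x 12 components,
# mirroring the dimensions mentioned in the abstract.
rng = np.random.default_rng(0)
returns = rng.normal(0.005, 0.05, size=(120, 12))

w_sample = min_variance_weights(np.cov(returns, rowvar=False))
w_shrunk = min_variance_weights(LedoitWolf().fit(returns).covariance_)
print(np.round(w_sample, 3), np.round(w_shrunk, 3), sep="\n")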

Relevance: 100.00%

Abstract:

The major objective of this thesis is to describe and analyse how a rail carrier is engaged in an intermodal freight transportation network through its role and position. Because the role as a conceptualisation has many parallels with the position, both phenomena are evaluated theoretically and empirically. VR Cargo (a strategic business unit of the Finnish railway company VR Ltd.) was chosen as the focal firm, surrounded by the actors of the focal net. Because networks are sets of relationships rather than sets of actors, it is essential to describe the dimensions of the relationships created over time, thus having a past, a present and a future. Roles are created during the long common history shared by the actors, especially where intermodal (IM) networks are considered. The presence of roles is embedded in the tasks, and the future is anchored to the expectations. Furthermore, in this study role refers to network dynamics and to incremental and radical changes in the network, in a similar way as position refers to stability and to the influences of bonded structures. The main purpose of the first part of the study was to examine how two distinctive views that have a dominant position in modern logistics, the network view (particularly the IMP-based network approach) and the managerial view (represented by Supply Chain Management), differ, especially when intermodalism is under consideration. In this study intermodalism was defined as a form of interorganisational behaviour characterized by the physical movement of unitized goods with Intermodal Transport Units, using more than one mode, as performed by the net of operators. At this stage the study relies mainly on theoretical evaluation, broadened by discussions with practitioners. This is essential, because the continuous dialogue between theory and practice is highly emphasized. Some managerial implications are discussed on the basis of the theoretical examination. A tentative model for empirical analysis in subsequent research is suggested. The empirical investigation, which relies on interviews among the members of the focal net, shows that the major role of the focal company in the network is that of the common carrier. This role has behavioural and functional characteristics, such as an executive's disclosure expressing strategic will combined with stable and predictable managerial and organisational behaviour. Most important is the notion that the focal company is neutral towards all the other operators and willing to enhance and strengthen collaboration with all the members of the IM network. This also means that all accounts are aimed at being treated equally in terms of customer satisfaction. Moreover, the adjustments intensify the adopted role. However, the focal company is also obliged to sustain its role, as it still has a government-erected right to operate the railway services on domestic tracks alone. In addition, the roles of a dominator, principal, partner, subcontractor and integrator were present, appearing either in a dyadic relationship or in a net(work) context. In order to reveal the different roles, a dualistic interpretation of the concept of role/position was employed.

Relevance: 100.00%

Abstract:

The uncertainty of any analytical determination depends on both analysis and sampling. Uncertainty arising from sampling is usually not controlled, and methods for its evaluation are still little known. Pierre Gy's sampling theory is currently the most complete theory of sampling, and it also takes the design of the sampling equipment into account. Guides dealing with the practical issues of sampling also exist, published by international organizations such as EURACHEM, IUPAC (International Union of Pure and Applied Chemistry) and ISO (International Organization for Standardization). In this work, Gy's sampling theory was applied to several cases, including the analysis of chromite concentration estimated from SEM (Scanning Electron Microscope) images and the estimation of the total uncertainty of a drug dissolution procedure. The results clearly show that Gy's sampling theory can be utilized in both of the above-mentioned cases and that the uncertainties obtained are reliable. Variographic experiments, introduced in Gy's sampling theory, are beneficially applied in analyzing the uncertainty of auto-correlated data sets such as industrial process data and environmental discharges. The periodic behaviour of these kinds of processes can be observed by variographic analysis, as well as with fast Fourier transformation and auto-correlation functions. With variographic analysis, the uncertainties are estimated as a function of the sampling interval. This is advantageous when environmental or process data are analyzed, as it is then easy to estimate how the sampling interval affects the overall uncertainty. If the sampling frequency is too high, unnecessary resources will be used; on the other hand, if the frequency is too low, the uncertainty of the determination may be unacceptably high. Variographic methods can also be utilized to estimate the uncertainty of spectral data produced by modern instruments. Since spectral data are multivariate, methods such as Principal Component Analysis (PCA) are needed in their analysis. Optimization of a sampling plan increases the reliability of the analytical process, which may in the end have beneficial effects on the economics of chemical analysis.
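
A minimal Python sketch of the variographic calculation described above: the experimental variogram V(j) of an auto-correlated process series as a function of the lag j, i.e. the sampling interval. The normalization follows the common textbook definition, V(j) = sum_i (h[i+j] - h[i])^2 / (2 (N - j)); the names and the example series are illustrative assumptions.

import numpy as np

def experimental_variogram(h, max_lag):
    """Experimental variogram V(j) for lags j = 1..max_lag.

    h: 1-D array of process measurements taken at a constant interval.
    The shape of V(j) against j reveals auto-correlation and periodic
    behaviour in the process, and hence how the sampling interval
    affects the overall uncertainty.
    """
    h = np.asarray(h, dtype=float)
    n = len(h)
    return np.array([
        np.sum((h[j:] - h[:-j]) ** 2) / (2.0 * (n - j))
        for j in range(1, max_lag + 1)
    ])

# Example: a noisy periodic "process" series sampled 500 times.
t = np.arange(500)
series = np.sin(2 * np.pi * t / 50) + np.random.default_rng(0).normal(0, 0.3, 500)
print(experimental_variogram(series, max_lag=100)[:5])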

Relevance: 100.00%

Abstract:

The focus of this thesis is to study both the technical and the economic possibilities of novel on-line condition monitoring techniques in underground low-voltage distribution cable networks. The thesis consists of a literature study of fault progression mechanisms in modern low-voltage cables, laboratory measurements to determine the basis and limitations of the novel on-line condition monitoring methods, and an economic evaluation based on fault statistics and information gathered from Finnish distribution system operators. This thesis is closely related to the master's thesis "Channel Estimation and On-line Diagnosis of LV Distribution Cabling", which focuses more on the actual condition monitoring methods and the signal theory behind them.

Relevance: 100.00%

Abstract:

This thesis discusses the basic problem of modern portfolio theory: how to find the optimal allocation for an investment portfolio. The theory provides a solution in the form of an efficient portfolio, which minimises the risk of the portfolio with respect to the expected return. A central feature of all the portfolios on the efficient frontier is that the investor needs to provide the expected return for each asset. Market anomalies are persistent patterns seen in the financial markets that cannot be explained by current asset pricing theory. The goal of this thesis is to study whether these anomalies can be observed among different asset classes and, if persistent patterns are found, whether the anomalies hold valuable information for determining the expected returns used in the portfolio optimization. Market anomalies and investment strategies based on them are studied with a rolling estimation window, where the return for the following period is always based on historical information. This is also crucial when rebalancing the portfolio. The anomalies investigated in this thesis are value, momentum, reversal, and idiosyncratic volatility. The research data include price series of country-level stock indices, government bonds, currencies, and commodities. Modern portfolio theory and the views given by the anomalies are combined by utilising the Black-Litterman model, which makes it possible to optimise the portfolio so that the investor's views are taken into account. When constructing the portfolios, the goal is to maximise the Sharpe ratio. The significance of the results is studied by assessing whether the strategies yield excess returns relative to those explained by the three-factor model. The most outstanding finding is that anomaly-based factors contain valuable information for enhancing efficient portfolio diversification. When the highest Sharpe ratios for each asset class are picked from the test factors and applied to the Black-Litterman model, the final portfolio results in a superior risk-return combination. The highest Sharpe ratios are provided by the momentum strategy for stocks and by long-term reversal for the rest of the asset classes. Additionally, a strategy based on the value effect was highly appealing, performing essentially as well as the previously mentioned Sharpe strategy. When studying the anomalies, it is found that 12-month momentum is the strongest effect, especially for stock indices. In addition, high idiosyncratic volatility seems to be positively correlated with the returns of country stock indices.
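
A compact Python sketch of the Black-Litterman posterior used above, combining equilibrium returns with anomaly-based views; the shrinkage parameter tau, the example view, and all names are illustrative assumptions rather than the thesis's calibration.

import numpy as np

def black_litterman(pi, sigma, P, q, omega, tau=0.05):
    """Posterior expected returns combining equilibrium returns pi (n,)
    with K investor views: P (K, n) picks the assets, q (K,) gives the
    viewed returns, omega (K, K) is the view uncertainty.

    E[R] = [ (tau*Sigma)^-1 + P' Omega^-1 P ]^-1
           [ (tau*Sigma)^-1 pi + P' Omega^-1 q ]
    """
    ts_inv = np.linalg.inv(tau * sigma)
    om_inv = np.linalg.inv(omega)
    a = ts_inv + P.T @ om_inv @ P
    b = ts_inv @ pi + P.T @ om_inv @ q
    return np.linalg.solve(a, b)

# Example: 3 asset classes and one momentum-style view, namely that
# asset 0 outperforms asset 2 by 2%.
pi = np.array([0.04, 0.05, 0.06])
sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.05, 0.01],
                  [0.00, 0.01, 0.06]])
P = np.array([[1.0, 0.0, -1.0]])
q = np.array([0.02])
omega = np.array([[0.001]])
print(black_litterman(pi, sigma, P, q, omega))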

Relevance: 100.00%

Abstract:

Web guides are used for the cross-directional positioning and oscillation of the web, and they are part of the web handling systems of paper production lines. A correctly designed and positioned web guide prevents harmful web shifts, reduces the amount of broke, and improves the quality of reeling. The guiding principles and the related equipment have been known for decades, although actual web guiding needs and devices have so far been limited to the narrow webs used in the printing, textile and metal industries. The aim of this work was to establish guiding principles suitable for a 5...12 metre wide paper web running in a modern coating machine, and to formulate design criteria, based on the most probable web failure modes, for two different guide constructions. A single-roll, bending web guide is suitable for locations preceded by a relatively long free web draw. A two-roll, deflecting web guide, in turn, can be placed in a considerably shorter draw. The theory of web guiding is based on the principle of a perpendicular entry angle between a non-slipping roll and the web, according to which any parallelism deviation between two rolls leads to a cross-directional displacement of the web. On the basis of this principle, the dynamic guiding situation can be approximated with static methods, and geometry dimensioning principles, based on tension changes and flutter, can be formulated for the web draws around the guide. Fast guiding requires that the motion of the web guide is a combined translation and rotation about a pivot. The guide constructions are designed so that the theoretically optimal motion paths are realized at the required guiding speed. The design is governed by the strength and vibration criteria set for the products and by the relevant machinery safety standards. The construction work follows the stages and methods of a systematic product design process.
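
As a hedged illustration of the normal entry principle mentioned above (a standard first-order approximation from web handling theory, not a derivation taken from this work): for a web running at speed V onto a non-slipping roll whose axis is misaligned by a small angle θ, the cross-directional position y of the web drifts at approximately

\[
\frac{dy}{dt} \approx V \sin\theta \approx V\,\theta,
\]

so even a small parallelism deviation between two rolls produces a steadily growing lateral shift, which the guide's combined translation and rotation is designed to counteract.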

Relevance: 100.00%

Abstract:

It is axiomatic that our planet is extensively inhabited by diverse micro-organisms such as bacteria, yet the total diversity of bacterial species is widely held to be unknown. Different bacteria can be found from the depths of the oceans to the tops of mountains; even the air is more or less colonized by bacteria. Most bacteria are either harmless or even advantageous to human beings, but there are also bacteria that can cause severe infectious diseases or spoil supplies intended for human consumption. It is therefore vitally important not only to be able to detect and enumerate bacteria but also to assess their viability and possible harmfulness. Whilst the growth of bacteria is remarkably fast under optimum conditions and easy to detect by cultural methods, most bacteria are believed to lie in the stationary phase of growth, in which actual growth has ceased and bacteria may thus simply be undetectable by cultural techniques. Additionally, several injurious factors, such as low or high temperature or a deficiency of nutrients, can turn bacteria into a viable but non-culturable (VBNC) state that cannot be detected by cultural methods. Consequently, various non-cultural techniques developed for the assessment of bacterial viability and killing have been widely exploited in modern microbiology. However, only a few methods are suitable for kinetic measurements, which enable the real-time detection of bacterial growth and viability. The present study describes alternative methods for measuring bacterial viability and killing, as well as for detecting the effects of various antimicrobial agents on bacteria, in real time. The suitability of bacterial (lux) and beetle (luc) luciferases, as well as green fluorescent protein (GFP), to act as markers of bacterial viability and cell growth was tested. In particular, a multiparameter microplate assay based on a GFP-luciferase combination and a flow cytometric measurement based on a GFP-PI combination were developed to perform divergent viability analyses. The results obtained suggest that the antimicrobial activities of various drugs against bacteria can be successfully measured using both of these methods. Specifically, the reliability of the flow cytometric viability analysis was notably improved when GFP was utilized in the assay. The fluoro-luminometric microplate assay enabled kinetic measurements, which significantly improved and accelerated the assessment of bacterial viability compared with more conventional viability assays such as plate counting. Moreover, the multiparameter assay made the simultaneous detection of GFP fluorescence and luciferase bioluminescence possible and provided extensive information about multiple cellular parameters in a single assay, thereby increasing the accuracy of the assessment of the kinetics of antimicrobial activities on target bacteria.