965 results for Time-dependent data
Abstract:
A general method for instantaneous and time-dependent serviceability analysis of plane concrete frames is presented. The methodology is based on an extension of the classical matrix formulation for bars. The main aspects influencing the behaviour of structural concrete are considered: cracking, creep, shrinkage and prestress losses. To simulate the effect of cracking, a smeared model (developed in Part II) based on the modification of the tensile branch of the concrete stress-strain relationship is adopted. The general approach permits application to different materials and constitutive laws. Sequential construction (sectional and structural), incorporation of reinforcement, consideration of the load history, placing and removing of shores, and restraining or releasing of boundary conditions are considered. Some examples are included to highlight the capabilities of the model.
Abstract:
The spread of automatic meter reading and advances in the technology used at the customer's network connection lay the foundation for a new type of interactive customer interface. It can, in part, enable a more flexible customer connection to the electricity grid as well as more real-time and more accurate measurements than at present. On this basis, various functions supporting energy efficiency, and services built on them, can be developed. The purpose of this thesis is to study the energy-efficiency-supporting functions enabled by an interactive customer interface. The most promising functions, their profitability and their potential for improving energy efficiency are analysed in more detail. In addition, the technology, measurement data and data transfer they require are examined. Current technology enables the implementation of several different functions supporting energy efficiency. This thesis examined in more detail an energy company's AMR-based balance management and price-based control for small electricity consumers. AMR-based balance management was found to be profitable when correctly targeted. Price-based control of electricity can be profitable when implemented on a large scale, but at individual sites its implementation costs are too high. The biggest obstacles to implementing functions supporting energy efficiency are often the fixed costs and the lack of common interface requirements and operating models. Product standardisation, series production and technological development can enable a considerable reduction in fixed costs and thereby improve the cost-effectiveness of these functions. By developing new common operating models and products, the available technology can be utilised more efficiently. New, faster and more reliable data transfer technologies on the horizon can also enable more real-time transmission of measurement data and signals, which often improves the efficiency and profitability of the functions.
Abstract:
Aspects of visible spectrophotometry can be presented to students using simple experiments in which the color of the crude extract of Macroptilium lathyroides (L.) Urb. is bleached in the presence of nitrite ions in acidic medium. The dependence of the absorption intensity on time, the completeness of the reaction and Beer's law can be demonstrated. Quantitative results for mineral water samples "contaminated" with nitrite ions were obtained from a method based on the Griess reaction and from a procedure based on the bleaching reaction between the crude extract and NO2- ions. Both the Griess and the bleaching reactions were found to be time dependent. Recoveries of about 100-104% were obtained with these procedures. The use of natural dyes attracted students' interest, enhancing the teaching process. Experiments performed by the teaching staff suggested that the proposed methodology can be carried out in a 4 h class, with relative errors ranging from 0.19 to 1.86% relative to the Griess method.
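The quantification step described above rests on Beer's law, A = εlc: absorbance is linear in concentration, so a calibration line turns a measured absorbance into a concentration. A minimal sketch, with hypothetical calibration values rather than the article's data:

```python
import numpy as np

# Illustrative calibration data (assumption: not the article's values)
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0])            # nitrite, mg/L
absorbance = np.array([0.00, 0.11, 0.22, 0.33, 0.44])  # measured A

# Beer's law A = epsilon * l * c: fit a line, slope = epsilon * l
slope, intercept = np.polyfit(conc, absorbance, 1)

# Quantify an "unknown" sample from its measured absorbance
a_sample = 0.275
c_sample = (a_sample - intercept) / slope
print(f"sample concentration: {c_sample:.1f} mg/L")
```

The same arithmetic underlies both the Griess and the bleaching procedures; only the sign of the response differs when a dye is bleached rather than formed.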
Abstract:
The set of initial conditions for which the pseudoclassical evolution algorithm (and minimality conservation) is verified for Hamiltonians of degrees N (N>2) is explicitly determined through a class of restrictions for the corresponding classical trajectories, and it is proved to be at most denumerable. Thus these algorithms are verified if and only if the system is quadratic except for a set of measure zero. The possibility of time-dependent a-equivalence classes is studied and its physical interpretation is presented. The implied equivalence of the pseudoclassical and Ehrenfest algorithms and their relationship with minimality conservation is discussed in detail. Also, the explicit derivation of the general unitary operator which linearly transforms minimum-uncertainty states leads to the derivation, among others, of operators with a general geometrical interpretation in phase space, such as rotations (parity, Fourier).
Abstract:
Thermodynamics of homogeneous processes, which corresponds to the very special situation in thermodynamics of continuous media, is used to discuss the first law. An important part of this work is the exposition of some typical mathematical errors, frequently found in the traditional presentation of thermodynamics. The concepts of state and process functions are discussed, as well as reverse and reversible processes, temporality and its implications on thermodynamics, energy reservoirs and symmetry. Our proposal is to present the first law by using a time dependent viewpoint coherent with mechanics and the foundations of that viewpoint.
Abstract:
A summary of the article: The accuracy of manually recorded time study data for harvester operation shown via simulator screen. Silva Fennica 42 (2008): 1, pp. 63-72.
Abstract:
The self-aggregation of pheophytin, a possible photosensitizer for Photodynamic Therapy, is resolved by formulation in a polymeric surfactant such as P-123. The incorporation of the photosensitizer was found to be time dependent, exhibiting two steps: partition at the micellar interface followed by incorporation into the micelle core. The photodynamic efficiency of the formulation was tested by bioassays against Artemia salina. To evaluate how the experimental parameters (pheophytin concentration, P-123 percentage and illumination time) influenced the death of artemia, a 2³ factorial design was chosen. Illumination time was found to be the main factor contributing to the mortality of artemia.
Abstract:
This work presents the use of potentiometric measurements for kinetic studies of the biosorption of Cd2+ ions from aqueous solutions on Eichhornia crassipes roots. The open circuit potential of the Cd/Cd2+ electrode of the first kind was measured during the biosorption process. The amount of Cd2+ ions accumulated was determined in real time. The data were fitted to different models, with the pseudo-second-order model proving to be the best at describing the data. The advantages and limitations of the proposed methodology relative to the traditional method are discussed.
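The pseudo-second-order model mentioned above is usually fitted in its linearized form, t/q_t = 1/(k·q_e²) + t/q_e, so that a straight-line fit of t/q_t against t yields the equilibrium capacity q_e and the rate constant k. A sketch with hypothetical uptake data, not the article's measurements:

```python
import numpy as np

# Hypothetical illustrative data: time (min) and adsorbed amount q_t (mg/g)
t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])
q_t = np.array([1.8, 2.9, 4.1, 5.0, 5.4, 5.7, 5.8])

# Linearized pseudo-second-order form: t/q_t = 1/(k*q_e**2) + t/q_e
# A straight-line fit of t/q_t against t gives slope = 1/q_e and
# intercept = 1/(k*q_e**2).
slope, intercept = np.polyfit(t, t / q_t, 1)
q_e = 1.0 / slope                  # equilibrium capacity (mg/g)
k = 1.0 / (intercept * q_e ** 2)  # rate constant (g/(mg*min))

print(f"q_e = {q_e:.2f} mg/g, k = {k:.4f} g/(mg*min)")
```

The quality of the straight line (its R²) is what singles out the pseudo-second-order model against alternatives such as the pseudo-first-order model.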
Abstract:
Blood flow in the human aorta is an unsteady and complex phenomenon. The complex patterns are related to geometrical features such as curvature, bends, and branching, and to the pulsatile nature of the flow from the left ventricle of the heart. The aim of this work was to understand the effect of aorta geometry on the flow dynamics. To achieve this, 3D realistic and idealized models of the descending aorta were reconstructed from Computed Tomography (CT) images of a female patient. The geometries were reconstructed using a medical image processing code. The blood flow in the aorta was assumed to be laminar and incompressible, and the blood was assumed to be a Newtonian fluid. A time-dependent pulsatile and parabolic boundary condition was imposed at the inlet. Steady and unsteady blood flow simulations were performed in the real and idealized geometries of the descending aorta using a Finite Volume Method (FVM) code. Wall Shear Stress (WSS) distributions, pressure distributions, and axial velocity profiles were analysed in both geometries under steady and unsteady conditions. The results obtained in this thesis work reveal that the idealization of the geometry underestimates the values of WSS, especially near regions with a sudden change of diameter. However, the resultant pressure and velocity in the idealized geometry are close to those in the real geometry.
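A pulsatile, parabolic inlet condition of the kind used above combines a Poiseuille-type spatial profile with a periodic modulation in time. A minimal sketch, with all numbers illustrative rather than the thesis settings:

```python
import numpy as np

# Sketch of a pulsatile, parabolic inlet velocity profile
# (all parameters illustrative, not the thesis values).
def inlet_velocity(r, t, R=0.01, u_mean=0.2, period=1.0):
    """Axial velocity (m/s) at radius r (m) and time t (s) for a tube of
    radius R: parabolic in space, sinusoidally pulsatile in time."""
    spatial = 2.0 * u_mean * (1.0 - (r / R) ** 2)          # Poiseuille shape
    temporal = 1.0 + 0.5 * np.sin(2 * np.pi * t / period)  # pulsation
    return spatial * temporal

# Centreline velocity peaks at r = 0, t = period/4
print(inlet_velocity(0.0, 0.25))
```

In an FVM solver, such a function would be evaluated at each inlet face centre and each time step to set the boundary velocity.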
Abstract:
The Fed model is a widely used market valuation model. It is often used in market analysis of the S&P 500 index as a shorthand measure of the attractiveness of equity, and as a timing device for allocating funds between equity and bonds. The Fed model assumes a fixed relationship between the bond yield and the earnings yield, and this relationship is often taken for granted in market valuation. In this paper we test the Fed model from a historical perspective on the European markets; the markets of the United States are also included for comparison. The purpose of the tests is to determine whether the Fed model and its underlying assumptions hold on different markets. The various tests are made on time-series data ranging from the year 1973 to the end of the year 2008. The statistical methods used are regression analysis, cointegration analysis and Granger causality. The empirical results do not give strong support for the Fed model. The underlying relationships assumed by the Fed model are statistically not valid in most of the markets examined, and therefore the model is generally not valid for valuation purposes. The results vary between the different markets, which gives reason to question the general use of the Fed model in different market conditions and on different markets.
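The Granger-causality tests mentioned above ask whether lagged values of one series improve the prediction of another beyond its own lags, compared via an F-statistic on restricted vs. unrestricted regressions. A minimal one-lag sketch on synthetic data (an assumption, not the paper's yield series):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustrative series: x "Granger-causes" y here by
# construction, because lagged x enters the equation for y.
n = 500
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

# One-lag Granger test: restricted model y_t ~ y_{t-1} versus
# unrestricted model y_t ~ y_{t-1} + x_{t-1}, compared by F-statistic.
Y = y[1:]
X_r = np.column_stack([np.ones(n - 1), y[:-1]])          # restricted
X_u = np.column_stack([np.ones(n - 1), y[:-1], x[:-1]])  # unrestricted

def rss(X, Y):
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return resid @ resid

rss_r, rss_u = rss(X_r, Y), rss(X_u, Y)
df = len(Y) - X_u.shape[1]
F = (rss_r - rss_u) / (rss_u / df)  # one restriction
print(f"F = {F:.1f}")               # large F: lagged x helps predict y
```

For the Fed model, the series would be the bond yield and the earnings yield, and a small F in both directions would undermine the assumed fixed relationship.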
Abstract:
In this Thesis the interaction of an electromagnetic field and matter is studied from various aspects in the general framework of cold atoms. Our subjects cover a wide spectrum of phenomena ranging from semiclassical few-level models to fully quantum mechanical interaction with structured reservoirs leading to non-Markovian open quantum system dynamics. Within closed quantum systems, we propose a selective method to manipulate the motional state of atoms in a time-dependent double-well potential and interpret the method in terms of adiabatic processes. Also, we derive a simple wave-packet model, based on distributions of generalized eigenstates, explaining the finite visibility of interference in overlapping continuous-wave atom lasers. In the context of open quantum systems, we develop an unraveling of non-Markovian dynamics in terms of piecewise deterministic quantum jump processes confined in the Hilbert space of the reduced system - the non-Markovian quantum jump method. As examples, we apply it for simple 2- and 3-level systems interacting with a structured reservoir. Also, in the context of ion-cavity QED we study the entanglement generation based on collective Dicke modes in experimentally realistic conditions including photonic losses and an atomic spontaneous decay.
Abstract:
The goal of this research was to provide an overall view of VIX® and of how it can be used as a stock market indicator. The volatility index, often referred to as the fear index, measures how much it costs an investor to protect an S&P 500 position from fluctuations with options. Over the relatively short history of VIX it has been a successful timing indicator, and it has provided incremental information about the state of the market, adding its own psychological view of the amount of fear and greed. Correctly utilized, VIX information gives a considerable advantage in timing market actions. In this paper we test how VIX works as a leading indicator of a broad stock market index such as the S&P 500 (SPX). The purpose of this paper is to find a working way to interpret VIX. The various tests are made on time-series data ranging from the year 1990 to the year 2010. The 10-day simple moving average strategy gave significant profits over the whole period for which VIX data are available. The strategy was able to capture the increases of SPX in the example portfolio value and to step aside when SPX was declining. At the times when the portfolio was out of SPX, it was kept in a safe asset such as Treasury bills, earning an annual yield of 3 percent. On the other hand, static VIX levels alone did not work as profitable indicators.
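A 10-day moving-average rule of the kind described above switches between equity and a safe asset depending on where VIX sits relative to its own recent average. A sketch on synthetic data (an assumption: the signal direction and all numbers are illustrative, not the thesis specification):

```python
import numpy as np

# Illustrative sketch of a 10-day moving-average switching rule
# (assumption: synthetic prices, not the thesis data). Hold SPX while
# VIX is below its 10-day SMA, otherwise hold T-bills at 3 % per year.
rng = np.random.default_rng(1)
n = 1000
vix = 20 + np.cumsum(rng.normal(0, 0.5, n))    # synthetic VIX level
spx_ret = rng.normal(0.0003, 0.01, n)          # synthetic daily SPX returns
tbill_ret = 0.03 / 252                         # daily T-bill yield

sma10 = np.convolve(vix, np.ones(10) / 10, mode="valid")  # 10-day SMA
in_equity = vix[9:] < sma10                               # daily signal

# Apply yesterday's signal to today's return (no look-ahead)
strat_ret = np.where(in_equity[:-1], spx_ret[10:], tbill_ret)
growth = np.prod(1 + strat_ret)
print(f"terminal wealth per $1: {growth:.2f}")
```

The one-day lag between signal and position is the detail that keeps such a backtest honest; evaluating the signal and the return on the same day would inject look-ahead bias.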
Abstract:
The purpose of this academic economic geographical dissertation is to study and describe how competitiveness in the Finnish paper industry developed during 2001–2008. During these years, the Finnish paper industry faced economically challenging times. This dissertation attempts to fill the existing gap between theoretical and empirical discussions concerning economic geographical issues in the paper industry. The main research questions are: How did the supply chain costs and margins develop during 2001–2008? How do sales prices, transportation, and fixed and variable costs correlate with gross margins in a spatial context? The research object for this case study is a typical large Finnish paper mill that exports over 90 % of its production. The longitudinal economic research data were obtained from the case mill's economic control system, and correlation (R²) analysis was used as the main research method. The time-series data cover monthly economic and manufacturing observations from the mill from 2001 to 2008. The study reveals the development of prices, costs and transportation at the case mill, and it shows how economic variables correlate with the paper mill's gross margins in various markets in Europe. The research methods of economic geography offer perspectives that pay attention to spatial (market) heterogeneity. This type of research has been quite scarce in the research tradition of Finnish economic geography and supply chain management. This case study gives new insight into that research tradition and its applications. As a concrete empirical result, this dissertation states that the competitive advantages of the Finnish paper industry were significantly weakened during 2001–2008 by low paper prices, costly manufacturing and expensive transportation. Statistical analysis exposes that, in several important markets, transport costs lower gross margins as much as decreasing paper prices do, which was a new finding. Paper companies should continuously pay attention to lowering manufacturing and transport costs to achieve more profitable economic performance. A mill located far from its markets clearly suffers an economic impact on paper manufacturing, as paper demand is decreasing and oversupply is pressuring paper prices down. Therefore, market and economic forecasting in the paper industry is advantageous at the country and product levels while simultaneously taking into account the specific economic geographical dimensions.
Abstract:
Thermal discomfort inside facilities is one of the factors responsible for the low productivity of caprines in the Brazilian Northeast region, because inadequate weather conditions can cause elevated rectal temperature, increased respiratory rate, decreased food ingestion and reduced production. The present paper aimed to study the physiological thermoregulation behavior of the animals (respiratory rate, RR, and rectal temperature, RT) at four different times of the day (8 a.m., 11 a.m., 2 p.m. and 5 p.m.) and its relation to bioclimatic indexes (Temperature Humidity Index, THI; Black Globe Humidity Index, BGHI; and Radiant Heat Load, RHL), in order to determine whether the type of covering used in the animals' facilities (ceramic covering, CC; asbestos cement covering, AC; and straw covering, SC) interferes with the physiology of thermoregulation. The time of data collection was related to the values of the environmental and physiological variables. The highest values of Radiant Heat Load were found at 2 p.m. for all three types of covering. The values of RT and RR were higher at 11 a.m. and 2 p.m., and the straw tile provided better microclimate thermal conditions for the animals. The increased RR maintained the caprines' homeothermy.
Abstract:
Monitoring the presence of toxic components in natural watercourses is necessary for human well-being. Since the level of pollutants in natural ecosystems should be kept as low as possible, there is a constant search for chemical analysis methods with ever lower detection limits. Today, environmental analyses are performed with expensive and sophisticated instrumentation that requires a great deal of maintenance. Ion-selective electrodes have several good properties, such as portability and low energy consumption, and they are, moreover, relatively cost-effective. Using ion-selective electrodes for environmental analysis is possible if their sensitivity range can be extended by lowering their detection limits. To lower the detection limit of Pb(II)-selective electrodes, different types of ion-selective membranes based on polyacrylate copolymers, PVC and PbS/Ag2S were investigated. Solid-state electrodes with PbS/Ag2S membranes are generally simpler and more robust than conventional electrodes for trace analysis of ionic pollutants. In this work, the detection limit of the solid-state electrodes was lowered with a newly developed galvanostatic polarization method, and they could then be used successfully for quantitative determination of lead(II) levels in environmental samples collected in the Finnish archipelago near former industrial sites. The analytical results obtained with the ion-selective electrodes were confirmed by other analytical methods. Lowering the detection limit by means of the newly developed polarization method enables the determination of low and ultra-low lead levels that could not be reached with classical potentiometry. The real advantage of using these lead-selective electrodes is the possibility of performing measurements in untreated environmental samples despite the presence of solid particles, which is not possible with other analytical methods. I expect the newly developed polarization method to set a trend in trace analysis with ion-selective electrodes.