3 results for miniature

in Helda - Digital Repository of the University of Helsinki


Relevance:

10.00%

Publisher:

Abstract:

The mobile phone has, as a device, taken the world by storm in the past decade; from only 136 million phones globally in 1996, it is now estimated that by the end of 2008 roughly half of the world's population will own a mobile phone. Over the years, the capabilities of the phones as well as of the networks have increased tremendously, reaching the point where the devices are better described as miniature computers than as simple mobile phones. The mobile industry is currently undertaking several initiatives to develop new generations of mobile network technologies, technologies that to a large extent focus on offering ever-increasing data rates. This thesis seeks to answer the question of whether the future mobile networks under development and the future mobile services are in sync: taking a forward-looking timeframe of five to eight years, will there be services that need the high-performance new networks being planned? The question is especially pertinent in light of the slower-than-expected takeoff of 3G data services. Current and future mobile services are analyzed from two viewpoints: first, by looking at the gradual, evolutionary development of the services and, second, by seeking to identify potentially revolutionary new mobile services. With information on both current and future mobile networks as well as services, a mapping of network capabilities to service requirements is performed to identify which services will work in which networks. Based on the analysis, it is far from certain whether the new mobile networks, especially those planned for deployment after HSPA, will be needed as soon as they are currently roadmapped. The true service-based demand for the "beyond HSPA" technologies may lie many years in the future - or may indeed never materialize, owing to the increasing deployment of local-area wireless broadband technologies.
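The capability-to-requirement mapping described in the abstract can be pictured as a simple comparison of what each network offers against what each service needs. The sketch below is only an illustration of that idea; the network names, service names, and data rates are assumed examples, not figures taken from the thesis.

```python
# Minimal sketch of a network capability vs. service requirement mapping.
# All numbers and names below are illustrative assumptions.

NETWORKS = {                    # peak downlink data rate, Mbit/s (assumed)
    "3G (R99)": 0.384,
    "HSPA": 14.0,
    "beyond HSPA": 100.0,
}

SERVICES = {                    # required sustained data rate, Mbit/s (assumed)
    "mobile e-mail": 0.1,
    "music streaming": 0.3,
    "standard-definition video": 2.0,
    "high-definition video": 8.0,
}

def supported_services(network_rate_mbps, services):
    """Return the services whose requirement fits within the network's capacity."""
    return [name for name, need in services.items() if need <= network_rate_mbps]

if __name__ == "__main__":
    for network, rate in NETWORKS.items():
        ok = supported_services(rate, SERVICES)
        print(f"{network}: {', '.join(ok) if ok else 'none'}")
```

In this toy mapping, every assumed service already fits within HSPA, which mirrors the abstract's conclusion that demand for "beyond HSPA" capacity is uncertain.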

Relevance:

10.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem. These were published in a memoir in 1774 which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly, which was expressed by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing today, was that the sample should be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and in the beginning of the 20th century carried out several surveys in the UK. He also developed the theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science: he revolutionized the theory of statistics. In addition, he introduced a new statistical inference model which is still the prevailing paradigm. Its essential idea is that samples are drawn repeatedly from the same population and that the population parameters are constants. Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling was a theory of double sampling, which gave the central idea for statisticians at the U.S. Census Bureau to develop the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
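The design-based inference the abstract attributes to Neyman is commonly illustrated by a confidence interval for a finite-population mean under simple random sampling without replacement. The sketch below uses that standard textbook formula with the finite population correction; the data values and population size are made-up examples, not material from the thesis.

```python
# Sketch: 95% confidence interval for a finite-population mean under simple
# random sampling without replacement.  Sample values are assumed.
import math
import statistics

def srs_mean_ci(sample, population_size, z=1.96):
    """Return (estimate, lower, upper) for the population mean."""
    n = len(sample)
    ybar = statistics.mean(sample)
    s2 = statistics.variance(sample)        # sample variance with n-1 divisor
    fpc = 1.0 - n / population_size         # finite population correction
    se = math.sqrt(fpc * s2 / n)            # estimated standard error of ybar
    return ybar, ybar - z * se, ybar + z * se

if __name__ == "__main__":
    sample = [12.1, 9.8, 11.4, 10.7, 13.2, 9.5, 12.8, 10.1]   # assumed data
    print(srs_mean_ci(sample, population_size=500))
```

Note how no distributional assumption is placed on the study variable itself; the randomness comes from the sampling design, which is the distinction the abstract draws between Neyman's model and Fisher's.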

Relevance:

10.00%

Publisher:

Abstract:

The literature review of the thesis examined the baking-technological properties of oat, enzyme-active baking, and the use of rye malt in low-gluten baking. The experimental part studied the effect of an extract prepared from rye malt sourdough on the viscosity of oat dough and on the properties of oat bread. The aim of the work was to develop a rye-flavoured oat bread with good taste and texture. The enzymes of the rye malt were allowed to degrade the rye prolamins harmful to coeliacs during the sourdough process. The extract was separated from the sourdough by centrifugation. Both enzyme-active and heat-inactivated extract were used in the baking trials. The extract replaced dough water at levels of 15, 25 and 30 % (of dough weight). Baking was carried out at miniature scale, as pan baking using 20 g dough pieces. The viscosity of the doughs was measured in order to follow the hydrolysis of beta-glucan. Rye flavour was evaluated by a trained panel. Addition of the acidic extract lowered the dough pH from about 5.8 to about 4.4. Addition of the enzyme-active extract lowered the dough viscosity, whereas the inactivated extract increased it. The bread crumb became denser, and the measured crumb hardness increased with increasing extract addition. The extract addition improved the taste and aroma of the breads. With the extract, the pores of the breads were smaller and more evenly distributed in the bread matrix. When the extract was used in its inactivated form, the crumbliness of the breads increased. With the technology developed in the study, it was possible to produce a good-quality, rye-flavoured oat bread even without inactivating the enzymes of the extract by boiling. This was apparently due to the low pH of the dough, which inhibited alpha-amylase, and to the high gelatinisation temperature of oat starch, so that the enzymes were inactivated during baking before the starch became exposed to excessive degradation. This makes it possible to use the extract as part of the baking process without inactivation. Sourdough fermentation as part of gluten-free baking was found to be a workable combination, as it improved the colour, taste and texture of the bread. The mould-free shelf life of the bread also improved even with a small extract addition. It appears that this technology makes it possible to bring the long-missed rye flavour to the range of low-gluten oat breads. Based on calculation and on earlier results, it was concluded that a prolamin content of 63.5 mg/kg in the bread is achievable, and further development would probably lead to even better results.
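The substitution levels in the abstract are expressed as a share of dough weight, so the corresponding extract amounts for the 20 g miniature dough pieces follow from simple arithmetic. The sketch below only spells out that calculation; the assumption that the reduced dough water equals the added extract mass is an interpretation of the abstract, and the actual recipe is not given there.

```python
# Illustrative arithmetic: extract added per 20 g dough piece at each
# substitution level.  Assumes the replaced dough water equals the extract mass.

DOUGH_PIECE_G = 20.0
REPLACEMENT_LEVELS = (0.15, 0.25, 0.30)    # 15, 25 and 30 % of dough weight

for level in REPLACEMENT_LEVELS:
    extract_g = level * DOUGH_PIECE_G       # extract added to the dough
    water_cut_g = extract_g                 # dough water reduced by the same amount
    print(f"{level:.0%}: {extract_g:.1f} g extract, {water_cut_g:.1f} g less water")
```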