49 results for Processing Technologies


Relevance: 20.00%

Publisher:

Abstract:

The study examines the diffusion of a web-based teaching innovation into basic and upper secondary school geography during 1998–2004. The work applied a model of the spread of teaching innovations and the theory of the diffusion of innovations. The data were collected over seven years with questionnaires from pioneer teachers of web-based geography teaching, who returned 326 forms. The main research questions were: 1) What prerequisites do pioneer teachers have for using web-based teaching in school geography? 2) Which applications do pioneer teachers use in web-based geography teaching, and in what ways? 3) What experiences have pioneer teachers gained from web-based geography teaching? The study found that an insufficient number of computers, and their absence from the subject classroom, hindered web-based geography teaching. The work developed a cube model of teachers' digital media skills, comprising technical skills, information processing skills and communication skills. Three user types of web-based teaching were distinguished among the teachers: information-oriented light users, communication-oriented basic users and collaboration-oriented power users. Web-based teaching involved intense positive and negative experiences. It brought joy and motivation to studying. It was regarded as an enriching addition that teachers wanted to integrate into instruction in a controlled way. The pioneer teachers took up information available in networks and applied standard productivity software. They got started with virtual worlds imitating reality: the globe as reproduced by satellite images, digital maps and simulations. The teachers experimented with the social spaces of the web through real-time communication, discussion groups and groupware. Virtual worlds based on imagination received little attention, as the teachers hardly played entertainment games. They adopted occasional pieces of the virtual worlds according to the hardware and software available to them.
During the study, the conquest of virtual worlds progressed from the exploitation of digital information to communication applications and to beginning collaboration. In this way the teachers expanded their virtual territory to become dynamic actors in information networks, and gained new means of satisfying the universal human need for connection with others. At the same time, the teachers were empowered from consumers of information into its producers, from objects into subjects. Web-based teaching opens up considerable opportunities for school geography. With mobile devices, information can be collected and stored in field conditions, and with software it can be converted from one form to another. The authentic and up-to-date materials of the Internet bring concreteness and interest to studying, while models, simulations and geographic information visualize phenomena. Communication and collaboration tools, together with social information spaces, strengthen cooperation. Keywords: web-based teaching, internet, virtual worlds, geography, innovations

Relevance: 20.00%

Publisher:

Abstract:

This thesis reports on investigations into the influence of heat treatment on the manufacturing of oat flakes. Sources of variation in oat flake quality are reviewed, covering the whole chain from the farm to the consumer. The most important quality parameters of oat flakes are the absence of lipid hydrolysing enzymes, specific weight, thickness, breakage (fines), and water absorption. Flavour, colour and pasting properties are also important, but were not included in the experimental part of this study. Of particular interest was the role of heat processing. The first possible heat treatment may already occur during grain drying, which in Finland generally takes place on the farm. At the mill, oats are often kilned to stabilise the product by inactivating lipid hydrolysing enzymes. Almost invariably, steaming is used during flaking to soften the groats and reduce flake breakage. This thesis presents the use of a material-science approach to investigating a complex system, typical of food processes. A combination of fundamental and empirical rheological measurements was used together with a laboratory-scale process to simulate industrial processing. The results were verified by means of industrial trials. Industrially produced flakes at three thickness levels (nominally 0.75, 0.85 and 0.90 mm) were produced from kilned and unkilned oat groats, and the flake strength was measured at different moisture contents. Kilning was not found to significantly affect the force required to puncture a flake with a 2 mm cylindrical probe, which was taken as a measure of flake strength. To further investigate how heat processing contributes to flake quality, dynamic mechanical analysis was used to characterise the effect of heat on the mechanical properties of oats. A marked stiffening of the groat, an increase of up to about 50% in storage modulus, was observed during first heating at around 36–57°C.
This was also observed in tablets prepared from ground groats and extracted oat starch. This stiffening was thus attributed to increased adhesion between starch granules. Groats were steamed in a laboratory steamer and were tempered in an oven at 80–110°C for 30–90 min. The maximum force required to compress the steamed groats to 50% strain increased from 50.7 N to 57.5 N as the tempering temperature was increased from 80 to 110°C. Tempering conditions also affected water absorption. A significantly higher moisture content was observed for kilned (18.9%) compared to unkilned (17.1%) groats, but kilning otherwise had no effect on groat height, maximum force or final force after a 5 s relaxation time. Flakes were produced from the tempered groats with a laboratory flaking machine, using a roll gap of 0.4 mm. Apart from specific weight, flake properties were not influenced by kilning. Tempering conditions, however, had significant effects on the specific weight, thickness and water absorption of the flakes, as well as on the amount of fine material (<2 mm) produced during flaking. Flake strength correlated significantly with groat strength and flake thickness. Trial flaking at a commercial mill confirmed that groat temperature after tempering influenced water absorption. Variation in flake strength was observed, but at the groat temperatures required to inactivate lipase, it was rather small. Cold flaking of groats resulted in soft, floury flakes. The results presented in this thesis suggest that heating increased the adhesion between starch granules. This resulted in an increase in the stiffness and brittleness of the groat. Brittle fracture, rather than plastic flow, during flaking could result in flaws and cracks in the flake. These would be expected to increase water absorption. This was indeed observed as tempering temperature increased. Industrial trials, conducted with different groat temperatures, confirmed the main findings of the laboratory experiments.
The approach used in the present study allowed the systematic study of the effect of interacting process parameters on product quality. There have been few scientific studies of oat processing, and these results can be used to understand the complex effects of process variables on flake quality. They also offer an insight into what happens as the oat groat is deformed into a flake.

Relevance: 20.00%

Publisher:

Abstract:

Milk microfiltration (0.05–0.2 µm) is a membrane separation technique which divides milk components into casein-enriched and native whey fractions. Hitherto, the effect of intensive microfiltration, including a diafiltration step, on both cheese and whey processing had not been studied. The microfiltration performance of skimmed milk was studied with polymeric and ceramic MF membranes. The changes in cheese milk quality and ripening caused by decreased lactose, whey protein and ash content were examined, as were the effects of cheese milk modification by microfiltration on the milk coagulation properties, cheese recovery yield, cheese composition, ripening and sensory quality, as well as on the whey recovery yield and composition. The functional properties of whey protein concentrate from native whey were studied, and the detailed compositions of whey protein concentrate powders made from cheese wheys after cheese milk pretreatments such as high-temperature heat treatment (HH), microfiltration (MF) and ultrafiltration (UF) were compared. The studied polymeric spiral-wound microfiltration membranes had 38.5% lower energy consumption, 30.1% higher retention of whey proteins in the milk retentate and 81.9% lower permeate flux values compared to ceramic membranes. All the studied microfiltration membranes were able to separate the main whey proteins from skimmed milk. The optimal lactose content of Emmental cheese milk exceeded 3.2%, and reduction of the whey protein and ash content of cheese milk at high concentration factor (CF) values increased the rate of cheese ripening. Reduction of the whey protein content in cheese milk increased the concentration of caseinomacropeptide (CMP) in the total proteins of cheese whey. Reduction of the milk whey protein, lactose and ash content reduced the milk rennet clotting time and increased the firmness of the coagulum. Cheese yield calculated from raw milk to cheese was lower with microfiltrated milks due to native whey production.
Amounts of α-lactalbumin (α-LA) and β-lactoglobulin (β-LG) were significantly higher in the reference whey, indicating that the HH, MF and UF milk pretreatments decrease the amounts of these valuable whey proteins in whey. Even low CF values in milk microfiltration (CF 1.4) reduced the nutritional value of cheese whey. From the point of view of the utilization of milk components, it would be beneficial if the amount of native whey and the CMP content of cheese whey could be maximized. Whey protein concentrate powders made of native whey had excellent functional properties, and their detailed amino acid composition differed from that of cheese whey protein concentrate powders.
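The concentration factor mentioned above (e.g. CF 1.4) is the ratio of feed volume to retentate volume. Below is a minimal sketch of the resulting volume-based mass balance, under the simplifying assumptions that casein is fully rejected by the membrane and lactose permeates freely; the compositional values are illustrative, not data from the thesis.

```python
def concentrate(feed_volume_l, cf, casein_pct, lactose_pct):
    """Volume-based concentration factor: CF = feed volume / retentate volume.

    Assumes casein is fully rejected by the membrane (retention = 1), so it
    concentrates by CF, while lactose permeates freely (retention = 0), so
    its concentration in the retentate is unchanged.
    """
    retentate_volume = feed_volume_l / cf
    return {
        "retentate_volume_l": retentate_volume,
        "casein_pct": casein_pct * cf,   # fully retained -> concentrated
        "lactose_pct": lactose_pct,      # freely permeating -> unchanged
    }

# Skimmed milk at the low CF mentioned in the text (CF 1.4);
# the 2.6% casein and 4.8% lactose figures are typical illustrative values:
r = concentrate(feed_volume_l=100.0, cf=1.4, casein_pct=2.6, lactose_pct=4.8)
print(r)
```

In a real process, whey proteins have intermediate retentions, and diafiltration water shifts the balance further, so this two-component sketch only shows the direction of the effect.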

Relevance: 20.00%

Publisher:

Abstract:

The average daily intake of folate, one of the B vitamins, falls below recommendations among the Finnish population. Bread and cereals are the main sources of folate, rye being the most significant single source. Processing is a prerequisite for the consumption of whole grain rye; however, little is known about the effect of processing on folates. Moreover, data on the bioavailability of endogenous cereal folates are scarce. The aim of this study was to examine the variation in folate content in rye, as well as the effects of fermentation, germination, and thermal processing on it. The bioavailability of endogenous rye folates was investigated in a four-week human intervention study. One of the objectives throughout the work was to optimise and evaluate analytical methods for determining folate contents in cereals. Affinity chromatographic purification followed by high-performance liquid chromatography (HPLC) was a suitable method for analysing cereal products for folate vitamers, and microbiological assay with Lactobacillus rhamnosus reliably quantified the total folate. However, HPLC gave approximately 30% lower results than the microbiological assay. The folate content of rye was high and could be further increased by targeted processing. The vitamer distribution of whole grain rye was characterised by a large proportion of formylated vitamers followed by 5-methyltetrahydrofolate. In sourdough fermentation of rye, the studied yeasts synthesized folate, whereas lactic acid bacteria mainly depleted it. Two endogenous bacteria isolated from rye flour were found to produce folate during fermentation. Inclusion of baker's yeast in sourdough fermentation raised the folate level so that the bread could contain more folate than the flour it was made of. Germination markedly increased the folate content of rye, with particularly high folate concentrations in hypocotylar roots.
Thermal treatments caused significant folate losses, but preceding germination compensated well for them. In the bioavailability study, moderate amounts of endogenous folates, in the form of different rye products and orange juice incorporated in the diet, improved folate status among healthy adults. Endogenous folates from rye and orange juice showed similar bioavailability to folic acid from fortified white bread. In brief, it was shown that the folate content of rye can be enhanced manyfold by optimising and combining food processing techniques. This offers some practical means to increase the daily intake of folate in a bioavailable form.

Relevance: 20.00%

Publisher:

Abstract:

The Taita Hills in southeastern Kenya form the northernmost part of Africa’s Eastern Arc Mountains, which have been identified by Conservation International as one of the top ten biodiversity hotspots on Earth. As with many areas of the developing world, over recent decades the Taita Hills have experienced significant population growth leading to associated major changes in land use and land cover (LULC), as well as escalating land degradation, particularly soil erosion. Multi-temporal medium resolution multispectral optical satellite data, such as imagery from the SPOT HRV, HRVIR, and HRG sensors, provides a valuable source of information for environmental monitoring and modelling at a landscape level at local and regional scales. However, utilization of multi-temporal SPOT data in quantitative remote sensing studies requires the removal of atmospheric effects and the derivation of surface reflectance factor. Furthermore, for areas of rugged terrain, such as the Taita Hills, topographic correction is necessary to derive comparable reflectance throughout a SPOT scene. Reliable monitoring of LULC change over time and modelling of land degradation and human population distribution and abundance are of crucial importance to sustainable development, natural resource management, biodiversity conservation, and understanding and mitigating climate change and its impacts. The main purpose of this thesis was to develop and validate enhanced processing of SPOT satellite imagery for use in environmental monitoring and modelling at a landscape level, in regions of the developing world with limited ancillary data availability. 
The Taita Hills formed the application study site, whilst the Helsinki metropolitan region was used as a control site for validation and assessment of the applied atmospheric correction techniques, where multiangular reflectance field measurements were taken and where horizontal visibility meteorological data concurrent with image acquisition were available. The proposed historical empirical line method (HELM) for absolute atmospheric correction was found to be the only applied technique that could derive surface reflectance factor within an RMSE of < 0.02 ρs in the SPOT visible and near-infrared bands; an accuracy level identified as a benchmark for successful atmospheric correction. A multi-scale segmentation/object relationship modelling (MSS/ORM) approach was applied to map LULC in the Taita Hills from the multi-temporal SPOT imagery. This object-based procedure was shown to yield significant improvements over a uni-scale maximum-likelihood technique. The derived LULC data was used in combination with low-cost GIS geospatial layers describing elevation, rainfall and soil type, to model degradation in the Taita Hills in the form of potential soil loss, utilizing the simple universal soil loss equation (USLE). Furthermore, human population distribution and abundance were modelled with satisfactory results using only SPOT- and GIS-derived data and non-Gaussian predictive modelling techniques. The SPOT-derived LULC data was found to be unnecessary as a predictor because the first- and second-order image texture measurements had greater power to explain variation in dwelling unit occurrence and abundance. The ability of the procedures to be implemented locally in the developing world using low-cost or freely available data and software was considered. The techniques discussed in this thesis are considered equally applicable to other medium- and high-resolution optical satellite imagery, as well as to the utilized SPOT data.
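The soil-loss modelling mentioned above rests on the USLE, which is simply a product of empirical factors evaluated per pixel. A minimal sketch follows; the factor values are hypothetical illustrations, not the thesis's calibration for the Taita Hills.

```python
def usle_soil_loss(r, k, ls, c, p):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    A  : mean annual soil loss (t / ha / yr)
    r  : rainfall erosivity factor (R)
    k  : soil erodibility factor (K)
    ls : combined slope length and steepness factor (LS)
    c  : cover management factor (C, 0..1)
    p  : support practice factor (P, 0..1)
    """
    return r * k * ls * c * p

# Hypothetical per-pixel inputs such as might be derived from the rainfall,
# soil-type and elevation GIS layers described above:
print(usle_soil_loss(r=3500.0, k=0.25, ls=1.8, c=0.3, p=1.0))
```

In a raster GIS workflow, the same multiplication is applied cell by cell, with LS computed from the elevation model and C from the LULC classification.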

Relevance: 20.00%

Publisher:

Abstract:

In recent years, XML has been widely adopted as a universal format for structured data. A variety of XML-based systems have emerged, most prominently SOAP for Web services, XMPP for instant messaging, and RSS and Atom for content syndication. This popularity is helped by the excellent support for XML processing in many programming languages and by the variety of XML-based technologies for the more complex needs of applications. Concurrently with this rise of XML, there has also been a qualitative expansion of the Internet's scope. Namely, mobile devices are becoming capable enough to be full-fledged members of various distributed systems. Such devices are battery-powered, their network connections are based on wireless technologies, and their processing capabilities are typically much lower than those of stationary computers. This dissertation presents work performed to reconcile these two developments. XML, as a highly redundant text-based format, is not obviously suitable for mobile devices that need to avoid extraneous processing and communication. Furthermore, the protocols and systems commonly used in XML messaging are often designed for fixed networks and may make assumptions that do not hold in wireless environments. This work identifies four areas of improvement in XML messaging systems: the programming interface to the messaging system itself, the programming interface to XML processing, the serialization format used for the messages, and the protocol used to transmit the messages. We show a complete system that improves the overall performance of XML messaging through consideration of these areas. The work is centered on actually implementing the proposals in a form usable on real mobile devices. The experimentation is performed on actual devices and real networks using the messaging system implemented as a part of this work.
The experimentation is extensive and, because several different devices were used, also provides a glimpse of what the performance of these systems may look like in the future.
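One of the identified areas, the XML-processing interface, matters first on memory-constrained devices: a streaming (event-at-a-time) parser lets an application process and discard elements instead of building the whole document tree. A minimal sketch with Python's standard library follows; the sample message is invented, and the dissertation's own API differs from this.

```python
import io
import xml.etree.ElementTree as ET

# A small invented message, stood in for a real XMPP/SOAP payload:
message = b"<msg><to>alice</to><body>hello</body></msg>"

# iterparse yields (event, element) pairs incrementally, so a handler can
# extract what it needs and release each element to bound memory use.
fields = {}
for event, elem in ET.iterparse(io.BytesIO(message), events=("end",)):
    if elem.tag in ("to", "body"):
        fields[elem.tag] = elem.text
        elem.clear()  # drop the element's content once processed

print(fields)
```

The same pull-style discipline is what makes XML handling feasible when the document is larger than the memory a mobile application can afford to spend on it.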

Relevance: 20.00%

Publisher:

Abstract:

Place identification refers to the process of analyzing sensor data in order to detect places, i.e., spatial areas that are linked with activities and associated with meanings. Place information can be used, e.g., to provide awareness cues in applications that support social interactions, to provide personalized and location-sensitive information to the user, and to support mobile user studies by providing cues about the situations the study participant has encountered. Regularities in human movement patterns make it possible to detect personally meaningful places by analyzing the location traces of a user. This thesis focuses on providing system-level support for place identification, as well as on algorithmic issues related to the place identification process. The move from location to place requires interactions between location sensing technologies (e.g., GPS or GSM positioning), algorithms that identify places from location data, and applications and services that utilize place information. These interactions can be facilitated using a mobile platform, i.e., an application or framework that runs on a mobile phone. For the purposes of this thesis, mobile platforms automate data capture and processing and provide means for disseminating data to applications and other system components. The first contribution of the thesis is BeTelGeuse, a freely available, open-source mobile platform that supports multiple runtime environments. The actual place identification process can be understood as a data analysis task where the goal is to analyze (location) measurements and to identify areas that are meaningful to the user. The second contribution of the thesis is the Dirichlet Process Clustering (DPCluster) algorithm, a novel place identification algorithm. The performance of the DPCluster algorithm is evaluated using twelve different datasets that have been collected by different users, at different locations, and over different periods of time.
As part of the evaluation we compare the DPCluster algorithm against other state-of-the-art place identification algorithms. The results indicate that the DPCluster algorithm provides improved generalization performance against spatial and temporal variations in location measurements.
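The place-identification task itself can be illustrated with a far simpler baseline than the DPCluster algorithm: group consecutive location fixes that stay within a radius of their running centroid, and keep clusters with enough points as candidate places. This sketch is hypothetical (planar coordinates in metres, invented thresholds) and is not the thesis's algorithm.

```python
import math

def naive_places(trace, radius_m, min_points):
    """Group consecutive (x, y) fixes, in metres, into candidate places.

    A new fix joins the current cluster if it lies within radius_m of the
    cluster centroid; clusters with at least min_points fixes are reported
    as place centroids. A simple baseline, not DPCluster.
    """
    places, cluster = [], []
    for x, y in trace:
        if cluster:
            cx = sum(p[0] for p in cluster) / len(cluster)
            cy = sum(p[1] for p in cluster) / len(cluster)
            if math.hypot(x - cx, y - cy) > radius_m:
                if len(cluster) >= min_points:
                    places.append((cx, cy))
                cluster = []
        cluster.append((x, y))
    if len(cluster) >= min_points:
        cx = sum(p[0] for p in cluster) / len(cluster)
        cy = sum(p[1] for p in cluster) / len(cluster)
        places.append((cx, cy))
    return places

# A toy trace: several fixes near (0, 0), a transit point, then fixes near (500, 500).
trace = [(0, 0), (2, 1), (1, 2), (250, 250), (500, 500), (501, 499), (499, 501)]
print(naive_places(trace, radius_m=50.0, min_points=3))
```

Thresholds like these are exactly what nonparametric approaches such as Dirichlet process clustering avoid having to fix in advance, which is one motivation for the DPCluster work.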

Relevance: 20.00%

Publisher:

Abstract:

The paradigm of computational vision hypothesizes that any visual function, such as the recognition of your grandparent, can be replicated by computational processing of the visual input. What are these computations that the brain performs? What should or could they be? Working on the latter question, this dissertation takes the statistical approach, in which one attempts to learn the suitable computations from the natural visual data itself. In particular, we empirically study the computational processing that emerges from the statistical properties of the visual world and from the constraints and objectives specified for the learning process. This thesis consists of an introduction and 7 peer-reviewed publications, where the purpose of the introduction is to illustrate the area of study to a reader who is not familiar with computational vision research. In the scope of the introduction, we briefly overview the primary challenges to visual processing, as well as recall some of the current opinions on visual processing in the early visual systems of animals. Next, we describe the methodology we have used in our research, and discuss the presented results. We have included in this discussion some additional remarks, speculations and conclusions that were not featured in the original publications. We present the following results in the publications of this thesis. First, we empirically demonstrate that luminance and contrast are strongly dependent in natural images, contradicting previous theories suggesting that luminance and contrast were processed separately in natural systems due to their independence in the visual data. Second, we show that simple-cell-like receptive fields of the primary visual cortex can be learned in the nonlinear contrast domain by maximization of independence.
Further, we provide first-time reports of the emergence of conjunctive (corner-detecting) and subtractive (opponent-orientation) processing due to nonlinear projection pursuit with simple objective functions related to sparseness and response energy optimization. Then, we show that attempting to extract independent components of nonlinear histogram statistics of a biologically plausible representation leads to projection directions that appear to differentiate between visual contexts. Such processing might be applicable for priming, i.e., the selection and tuning of later visual processing. We continue by showing that a different kind of thresholded low-frequency priming can be learned and used to make object detection faster with little loss in accuracy. Finally, we show that in a computational object detection setting, nonlinearly gain-controlled visual features of medium complexity can be acquired sequentially as images are encountered and discarded. We present two online algorithms to perform this feature selection, and propose the idea that for artificial systems, some processing mechanisms could be selectable from the environment without optimizing the mechanisms themselves. In summary, this thesis explores learning visual processing on several levels. The learning can be understood as an interplay of input data, model structures, learning objectives, and estimation algorithms. The presented work adds to the growing body of evidence showing that statistical methods can be used to acquire intuitively meaningful visual processing mechanisms. The work also presents some predictions and ideas regarding biological visual processing.
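The first result, the dependence of luminance and contrast, can be illustrated on synthetic data: take local luminance to be a patch's mean and local contrast its standard deviation, then correlate the two across patches. The sketch below uses an assumed multiplicative patch model, not natural-image data, so it only demonstrates the kind of dependence being measured.

```python
import math
import random

random.seed(0)

def patch_stats(patch):
    """Local luminance = patch mean; local contrast = patch standard deviation."""
    mean = sum(patch) / len(patch)
    var = sum((v - mean) ** 2 for v in patch) / len(patch)
    return mean, math.sqrt(var)

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic 64-pixel patches under a multiplicative model, in which brighter
# regions also fluctuate more (invented parameters, not natural-image data):
patches = []
for _ in range(500):
    luminance = random.uniform(0.1, 1.0)
    patches.append([luminance * (1 + random.gauss(0, 0.2)) for _ in range(64)])

lums, cons = zip(*(patch_stats(p) for p in patches))
c = correlation(lums, cons)
print(round(c, 2))  # strongly positive under this multiplicative model
```

Under an additive model (constant-variance noise added to a varying mean) the same measurement would come out near zero, which is the contrast the thesis's empirical finding draws on.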

Relevance: 20.00%

Publisher:

Abstract:

The study examines various uses of computer technology in the acquisition of information by visually impaired people. For this study, 29 visually impaired persons took part in a survey about their experiences concerning the acquisition of information and the use of computers, especially with a screen magnification program, a speech synthesizer and a braille display. According to the responses, the evolution of computer technology offers an important possibility for visually impaired people to cope with everyday activities and interact with the environment. Nevertheless, the functionality of assistive technology needs further development to become more usable and versatile. Since the challenges of independent observation of the environment were emphasized in the survey, the study led to the development of a portable text vision system called Tekstinäkö. Contrary to typical stand-alone applications, the Tekstinäkö system was constructed by combining devices and programs that are readily available on the consumer market. As the system operates, pictures are taken by a digital camera and instantly transmitted to a text recognition program on a laptop computer, which reads the text aloud using a speech synthesizer. Visually impaired test users described that even unsure interpretations of the texts in the environment given by the Tekstinäkö system are at least a welcome addition to a complete perception of the environment. It became clear that even with modest development work it is possible to bring new, useful and valuable methods into the everyday life of disabled people. The unconventional production process of the system also proved efficient. The achieved results and the proposed working model offer one suggestion for giving enough attention to the easily overlooked needs of people with special abilities.
ACM Computing Classification System (1998): K.4.2 Social Issues: Assistive technologies for persons with disabilities; I.4.9 Image Processing and Computer Vision: Applications.
Keywords: visually impaired, computer-assisted, information, acquisition, assistive technology, computer, screen magnification program, speech synthesizer, braille display, survey, testing, text recognition, camera, text, perception, picture, environment, transportation, guidance, independence, vision, disabled, blind, speech, synthesizer, braille, software engineering, programming, program, system, freeware, shareware, open source, Tekstinäkö, text vision, TopOCR, Autohotkey, computer engineering, computer science

Relevance: 20.00%

Publisher:

Abstract:

In recent years, XML has been accepted as the format of messages for several applications. Prominent examples include SOAP for Web services, XMPP for instant messaging, and RSS and Atom for content syndication. This XML usage is understandable, as the format itself is a well-accepted standard for structured data, and it has excellent support in many popular programming languages, so inventing an application-specific format no longer seems worth the effort. Simultaneously with XML's rise to prominence, there has been an upsurge in the number and capabilities of various mobile devices. These devices are connected through various wireless technologies to larger networks, and a goal of current research is to integrate them seamlessly into these networks. These two developments seem to be at odds with each other. XML, as a fully text-based format, takes up more processing power and network bandwidth than binary formats would, whereas the battery-powered nature of mobile devices dictates that energy, both in processing and transmitting, be utilized efficiently. This thesis presents the work we have performed to reconcile these two worlds. We present a message transfer service that we have developed to address what we have identified as the three key issues: XML processing at the application level, a more efficient XML serialization format, and the protocol used to transfer messages. Our presentation includes both a high-level architectural view of the whole message transfer service and detailed descriptions of the three new components. These components consist of an API and an associated data model for XML processing, designed for messaging applications; a binary serialization format for the data model of the API; and a message transfer protocol providing two-way messaging capability with support for client mobility. We also present relevant performance measurements for the service and its components.
As a result of this work, we do not consider XML to be inherently incompatible with mobile devices. As the fixed networking world moves toward XML for interoperable data representation, the wireless world should do the same to provide a better-integrated networking infrastructure. However, the problems raised by XML adoption touch all of the higher layers of application programming, so instead of concentrating simply on the serialization format, we conclude that improvements need to be made in an integrated fashion in all of these layers.