29 results for Bespoke game content


Relevance:

20.00%

Publisher:

Abstract:

Protein-energy malnutrition and mineral deficiencies are two of the three forms of nutritional deficiency that affect most developing countries, owing to inadequate access to food and to diets based on a single crop. Common bean (Phaseolus vulgaris L.) is the staple crop of Nicaragua and has the potential to improve the nutritional status of the nation's poorest groups. Its high content of both protein and non-haem iron provides many nutrients, but inhibitors may also prevent absorption of iron and zinc by the human consumer. A proper production chain must be followed to ensure the best grain quality for the consumer; to achieve food security, both production and high nutritional content must be maintained. Four nationally important accessions of common bean, with different harvesting dates, were subjected to two storage treatments to evaluate the impact of storage conditions on the end quality of the grain. The study lasted six months with sampling every six weeks; one treatment was stored under controlled conditions at 40°C and 75% RH, and the other under in-situ conditions. Proximate and mineral composition were evaluated, as well as tannin and phytate content and mineral bioavailability. Significant differences were found among accessions, most notably in protein, Fe and Zn content, tannins and phytate. Protein values ranged from 21 to 23%. Iron content was 61-81 mg/kg, but only 3-4% was bioavailable. Zinc content was 21-25 mg/kg, of which 10-12% was bioavailable. Phytate concentrations ranged from 8.6 to 9.6 mg/g, while tannin values ranged from 37.7 to 43.8 mg/g. Storage at high temperature was shown to affect certain nutritional compounds and proved detrimental to final grain quality. Soluble sugar and tannin content decreased after six months under both storage conditions; insoluble dietary fibre (IDF) decreased under in-situ conditions and soluble dietary fibre (SDF) under stress conditions.
The iron content and bioavailability of INTA Biofortificado were not as outstanding as expected, so experiments should be conducted to compare its iron uptake and delivery with those of other cultivars.
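The reported ranges allow a back-of-the-envelope estimate of how much mineral content is actually absorbable: total content multiplied by the bioavailable fraction. The sketch below is an illustrative calculation using the figures from the abstract, not part of the study's own analysis.

```python
# Estimate bioavailable mineral content (mg/kg of grain) from the
# reported total content and bioavailable percentage.

def bioavailable(total_mg_per_kg: float, fraction_pct: float) -> float:
    """Return bioavailable content in mg per kg of grain."""
    return total_mg_per_kg * fraction_pct / 100.0

# Iron: 61-81 mg/kg total, 3-4% bioavailable
fe_low = bioavailable(61, 3)    # 1.83 mg/kg
fe_high = bioavailable(81, 4)   # 3.24 mg/kg

# Zinc: 21-25 mg/kg total, 10-12% bioavailable
zn_low = bioavailable(21, 10)   # 2.10 mg/kg
zn_high = bioavailable(25, 12)  # 3.00 mg/kg

print(f"bioavailable Fe: {fe_low:.2f}-{fe_high:.2f} mg/kg")
print(f"bioavailable Zn: {zn_low:.2f}-{zn_high:.2f} mg/kg")
```

The comparison makes the abstract's point concrete: despite a much higher total iron content, the absorbable iron and zinc ranges end up roughly comparable.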

Relevance:

20.00%

Publisher:

Abstract:

Summary: Soil moisture variations in sandy soil during winter

Relevance:

20.00%

Publisher:

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on family members' computers, on several of the user's own computers, and also at online services the user relies on. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database.
Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating data directly among family and friends rather than letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we concluded that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
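The idea of a cryptographically verifiable reference can be sketched in a few lines: derive the reference from a hash of the content itself, so a copy fetched from any replica can be checked against its name. This is a generic content-addressing illustration, not Peerscape's actual naming scheme.

```python
import hashlib

def make_ref(data: bytes) -> str:
    """Derive a self-certifying reference from the content itself."""
    return hashlib.sha256(data).hexdigest()

def verify(ref: str, data: bytes) -> bool:
    """Check that a copy retrieved from any replica matches its reference."""
    return make_ref(data) == ref

album = b"family photo album, page 1"
ref = make_ref(album)

assert verify(ref, album)           # a genuine replica's copy passes
assert not verify(ref, b"forgery")  # a tampered copy is rejected
```

Because the reference is derived from the bytes, the data keeps its identity no matter which machine it is stored on, which is what lets many replicas serve the same logical item safely.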

Relevance:

20.00%

Publisher:

Abstract:

In daily life, rich experiences evolve in every environmental and social interaction. Because experience has a strong impact on how people behave, scholars in different fields are interested in understanding what constitutes an experience. Yet even as interest in conscious experience increases, there is no consensus on how such experience should be studied. Whatever approach is taken, the subjective and psychologically multidimensional nature of experience should be respected. This study endeavours to understand and evaluate conscious experiences. First I introduce a theoretical approach to psychologically based and content-oriented experience. In the experiential cycle presented here, classical psychology and orienting-environmental content are connected. This generic approach is applicable to any human-environment interaction. Here I apply the approach to entertainment virtual environments (VEs) such as digital games and develop a framework with the potential for studying experiences in VEs. The development of the methodological framework drew on subjective and objective data from experiences in the Cave Automatic Virtual Environment (CAVE) and with numerous digital games (N=2,414). The final framework consisted of fifteen factor-analytically formed subcomponents of the sense of presence, involvement and flow. Together, these show the multidimensional experiential profile of VEs. The results present general experiential laws of VEs and show that the interface of a VE is related to (physical) presence, which psychologically means attention, perception and the cognitively evaluated realness and spatiality of the VE. The narrative of the VE elicits (social) presence and involvement and affects emotional outcomes. Psychologically, these outcomes are related to social cognition, motivation and emotion. The mechanics of a VE affect the cognitive evaluations and emotional outcomes related to flow.
In addition, at the very least, user background, prior experience and context of use affect the experiential variation. VEs are part of many people's lives, and many different outcomes are related to them, such as enjoyment, learning and addiction, depending on who is making the evaluation. This makes VEs societally important and psychologically fruitful to study. The approach and framework presented here contribute to our understanding of experiences in general and VEs in particular. The research can provide VE developers with a state-of-the-art method (www.eveqgp.fi) that can be utilized whenever new product and service concepts are designed, prototyped and tested.
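The multidimensional experiential profile described above can be illustrated with a toy aggregation: questionnaire items are grouped under higher-level dimensions and averaged into subscale scores. The item groupings, item names and responses below are invented for illustration; the actual framework uses fifteen factor-analytically derived subcomponents, not this three-way split.

```python
from statistics import mean

# Hypothetical mapping of Likert-scale questionnaire items (1-7)
# onto the three top-level dimensions named in the abstract.
SUBSCALES = {
    "presence":    ["item1", "item2", "item3"],
    "involvement": ["item4", "item5"],
    "flow":        ["item6", "item7", "item8"],
}

def profile(responses: dict) -> dict:
    """Average item responses into one score per experiential dimension."""
    return {dim: mean(responses[item] for item in items)
            for dim, items in SUBSCALES.items()}

answers = {"item1": 6, "item2": 5, "item3": 7,
           "item4": 4, "item5": 5,
           "item6": 6, "item7": 6, "item8": 5}

print(profile(answers))
# presence 6.0, involvement 4.5, flow about 5.67
```

A real instrument would weight items by their factor loadings rather than taking plain means; the sketch only shows how per-item data collapses into a multidimensional profile.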

Relevance:

20.00%

Publisher:

Abstract:

Drug-induced liver injury is one of the most frequent reasons for withdrawing a drug from the market. In recent years there has been pressure to develop more cost-efficient, faster and easier ways to investigate drug-induced toxicity, in order to recognize hepatotoxic drugs in the earlier phases of drug development. A High Content Screening (HCS) instrument is an automated microscope equipped with image analysis software. It makes image analysis faster and decreases the risk of human error by always analyzing the images in the same way. Because less drug and less time are needed for the analysis, and multiple parameters can be analyzed from the same cells, the method should be more sensitive, more effective and cheaper than conventional cytotoxicity assays. Liver cells are rich in mitochondria, and many drugs target their toxicity to hepatocyte mitochondria. Mitochondria produce the majority of the cell's ATP through oxidative phosphorylation; they maintain biochemical homeostasis in the cell and participate in cell death. A mitochondrion is divided into two compartments by the inner and outer mitochondrial membranes, and oxidative phosphorylation takes place in the inner membrane. Cytochrome c, a protein of the respiratory chain, activates caspase cascades when released, which leads to apoptosis. The aim of this study was to implement, optimize and compare mitochondrial toxicity HCS assays in live and fixed cells in two cellular models: the human HepG2 hepatoma cell line and rat primary hepatocytes. Three hepatotoxic and mitochondriotoxic drugs (staurosporine, rotenone and tolcapone) were used. Cells were treated with the drugs, incubated with fluorescent probes, and the images were then analyzed using a Cellomics ArrayScan VTI reader.
Finally, the results obtained with the optimized methods were compared with each other and with the results of conventional cytotoxicity assays, namely ATP and LDH measurements. After optimization, the live-cell method and rat primary hepatocytes were selected for the experiments. Staurosporine was the most toxic of the three drugs and damaged the cells most quickly. Rotenone was less toxic, but its results were more reproducible, so it would serve as a good positive control in screening. Tolcapone was the least toxic. So far, the conventional cytotoxicity analyses worked better than the HCS methods; more optimization is needed to make the HCS method more sensitive, which was not possible in this study due to time constraints.
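The advantage of reading multiple parameters from the same cells can be sketched as a simple aggregation: each imaged cell yields several readouts, the instrument averages them per well, and a well is flagged when a key readout drops below a threshold relative to the vehicle control. The parameter names, values and threshold below are hypothetical illustrations, not the actual Cellomics ArrayScan configuration.

```python
# Toy per-cell readout table: one dict per imaged cell, several
# fluorescence-derived parameters each (values are invented).
cells = [
    {"mito_potential": 0.8, "membrane_perm": 0.1},
    {"mito_potential": 0.4, "membrane_perm": 0.7},
    {"mito_potential": 0.2, "membrane_perm": 0.8},
]

def well_means(cells):
    """Average each parameter over all cells imaged in a well."""
    keys = cells[0].keys()
    return {k: sum(c[k] for c in cells) / len(cells) for k in keys}

def flag_toxic(treated, control, drop=0.5):
    """Flag the well if mitochondrial potential falls below 50% of control."""
    return treated["mito_potential"] < drop * control["mito_potential"]

control = {"mito_potential": 1.0, "membrane_perm": 0.1}
means = well_means(cells)
print(means, flag_toxic(means, control))
```

Because every parameter comes from the same cell population, a drop in one readout can be cross-checked against the others without running a separate assay, which is what makes the multiparametric approach attractive.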

Relevance:

20.00%

Publisher:

Abstract:

This paper challenges the Kripkean interpretation of a posteriori necessities. It will be demonstrated, by an analysis of classic examples, that the modal content of supposed a posteriori necessities is more complicated than the Kripkean line suggests. We will see that further research is needed concerning the a priori principles underlying all a posteriori necessities. In the course of this analysis it will emerge that the modal content of a posteriori necessities can be best described in terms of a Finean conception of modality – by giving essences priority over modality. The upshot of this is that we might be able to establish the necessity of certain supposed a posteriori necessities by a priori means.

Relevance:

20.00%

Publisher:

Abstract:

Powders are essential materials in the pharmaceutical industry, involved in the majority of drug manufacturing. Powder flow and particle size are central particle properties addressed by means of particle engineering. The aim of the thesis was to gain knowledge of powder processing with restricted liquid addition, with a primary focus on particle coating and early granule growth; such processes were also characterised. A thin coating layer of hydroxypropyl methylcellulose was applied to individual particles of ibuprofen in a fluidised-bed top-spray process. The polymeric coating improved the flow properties of the powder. The improvement was strongly related to relative humidity, which can be seen as an indicator of a change in surface hydrophilicity caused by the coating. The ibuprofen used in the present study had a d50 of 40 μm and thus belongs to Geldart group C, powders that are considered challenging materials in top-spray coating processes. Ibuprofen was similarly coated using a novel ultrasound-assisted coating method. The results were in line with those obtained from powders coated in the fluidised-bed process mentioned above, and the ultrasound-assisted method was found capable of coating single particles with a simple and robust setup. Granule growth in a fluidised-bed process was inhibited by feeding the liquid in pulses. The results showed that the length of the pulsing cycles is important and can be used to adjust granule growth. Moreover, pulsed liquid feed was found to have a greater effect on granule growth at high inlet-air relative humidity. Liquid-feed pulsing can thus be used as a tool for particle-size targeting in fluidised-bed processes and for compensating for changes in the relative humidity of the inlet air.
The function of a two-fluid external-mixing pneumatic nozzle, typical of small-scale pharmaceutical fluidised-bed processes, was studied in situ in an ongoing fluidised-bed process with particle tracking velocimetry. It was found that the liquid droplets undergo coalescence as they travel away from the nozzle head. Coalescence was expected to increase droplet speed, which the study confirmed. The spray turbulence was also studied; the results showed turbulence caused by atomisation and by the oppositely directed fluidising air. It was concluded that particle tracking velocimetry is a suitable tool for in-situ spray characterisation. The light transmission through dense particulate systems was found to carry information on particle size and packing density, as expected from the theory of light scattering by solids. It was possible to differentiate binary blends of components with different optical properties. Light transmission showed potential as a rapid, simple and inexpensive tool for characterising particulate systems, giving information on changes in particle systems that could be utilised in basic process diagnostics.
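The dependence of light transmission on particle size and packing density follows from attenuation in a scattering medium. The minimal sketch below uses a Beer-Lambert-style model, T = exp(-mu * L), with an attenuation coefficient that grows with packing fraction and with the specific surface of the particles (roughly 1/diameter). The coefficient form and the constant k are simplifications chosen for illustration, not the model used in the thesis.

```python
import math

def transmission(packing_fraction: float, particle_d_um: float,
                 path_mm: float, k: float = 50.0) -> float:
    """Fraction of light transmitted through a powder bed.

    Attenuation grows with packing density and with the specific
    surface of the particles (~1/diameter); k is an arbitrary
    scattering constant chosen for illustration only.
    """
    mu = k * packing_fraction / particle_d_um  # attenuation per mm
    return math.exp(-mu * path_mm)

# Finer particles (more scattering surface) transmit less light
# than coarse ones at the same packing density and path length.
print(transmission(0.5, 100.0, 2.0))  # coarse powder
print(transmission(0.5, 40.0, 2.0))   # fine powder, lower transmission
```

Even this crude model reproduces the qualitative trend exploited for diagnostics: a shift toward smaller particles or denser packing shows up as a measurable drop in transmitted light.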

Relevance:

20.00%

Publisher:

Abstract:

The EU directive harmonising copyright, Directive 2001/29/EC, has been implemented in all META-NORD countries. The licensing schemes of open content/open source, META-SHARE and CLARIN are discussed briefly. The licensing status of the tools and resources available from the consortium partners is outlined. The aim of the article is to compare a set of open content and open source licenses and to provide some guidance on the optimal use of the licenses provided by META-NET and CLARIN for licensing tools and resources for the benefit of the language technology community.