30 results for Web testing
Abstract:
This study deals with algal species commonly occurring in the Baltic Sea: the haptophyte Prymnesium parvum, the dinoflagellates Dinophysis acuminata, D. norvegica and D. rotundata, and the cyanobacterium Nodularia spumigena. The hypotheses concern the toxicity of the species, the factors determining toxicity, the consequences of toxicity, and the transfer of toxins in the aquatic food web. Since the Baltic Sea is severely eutrophied, fast-growing haptophytes have the potential to cause toxic blooms. In our studies, the toxicity (measured as haemolytic activity) of the haptophyte P. parvum was highest under phosphorus-limited conditions, but the cells were also toxic under nitrogen limitation and under nutrient-balanced growth conditions. Cellular nutrient ratios were tightly related to toxicity. The stoichiometric flexibility of the cellular phosphorus quota was higher than that of nitrogen, and nitrogen limitation led to decreased biomass. Negative allelopathic effects on another alga (Rhodomonas salina) were observed even at low P. parvum cell densities, whereas immediate lysis of R. salina cells occurred at P. parvum cell densities corresponding to natural blooms. Release of dissolved organic carbon from the R. salina cells was measured within 30 minutes, and an increase in bacterial number and biomass within 23 h. Because of this allelopathic effect, formation of a P. parvum bloom may accelerate once a critical cell density is reached and the competing species are eliminated. A P. parvum bloom indirectly stimulates bacterial growth and alters the functioning of the planktonic food web by increasing carbon transfer through the microbial loop. Our results were the first reports of DSP toxins in Dinophysis cells in the Gulf of Finland and of PTX-2 in the Baltic Sea. Cellular toxin contents in Dinophysis spp. in the Gulf of Finland ranged from 0.2 to 149 pg DTX-1 cell−1 and from 1.6 to 19.9 pg PTX-2 cell−1. D. 
norvegica was found mainly around the thermocline (max. 200 cells L−1), whereas D. acuminata was found throughout the mixed layer (max. 7,280 cells L−1). Toxins in the sediment trap corresponded to 1% of the DTX-1 and 0.01% of the PTX-2 in the suspended-matter DSP pool. This indicates that the majority of the DSP toxins do not enter the benthic community but are either decomposed in the water column or transferred to higher trophic levels in the planktonic food chain. We found that nodularin, produced by Nodularia spumigena, was transferred to the copepod Eurytemora affinis through three pathways: by grazing on filaments of small Nodularia, directly from the dissolved pool, and through the microbial food web by copepods grazing on ciliates, dinoflagellates and heterotrophic nanoflagellates. The estimated contribution of the microbial food web to nodularin transfer was 22–45% and 71–76% in our two experiments, respectively. This highlights the potential role of the microbial food web in the transfer of toxins in the planktonic food web.
Abstract:
In lake ecosystems, both fish and invertebrate predators have dramatic effects on their prey communities. Fish predation selects for large cladocerans, while invertebrate predators prefer smaller prey. Since invertebrate predators are preferred food items for fish, their occurrence at high densities is often connected with the absence or scarcity of fish. It is generally believed that invertebrate predators can play a significant role only if the density of planktivorous fish is low. However, in the eutrophic, clay-turbid Lake Hiidenvesi (southern Finland), a dense population of predatory Chaoborus flavicans larvae coexists with an abundant fish population. The population covers the stratifying area of the lake and attains a maximum density of 23,000 ind. m−2. This thesis aims to clarify the effects of Chaoborus flavicans on the zooplankton community and the environmental factors facilitating the coexistence of fish and invertebrate predators. In the stratifying area of Lake Hiidenvesi, the seasonal succession of cladocerans was exceptional: the spring biomass peak was missing, and the highest biomass occurred in midsummer. In early summer, the consumption rate of chaoborids clearly exceeded the production rate of cladocerans, and each year the cladoceran biomass peak coincided with the minimum chaoborid density. In contrast, consumption by fish was very low, and in each study year cladocerans attained maximum biomass simultaneously with the highest consumption by smelt (Osmerus eperlanus). The results indicated that Chaoborus flavicans was the main predator of cladocerans in the stratifying area of Lake Hiidenvesi. Clay turbidity strongly contributed to the coexistence of chaoborids and smelt at high densities. Turbidity exceeding 30 NTU combined with light intensity below 0.1 μE m−2 s−1 provides an efficient daytime refuge for chaoborids, but turbidity alone is not an adequate refuge unless combined with low light intensity. 
In the non-stratifying shallow basins of Lake Hiidenvesi, light intensity at the lake bottom exceeds this level during summer days, preventing Chaoborus from forming a dense population in the shallow parts of the lake. Chaoborus can be successful particularly in deep, clay-turbid lakes, where they can remain high in the water column close to their epilimnetic prey. Suspended clay alters trophic interactions by weakening the link between fish and Chaoborus, which in turn strengthens the effect of Chaoborus predation on crustacean zooplankton. Since food web management largely relies on manipulations of fish stocks and the cascading effects of such actions, the validity of the method in deep, clay-turbid lakes may be questioned.
Abstract:
Technical or contaminated ethanol products are sometimes ingested, either accidentally or on purpose. Typical misused products are black-market liquor and automotive products, e.g., windshield washer fluids. In addition to less toxic solvents, these liquids may contain deadly methanol. Symptoms of even lethal solvent poisoning are often non-specific at the early stage. The present series of studies was carried out to develop a breath-diagnostic method for solvent intoxication, in order to speed up a diagnostic procedure conventionally based on blood tests. Especially in the case of methanol ingestion, the analysis method should be sufficiently sensitive and accurate to detect even small amounts of methanol in a mixture of ethanol and other less toxic components. In addition to the studies on the FT-IR method, the Dräger 7110 evidential breath analyzer was examined to determine its ability to reveal a coexisting toxic solvent. An industrial Fourier transform infrared (FT-IR) analyzer was modified for breath testing: the sample cell fittings were widened and the cell size reduced in order to obtain an alveolar sample directly from a single exhalation. The performance and feasibility of the Gasmet FT-IR analyzer were tested in clinical settings and in the laboratory. Human breath screening studies were carried out with healthy volunteers, inebriated homeless men, emergency room patients and methanol-intoxicated patients. A number of the breath analysis results were compared to blood test results in order to approximate the blood-breath relationship. In the laboratory experiments, the analytical performance of the Gasmet FT-IR analyzer and the Dräger 7110 evidential breath analyzer was evaluated by means of artificial samples resembling exhaled breath. The investigations demonstrated that a successful breath ethanol analysis by the Dräger 7110 evidential breath analyzer could exclude any significant methanol intoxication. 
In contrast, the device did not detect very high levels of acetone, 1-propanol and 2-propanol in simulated breath, as it was not equipped to recognize the interfering component. According to the studies, the Gasmet FT-IR analyzer was adequately sensitive, selective and accurate for solvent intoxication diagnostics. In addition to diagnostics, fast breath solvent analysis proved feasible for monitoring ethanol and methanol concentrations during haemodialysis treatment. Because of the simplicity of the sampling and analysis procedure, non-laboratory personnel, such as police officers or social workers, could also operate the analyzer for screening purposes.
Abstract:
The autonomic nervous system is an important modulator of ventricular repolarization and arrhythmia vulnerability. This study explored the effects of cardiovascular autonomic function tests on repolarization and its heterogeneity, with special reference to congenital arrhythmogenic disorders typically associated with stress-induced fatal ventricular arrhythmias. The first part explored the effects of standardized autonomic tests on QT intervals in the 12-lead electrocardiogram and in multichannel magnetocardiography in 10 healthy adults. The second part studied the effects of deep breathing, the Valsalva manoeuvre, mental stress, sustained handgrip and mild exercise on QT intervals in asymptomatic patients with the LQT1 subtype of the hereditary long QT syndrome (n=9) and in patients with arrhythmogenic right ventricular dysplasia (ARVD, n=9). Even strong sympathetic activation had no effect on spatial QT interval dispersion in healthy subjects, but deep respiratory efforts and the Valsalva manoeuvre influenced it in opposite ways in electrocardiographic and magnetocardiographic recordings. LQT1 patients showed blunted QT interval and sinus nodal responses to sympathetic challenge, as well as exaggerated QT prolongation during the recovery phases. LQT1 patients showed a QT interval recovery overshoot in 2.4 ± 1.7 tests, compared with 0.8 ± 0.7 in healthy controls (P = 0.02). Valsalva strain prolonged the T-wave peak to T-wave end interval only in the LQT1 patients, which is considered to reflect the arrhythmogenic substrate in this syndrome. ARVD patients showed signs of abnormal repolarization in the right ventricle, modulated by abrupt sympathetic activation. An electrocardiographic marker reflecting interventricular dispersion of repolarization was introduced. It showed that LQT1 patients exhibit a repolarization gradient from the left ventricle towards the right ventricle that is significantly larger than in controls. 
In contrast, ARVD patients showed a repolarization gradient from the right ventricle towards the left. Valsalva strain amplified the repolarization gradient in LQT1 patients, whereas it transiently reversed it in patients with ARVD. In conclusion, intrathoracic volume and pressure changes influence regional electrocardiographic and magnetocardiographic QT interval measurements differently. In particular, the recovery phases of standard cardiovascular autonomic function tests and the Valsalva manoeuvre reveal abnormal repolarization in asymptomatic LQT1 patients. Both LQT1 and ARVD patients have abnormal interventricular repolarization gradients, modulated by abrupt sympathetic activation. Autonomic testing, and in particular the Valsalva manoeuvre, is potentially useful in unmasking abnormal repolarization in these syndromes.
Abstract:
As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so that the data can maintain its identity while being passed around. This way there will be only one copy of a user's family photo album, while the user can use multiple tools to show or manipulate it. Copies of a user's data could be stored on some of his family members' computers, on several of his own computers, and also at online services he uses. When all actors operate on one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to store the same data at various locations. We studied three systems, Persona, Freenet, and GNUnet, which suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing an anonymous web, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. 
Data items themselves are protected with cryptography against forgery but are not encrypted, as the focus has been on disseminating the data directly among family and friends rather than letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development, we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we concluded that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we do not expect our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
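The cryptographically verifiable references described above can be illustrated with a minimal content-addressing sketch: a reference is the hash of the content, so any replica can check an item against its own reference. The names (`ContentStore`, `make_ref`) are hypothetical illustrations, not Peerscape's actual API:

```python
import hashlib

def make_ref(data: bytes) -> str:
    """A self-verifying reference: the SHA-256 digest of the content."""
    return hashlib.sha256(data).hexdigest()

class ContentStore:
    """Toy replicated store. Items are addressed by their hash, so a
    forged or corrupted copy can be detected on retrieval."""

    def __init__(self) -> None:
        self._items: dict[str, bytes] = {}

    def put(self, data: bytes) -> str:
        ref = make_ref(data)
        self._items[ref] = data  # storing twice is a no-op: same ref
        return ref

    def get(self, ref: str) -> bytes:
        data = self._items[ref]
        if make_ref(data) != ref:  # detect forgery/corruption
            raise ValueError("content does not match its reference")
        return data

# One photo album, one reference, regardless of which tool stored it.
store = ContentStore()
ref = store.put(b"family photo album bytes")
assert store.get(ref) == b"family photo album bytes"
```

Because the reference is derived from the data itself, storing the same item twice yields the same reference, which is how a system of this kind keeps a single logical copy across replicas.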
Abstract:
The unique characteristics of marketspace, in combination with the fast-growing number of consumers interested in e-commerce, have created new research areas of interest to both marketing and consumer behaviour researchers. Consumer behaviour researchers interested in consumers' decision-making processes have two new sets of questions to answer. The first set concerns how useful theories developed for the marketplace are in a marketspace context. Cyber auctions, Internet communities, and the possibilities for consumers to establish dialogues not only with companies but also with other consumers make marketspace unique. The effects of these distinctive characteristics on consumer behaviour have not been systematically analysed and therefore constitute the second set of questions to be studied. Most companies feel that they have to be online even though the effects of being on the Net are not unambiguously positive. The relevance of the relationship marketing paradigm in a marketspace context has to be studied. The relationship-enhancement effects of websites from the customers' point of view are therefore emphasized in this research paper. Representatives of the Net generation were analysed, and the results show that companies should develop marketspace strategies, since Net presence has a value-added effect on consumers. The results also indicate that consumers' decision-making processes are changing as a result of the progress of marketspace.
Abstract:
The overlapping sound pressure waves that enter our brain via the ears and auditory nerves must be organized into a coherent percept. Modelling the regularities of the auditory environment and detecting unexpected changes in these regularities, even in the absence of attention, is a necessary prerequisite for orienting towards significant information, as well as for speech perception and communication. The processing of auditory information, in particular the detection of changes in the regularities of the auditory input, gives rise to neural activity in the brain that is seen as the mismatch negativity (MMN) response of the event-related potential (ERP) recorded by electroencephalography (EEG). --- As the recording of MMN requires neither a subject's behavioural response nor attention towards the sounds, it can be done even with subjects who have problems communicating or difficulties performing a discrimination task, for example aphasic and comatose patients, newborns, and even fetuses. Thus with MMN one can follow the evolution of central auditory processing from the very early, often critical stages of development, and also in subjects who cannot be examined with the more traditional behavioural measures of auditory discrimination. Indeed, recent studies show that central auditory processing, as indicated by MMN, is affected in different clinical populations, such as schizophrenics, as well as during normal aging and abnormal childhood development. Moreover, the processing of auditory information can be selectively impaired for certain auditory attributes (e.g., sound duration, frequency) and can also depend on the context of the sound changes (e.g., speech or non-speech). Although its advantages over behavioural measures are undeniable, a major obstacle to larger-scale routine use of the MMN method, especially in clinical settings, is the relatively long duration of its measurement. 
Typically, approximately 15 minutes of recording time is needed to measure the MMN for a single auditory attribute, so recording a complete central auditory processing profile consisting of several auditory attributes would require from one hour to several hours. In this research, I have contributed to the development of new, fast multi-attribute MMN recording paradigms in which several types and magnitudes of sound changes are presented in both speech and non-speech contexts in order to obtain a comprehensive profile of auditory sensory memory and discrimination accuracy in a short measurement time (altogether approximately 15 min for 5 auditory attributes). The speed of the paradigms makes them highly attractive for clinical research, their reliability lends fidelity to longitudinal studies, and the language context is especially suitable for studies of language impairments such as dyslexia and aphasia. In addition, I have presented an even more ecological paradigm in which the MMN responses are recorded entirely without a repetitive standard tone, an interesting result in view of the theory of MMN. All in all, these paradigms contribute to the development of the theory of auditory perception and increase the feasibility of MMN recordings in both basic and clinical research. Moreover, they have already proven useful in studying, for instance, dyslexia, Asperger syndrome and schizophrenia.
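The time savings claimed above reduce to simple arithmetic; the following is an illustrative calculation only, using the figures quoted in the abstract:

```python
# Figures quoted in the abstract
MINUTES_PER_ATTRIBUTE = 15  # traditional single-attribute MMN recording
N_ATTRIBUTES = 5            # attributes in a full auditory profile

# Traditional approach: one ~15-minute recording per attribute
traditional_total = N_ATTRIBUTES * MINUTES_PER_ATTRIBUTE

# Multi-attribute paradigm: all five attributes in one ~15-minute run
multi_attribute_total = 15

print(traditional_total)   # 75 minutes
print(multi_attribute_total)  # 15 minutes
```

A five-fold reduction for five attributes, which is what makes the paradigm practical in clinical settings where an hour-plus recording is often infeasible.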
Abstract:
"Fifty-six teachers, from four European countries, were interviewed to ascertain their attitudes to and beliefs about the Collaborative Learning Environments (CLEs) which were designed under the Innovative Technologies for Collaborative Learning Project. Their responses were analysed using categories based on a model from cultural-historical activity theory [Engestrom, Y. (1987). Learning by expanding.- An activity-theoretical approach to developmental research. Helsinki: Orienta-Konsultit; Engestrom, Y., Engestrom, R., & Suntio, A. (2002). Can a school community learn to master its own future? An activity-theoretical study of expansive learning among middle school teachers. In G. Wells & G. Claxton (Eds.), Learning for life in the 21st century. Oxford: Blackwell Publishers]. The teachers were positive about CLEs and their possible role in initiating pedagogical innovation and enhancing personal professional development. This positive perception held across cultures and national boundaries. Teachers were aware of the fact that demanding planning was needed for successful implementations of CLEs. However, the specific strategies through which the teachers can guide students' inquiries in CLEs and the assessment of new competencies that may characterize student performance in the CLEs were poorly represented in the teachers' reflections on CLEs. The attitudes and beliefs of the teachers from separate countries had many similarities, but there were also some clear differences, which are discussed in the article. (c) 2005 Elsevier Ltd. All rights reserved."
Abstract:
Methane emissions from natural wetlands and rice paddies constitute a large proportion of atmospheric methane, but the magnitude and year-to-year variation of these methane sources are still unpredictable. Here we describe and evaluate the integration of a methane biogeochemical model (CLM4Me; Riley et al., 2011) into the Community Land Model 4.0 (CLM4CN) in order to better explain spatial and temporal variations in methane emissions. We test new functions for soil pH and redox potential that affect microbial methane production in soils. We also constrain aerenchyma in plants in permanently inundated areas in order to better represent wetland vegetation. Satellite-derived inundated fraction is explicitly prescribed in the model because there are large differences between simulated fractional inundation and satellite observations. A rice paddy module is also incorporated, in which the fraction of land used for rice production is explicitly prescribed. The model is evaluated at the site level with vegetation cover and water table prescribed from measurements; such explicit site-level evaluations of simulated methane emissions differ considerably from evaluating grid-cell-averaged emissions against available measurements. Using a baseline set of parameter values, our model-estimated average global wetland emissions for the period 1993–2004 were 256 Tg CH4 yr−1, and rice paddy emissions in the year 2000 were 42 Tg CH4 yr−1. Tropical wetlands contributed 201 Tg CH4 yr−1, or 78% of the global wetland flux. Northern-latitude (>50° N) systems contributed 12 Tg CH4 yr−1; we expect this latter number may be an underestimate because of the low high-latitude inundated area captured by satellites and the unrealistically low high-latitude productivity and soil carbon predicted by CLM4. Sensitivity analysis showed a large range (150–346 Tg CH4 yr−1) in predicted global methane emissions. 
This range was sensitive to (1) the amount of methane transported through aerenchyma, (2) soil pH (±100 Tg CH4 yr−1), and (3) redox inhibition (±45 Tg CH4 yr−1).
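The reported budget figures are internally consistent, as a quick back-of-the-envelope check shows (values taken directly from the abstract; this calculation is illustrative, not part of the original study):

```python
# Reported fluxes, Tg CH4 per year
global_wetlands = 256    # 1993-2004 baseline average
rice_paddies = 42        # year 2000
tropical_wetlands = 201  # tropical contribution
northern_systems = 12    # >50 degrees N contribution

# Tropical wetlands are stated to be 78% of the global wetland flux
tropical_share = tropical_wetlands / global_wetlands
assert 0.78 <= tropical_share < 0.79  # 201/256 = 0.785...

# The baseline estimate falls inside the sensitivity range 150-346
low, high = 150, 346
assert low <= global_wetlands <= high
```

The tropical and northern contributions (201 + 12 = 213 Tg CH4 yr−1) do not sum to the global total, consistent with the remainder coming from mid-latitude systems not broken out separately in the abstract.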
Abstract:
Researchers and developers in academia and industry would benefit from a facility that enables them to easily locate, license and use the kind of empirical data they need for testing and refining their hypotheses, and to deposit and disseminate their data, e.g., to support replication and validation of reported scientific experiments. To meet these needs, initially in Finland, an ongoing project at the University of Helsinki and its collaborators is creating a user-friendly web service for researchers and developers in Finland and other countries. In our talk, we describe ongoing work to create a palette of extensive but easily available Finnish language resources and technologies for the research community, including lexical resources, wordnets, morphologically tagged corpora, dependency-syntactic treebanks and parsebanks, open-source finite-state toolkits and libraries, and language models to support text analysis and processing at the customer site. The first publicly available results are also presented.