8 results for GENESIS (Computer system)

in Helda - Digital Repository of the University of Helsinki


Relevance:

80.00%

Publisher:

Abstract:

Texts in the work of a city department: A study of the language and context of benefit decisions

This dissertation examines documents granting or denying access to municipal services. The data consist of decisions on transport services made by the Social Services Department of the City of Helsinki. The circumstances surrounding official texts, their language, and their production are studied through textual analysis and interviews. The dissertation describes the textual features of these decisions and seeks to explain them. Also explored are the topics and methods of genre studies, especially the relationship between text and context. Although the approach is linguistic, the dissertation also touches on research in social work and administrative decision making, and contributes to a more general discussion on the language and duties of public administration. My key premise is that a text is more than a mere psycholinguistic phenomenon. Rather, a text is also a physical object and the result of certain production processes. This dissertation thus not only describes genre-specific features, but also sheds light on the work that generates the texts examined. Textual analysis and analyses of discursive practices are linked through an analysis of intertextuality: written decisions are compared with other application documents, such as expert statements and the applications themselves. The study shows that decisions are texts governed by strict rules and written with modest resources. Textwork is organised as hierarchical mass production. The officials who write decisions rely on standard phrases extracted from a computer system. This allows them to produce texts of uniform quality which have been approved by the department's legal experts. Using a computer system in text production does not, however, serve all the needs of the writers, which leads to many problems in the texts themselves. Intertextual analysis indicates that medical argumentation weighs most heavily in the application process, although a social appraisal should be carried out when deciding on applications for transport services. The texts reflect a hierarchy in which a physician ranks above the applicant, and the department's own expert physician ranks above the applicant's physician. My analysis also highlights good, but less obvious, practices. The social workers and secretaries who write decisions must balance conflicting demands. They use delicate linguistic means to adjust the standard phrases to suit individual cases, and employ subtle strategies of politeness. The dissertation suggests that the customer contact staff who write official texts should be allowed to make better use of their professional competence. A more general concern is that legislation and new management strategies require more and more documentation, yet textwork is only rarely taken into account in the allocation of resources.

Keywords: (critical) text analysis, genre analysis, administration, social work, administrative language, texts, genres, context, intertextuality, discursive practices
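The standard-phrase mechanism described above can be pictured as template-based text assembly: approved phrases are selected and concatenated into a decision. The sketch below is only an illustration of that idea; the phrase texts, keys, and function names are hypothetical and are not taken from the department's actual system.

```python
# Hypothetical illustration of decision writing from approved standard phrases.
# The phrases and keys are invented; they are not the department's real templates.

STANDARD_PHRASES = {
    "grant": "The applicant is granted transport services.",
    "deny_medical": "Based on the medical statement, the criteria for transport services are not met.",
    "appeal": "This decision may be appealed to the Social Services Board within 14 days.",
}

def compose_decision(applicant: str, phrase_keys: list[str]) -> str:
    """Assemble a decision text by concatenating selected standard phrases."""
    body = " ".join(STANDARD_PHRASES[key] for key in phrase_keys)
    return f"Decision concerning {applicant}:\n{body}"

print(compose_decision("N.N.", ["deny_medical", "appeal"]))
```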

Relevance:

80.00%

Publisher:

Abstract:

As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers, on some of his own computers, but also at some online services which he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure. Thus the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable to users, and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content, and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. Data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we are not expecting our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
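The cryptographically verifiable references described above can be pictured as content addressing: a data item's name is derived from a hash of its content, so anyone holding the item can check that it matches its reference regardless of which replica supplied it. The sketch below is a minimal illustration of that idea under our own assumptions; the function names are hypothetical and this is not Peerscape's actual API or wire format.

```python
import hashlib

def content_reference(data: bytes) -> str:
    """Derive a verifiable reference (name) from the item's content."""
    return hashlib.sha256(data).hexdigest()

def verify(reference: str, data: bytes) -> bool:
    """Check that a retrieved item really is the one the reference names."""
    return content_reference(data) == reference

album_page = b"photo metadata and image bytes ..."
ref = content_reference(album_page)

# Any replica, a family member's machine or an online service, can hand back
# the bytes, and the receiver verifies them without trusting the sender.
assert verify(ref, album_page)
```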


Relevance:

30.00%

Publisher:

Abstract:

Information visualization is the process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize, with little effort, patterns, trends and anomalies inherent in data shown in a visual display; such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment to facilitate business intelligence (BI) and performance management activities. In this study, we examine the research on the principles of human visual perception and information visualization as well as the application of visualization in business decision support systems. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.
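As a concrete illustration of the kind of guideline such a review yields, perception research favours position and length encodings (for example bar charts) over angle or area encodings, and plain displays over decorated ones. The sketch below applies one such guideline with matplotlib; the data and styling choices are hypothetical and are not taken from the reviewed BI products or from the study's own guideline list.

```python
import matplotlib.pyplot as plt

# Hypothetical monthly revenue figures for a single dashboard panel.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
revenue = [120, 135, 128, 150, 147, 162]

fig, ax = plt.subplots(figsize=(5, 3))
ax.bar(months, revenue, color="steelblue")      # length encoding: easy to compare
ax.set_ylabel("Revenue (kEUR)")
ax.set_title("Monthly revenue")
for side in ("top", "right"):                   # remove non-data ink
    ax.spines[side].set_visible(False)
ax.grid(axis="y", linewidth=0.3, alpha=0.5)     # light reference lines only
plt.tight_layout()
plt.show()
```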

Relevance:

30.00%

Publisher:

Abstract:

Fusion power is an appealing source of clean and abundant energy. The radiation resistance of reactor materials is one of the greatest obstacles on the path towards commercial fusion power. These materials are subject to a harsh radiation environment, and may neither fail mechanically nor contaminate the fusion plasma. Moreover, for a power plant to be economically viable, the reactor materials must withstand long operation times with little maintenance. Fusion reactor materials will contain hydrogen and helium, due to deposition from the plasma and to nuclear reactions caused by energetic neutron irradiation. The first-wall and divertor materials, carbon and tungsten in existing and planned test reactors, will be subject to intense bombardment by low-energy deuterium and helium, which erodes and modifies the surface. All reactor materials, including the structural steel, will suffer irradiation by high-energy neutrons, causing displacement cascade damage. Molecular dynamics simulation is a valuable tool for studying irradiation phenomena, such as surface bombardment and the onset of primary damage due to displacement cascades. The governing mechanisms operate on the atomic level, and hence are not easily studied experimentally. In order to model materials, interatomic potentials are needed to describe the interactions between atoms. In this thesis, new interatomic potentials were developed for the tungsten-carbon-hydrogen system and for iron-helium and chromium-helium. This made it possible to study previously inaccessible systems, in particular the effect of H and He on radiation damage. The potentials were based on experimental and ab initio data from the literature, as well as density-functional theory calculations performed in this work. As a model for ferritic steel, iron-chromium with 10% Cr was studied. The difference between Fe and FeCr was shown to be negligible for threshold displacement energies. The properties of small He and He-vacancy clusters in Fe and FeCr were also investigated. The clusters were found to be more mobile and to dissociate more rapidly than previously assumed, and the effect of Cr was small. The primary damage formed by displacement cascades was found to be heavily influenced by the presence of He, both in FeCr and in W. Many important issues with fusion reactor materials remain poorly understood and will require a major effort by the international community. The potential models developed for new materials and the simulations performed in this thesis reveal many interesting features, but also serve as a platform for further studies.
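To make the role of an interatomic potential concrete, the sketch below evaluates pair energies and forces for a generic Lennard-Jones potential inside a molecular dynamics force loop. This is a stand-in only: the actual W-C-H and Fe-He/Cr-He potentials developed in the thesis are analytical potentials fitted to ab initio and experimental data, not Lennard-Jones, and the parameter values here are arbitrary.

```python
import numpy as np

def lj_energy_forces(positions: np.ndarray, eps: float = 1.0, sigma: float = 1.0):
    """Total energy and per-atom forces for a Lennard-Jones pair potential.

    V(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6); the force on each atom is
    -dV/dr projected along the pair separation vector.
    """
    n = len(positions)
    energy = 0.0
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            rij = positions[i] - positions[j]
            r = np.linalg.norm(rij)
            sr6 = (sigma / r) ** 6
            energy += 4.0 * eps * (sr6 ** 2 - sr6)
            f = 24.0 * eps * (2.0 * sr6 ** 2 - sr6) / r ** 2 * rij
            forces[i] += f
            forces[j] -= f  # Newton's third law
    return energy, forces

# Three atoms with arbitrary coordinates (reduced units).
atoms = np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0], [0.0, 1.2, 0.0]])
E, F = lj_energy_forces(atoms)
print(E, F)
```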

Relevance:

30.00%

Publisher:

Abstract:

The management and coordination of business-process collaboration is changing because of globalization, specialization, and innovation. Service-oriented computing (SOC) is a means towards business-process automation, and recently many industry standards have emerged to become part of the service-oriented architecture (SOA) stack. In a globalized world, organizations face new challenges in setting up and carrying out collaborations in semi-automating ecosystems for business services. To be efficient and effective, many companies express their services electronically in what we term business-process as a service (BPaaS). Companies then source BPaaS on the fly from third parties if they are not able to create all service value in-house, for reasons such as lack of resources, lack of know-how, or cost- and time-reduction needs. Thus, a need emerges for BPaaS-HUBs that not only store service offers and requests together with information about their issuing organizations and assigned owners, but that also allow an evaluation of trust and reputation in an anonymized electronic service marketplace. In this paper, we analyze the requirements, design architecture, and system behavior of such a BPaaS-HUB to enable a fast setup and enactment of business-process collaboration. Moving into a cloud-computing setting, the results of this paper allow system designers to quickly evaluate which services they need for instantiating the BPaaS-HUB architecture. Furthermore, the results show the protocol of a backbone service bus that allows communication between the services that implement the BPaaS-HUB. Finally, the paper analyzes where an instantiation must assign additional computing resources to avoid performance bottlenecks.
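To make the BPaaS-HUB idea more tangible, the sketch below models a service offer record carrying the issuing organization, the assigned owner, and a reputation score, together with a naive matching of requests to offers. All field and function names are our own assumptions for illustration; they are not the paper's actual schema or its service-bus protocol.

```python
from dataclasses import dataclass

@dataclass
class ServiceOffer:
    """A business-process-as-a-service offer stored in the hub."""
    service_id: str
    organization: str
    owner: str
    capability: str          # e.g. "invoice-processing"
    reputation: float = 0.0  # aggregated from past collaborations

@dataclass
class ServiceRequest:
    capability: str
    min_reputation: float = 0.0

def match(request: ServiceRequest, offers: list[ServiceOffer]) -> list[ServiceOffer]:
    """Return offers matching the requested capability, best reputation first."""
    hits = [o for o in offers
            if o.capability == request.capability
            and o.reputation >= request.min_reputation]
    return sorted(hits, key=lambda o: o.reputation, reverse=True)

offers = [
    ServiceOffer("s1", "OrgA", "alice", "invoice-processing", 4.2),
    ServiceOffer("s2", "OrgB", "bob", "invoice-processing", 3.1),
]
print(match(ServiceRequest("invoice-processing", min_reputation=3.5), offers))
```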

Relevance:

30.00%

Publisher:

Abstract:

Parkinson's disease (PD) is a neurodegenerative movement disorder resulting from loss of dopaminergic (DA) neurons in the substantia nigra (SN). Possible causative treatment strategies for PD include neurotrophic factors, which protect and in some cases restore the function of dopaminergic neurons. The glial cell line-derived neurotrophic factor (GDNF) family of neurotrophic factors has to date provided the most promising candidates for treatment of PD, demonstrating both neuroprotective and neurorestorative properties. We have investigated the role of GDNF in the rodent dopaminergic system and its possible crosstalk with other growth factors. We characterized GDNF-induced gene expression changes by DNA microarray analysis in different neuronal systems, including in vitro cultured Neuro2A cells treated with GDNF, as well as midbrains from GDNF heterozygous (Hz) knockout mice. These microarray experiments resulted in the identification of GDNF-induced genes, which were also confirmed by other methods. Further analysis of the dopaminergic system of GDNF Hz mice demonstrated an approximately 40% reduction in GDNF levels, and revealed increased intracellular dopamine concentrations and FosB/DeltaFosB expression in striatal areas. These animals did not show any significant changes in locomotor activity in behavioural analyses of acute and repeated cocaine administration, nor did they exhibit any changes in dopamine output following acute cocaine treatment. We further analysed the significance of GDNF receptor RET signalling in the dopaminergic system of MEN2B knock-in animals with constitutively active Ret. The MEN2B animals showed a robust increase in extracellular dopamine and metabolite levels in the striatum, increased tyrosine hydroxylase (TH) and dopamine transporter (DAT) protein levels by immunohistochemical staining and Western blotting, as well as increased Th mRNA levels in the SN. MEN2B mice had an approximately 25% increase in the number of DA neurons in the SN, and they also exhibited increased sensitivity to the stimulatory effects of cocaine. We also developed a semi-throughput in vitro micro-island assay for the quantification of neuronal survival and TH levels by computer-assisted methodology from limited amounts of tissue. This assay can be applied to initial screening for dopaminotrophic molecules, as well as to chemical drug library screening, and it is applicable to any neuronal system for the screening of neurotrophic molecules. Since our microarray experiments revealed possible GDNF-VEGF-C crosstalk, we further concentrated on studying the neurotrophic effects of VEGF-C. We showed that VEGF-C acts as a neurotrophic molecule for DA neurons both in vitro and in vivo, although without an additive effect when used together with GDNF. A neuroprotective effect of VEGF-C in vivo was demonstrated in the rat 6-OHDA model of PD. The possible signalling mechanisms of VEGF-C in the nervous system were also investigated: infusion of VEGF-C into rat brain induced ERK activation, but no direct activation of RET signalling was found in vitro. VEGF-C treatment of rat striatum led to up-regulation of VEGFR-1-3, indicating that VEGF-C can regulate the expression level of its own receptor. The dopaminotrophic activity of VEGF-C in vivo was further supported by increased vascular tissue in the neuroprotection experiments.