23 results for Information Technology (IT)
in Helda - Digital Repository of University of Helsinki
Abstract:
In the context of health care, information technology (IT) has an important role in the operational infrastructure, ranging from business management to patient care. An essential part of the system is medication management in inpatient and outpatient care. Community pharmacists' strategy has been to extend practice responsibilities beyond dispensing towards patient care services. Few studies have evaluated the strategic development of IT systems to support this vision. The objectives of this study were to assess and compare independent Finnish community pharmacy owners' and staff pharmacists' priorities concerning the content and structure of the next generation of community pharmacy IT systems, to explore international experts' visions and strategic views on IT development needs in relation to services provided in community pharmacies, to identify IT innovations facilitating patient care services and to evaluate their development and implementation processes, and to assess community pharmacists' readiness to adopt innovations. This study applied both qualitative and quantitative methods. A qualitative personal interview of 14 experts in community pharmacy services and related IT from eight countries, a national survey of Finnish community pharmacy owners (mail survey, response rate 53%, n=308), and a survey of a representative sample of staff pharmacists (online survey, response rate 22%, n=373) were conducted. Finnish independent community pharmacy owners gave priority to logistical functions but also to those related to medication information and patient care. The managers and staff pharmacists have different views of the importance of IT features, reflecting their different professional duties in the community pharmacy. This indicates the need to involve different occupational groups in planning the new IT systems for community pharmacies. A majority of the international experts shared the vision of community pharmacy adopting a patient care orientation, supported by IT-based documentation, new technological solutions, access to information, and shared patient data. Community pharmacy IT innovations were rare, which is paradoxical because owners and staff pharmacists perceived their own innovativeness as high. Community pharmacy IT system development processes usually had not undergone systematic needs assessment research beforehand or evaluation after implementation, and were most often coordinated by national governments without subsequent commercialization. In particular, community pharmacy IT development lacks research, organization, leadership and user involvement in the process. Those responsible for IT development in the community pharmacy sector should create long-term IT development strategies that are in line with community pharmacy service development strategies. This could provide systematic guidance for future projects, ensuring that potential innovations are based on a sufficient understanding of the pharmacy practice problems they are intended to solve, that strong leadership supports research and the development of innovations so that community pharmacists' potential innovativeness is used, and that professional needs and strategic priorities are considered even if the development process is led by those outside the profession.
Abstract:
This article discusses the scope of research on the application of information technology in construction (ITC). A model of the information and material activities that together constitute the construction process is presented, using the IDEF0 activity modelling methodology. Information technology is defined to include all kinds of technology used for the storage, transfer and manipulation of information, thus also including devices such as copying machines, faxes and mobile phones. Using the model, the domain of ITC research is defined as the use of information technology to facilitate and re-engineer the information process component of construction. Developments during the last decades in IT use in construction are discussed against the background of a simplified model of generic information processing tasks. The scope of ITC is compared with the scopes of research in related areas such as design methodology, construction management and facilities management. Health care is proposed as an interesting alternative to the often-used car manufacturing industry as an IT application domain for comparison. Some of the key areas of ITC research in recent years, namely expert systems, company IT strategies and product modelling, are briefly discussed. The article finishes with a short discussion of the problems of applying standard scientific methodology in ITC research, in particular in product model research.
Abstract:
Introduction: This case study is based on experiences with the Electronic Journal of Information Technology in Construction (ITcon), founded in 1995. Development: This journal is an example of a particular category of open access journals, which use neither author charges nor subscriptions to finance their operations, but rely largely on unpaid voluntary work in the spirit of the open source movement. The journal has, after some initial struggle, survived its first decade and is now established as one of half a dozen peer-reviewed journals in its field. Operations: The journal publishes articles as they become ready, but creates virtual issues through alerting messages to "subscribers". It has also started to publish special issues, since this helps in attracting submissions and in sharing the workload of review management. From the start the journal adopted a rather traditional layout for its articles. After the first few years the HTML version was dropped and papers are now published only in PDF format. Performance: The journal has recently been benchmarked against the competing journals in its field. Its acceptance rate of 53% is slightly higher, and its average turnaround time of seven months is almost a year faster, than those of the journals in the sample for which data could be obtained. The server log files for the past three years have also been studied. Conclusions: Our overall experience demonstrates that it is possible to publish this type of OA journal, with a yearly publishing volume equal to that of a quarterly journal and involving the processing of some fifty submissions a year, using a networked volunteer-based organization.
Abstract:
This thesis explores the relationship between humans and ICTs (information and communication technologies). As ICTs are increasingly penetrating all spheres of social life, their role as mediators – between people, between people and information, and even between people and the natural world – is expanding, and they are increasingly shaping social life. Yet, we still know little of how our life is affected by their growing role. Our understanding of the actors and forces driving the accelerating adoption of new ICTs in all areas of life is also fairly limited. This thesis addresses these problems by interpretively exploring the link between ICTs and the shaping of society at home, in the office, and in the community. The thesis builds on empirical material gathered in three research projects, presented in four separate essays. The first project explores computerized office work through a case study. The second is a regional development project aiming at increasing ICT knowledge and use in 50 small-town families. In the third, the second project is compared to three other longitudinal development projects funded by the European Union. Using theories that consider the human-ICT relationship as intertwined, the thesis provides a multifaceted description of life with ICTs in contemporary information society. By oscillating between empirical and theoretical investigations and balancing between determinist and constructivist conceptualisations of the human-ICT relationship, I construct a dialectical theoretical framework that can be used for studying socio-technical contexts in society. This framework helps us see how societal change stems from the complex social processes that surround routine everyday actions. For example, interacting with and through ICTs may change individuals’ perceptions of time and space, social roles, and the proper ways to communicate – changes which at some point in time result in societal change in terms of, for example, new ways of acting and knowing things.
Abstract:
Open access is a new model for the publishing of scientific journals enabled by the Internet, in which the published articles are freely available for anyone to read. During the 1990s hundreds of individual open access journals were founded by groups of academics, supported by grants and unpaid voluntary work. During the last five years other types of open access journals, funded by author charges, have started to emerge, and established publishers have also started to experiment with different variations of open access. This article reports on the experiences of one open access journal (The Electronic Journal of Information Technology in Construction, ITcon) over its ten-year history. In addition to a straightforward account of the lessons learned, the journal is benchmarked against a number of competitors in the same research area and its development is put into the larger perspective of changes in scholarly publishing. The main findings are that a journal publishing around 20-30 articles per year, equivalent to a typical quarterly journal, can sustainably be produced using an open-source-like production model, and that the journal outperforms its competitors in some respects, such as speed of publication, availability of the results and a balanced global distribution of authorship, and is on a par with them in most other respects. The key statistics for ITcon are: acceptance rate 55%; average speed of publication 6-7 months; 801 subscribers to email alerts; an average of 21 downloads by human readers per paper per month.
Abstract:
The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring to what extent these opportunities are already changing scientific information exchange and how researchers feel about the changes. The paper presents the results based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how much and what publications they read; how often and to which conferences they travel; how much they publish, and what the criteria are for where they eventually decide to publish. Some of the questions compared traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download half of the material that they read digitally from the Web. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available totally freely on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards "just in time" reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web. In these circumstances, the role of paper-based journals published by established publishers is shifting from the core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.
Abstract:
Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory. Nevertheless, in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities differentiating the two outcomes. The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long time periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information. An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking. In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference. The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings. Thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized because access is too slow.
These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.
Abstract:
In the 1990s the companies utilizing and producing new information technology, especially so-called new media, were also expected to be forerunners in new forms of work and organization. Researchers anticipated that new, more creative forms of work and the changing content of working life were about to replace old industrial and standardized ways of working. However, research on actual companies in the IT sector revealed a situation where only minor changes to existing organizational forms were seen. Many of the independent companies faced great difficulties trying to survive the rapid changes in products and production forms in the emerging field. Most of the research on the new media field has been conducted as surveys, and an understanding of the actual everyday work process has remained thin. My research is a longitudinal study of the early phases of one new media company in Finland. The study is an analysis of the challenges the company faced in a rapidly changing business field and of the attempts to overcome these challenges. The two main analyses in the study focus on the developmental phases of the company and the disturbances in the production process. Based on these analyses, I study changes and learning at work using the methodological framework of developmental work research. Developmental work research is a Finnish variant of cultural-historical activity theory applied to the study of learning and transformations at work. The data were gathered over a three-year period of ethnographic fieldwork. I documented the production processes and everyday life in the company as a participant observer. I interviewed key persons, video- and audio-taped meetings, followed e-mail correspondence and collected various documents, such as agreements and memos. I developed a systematic method for analyzing the disturbances in the production process by combining the various data sources. The systematic analysis of the disturbances depicted a very complex and only partly managed production process. The production process had a long duration, and no single actor had an understanding of it as a whole. Most of the disturbances had to do with customer relationships. The nature of the disturbances was latent; they were recognized but not addressed. In the particular production processes that I analyzed, the ending life span of a particular product, a CD-ROM, became obvious. This finding can be interpreted in relation to the developmental phase of the production and the transformation of the field as a whole. Based on the analysis of the developmental phases and the disturbances, I formulate a hypothesis of the contradictions and developmental potentials of the activity studied. The conclusions of the study challenge the existing understanding of how to conceptualize and study organizational learning in production work. Most theories of organizational learning address neither qualitative changes in production nor the historical challenges of organizational learning itself. My study opens up a new horizon for understanding organizational learning in a rapidly changing field where a learning culture based on craft or mass-production work is insufficient. There is a need for anticipatory and proactive organizational learning: proactive learning is needed to anticipate changes in production type and in the life cycles of products.
Abstract:
Increased mass migration, as a result of economic hardship, natural disasters and wars, forces many people to arrive on the shores of cultures very different from those they left. How do they manage the legacy of the past and the challenges of their new everyday life? This is a study of immigrant women living in transnational families that act and communicate across national borders on a near-daily basis. The research was carried out amongst immigrant women who were living in Finland at the time. The research asks how transnational everyday life is constructed. As everyday life, due to its mundane nature, is difficult to operationalise for research purposes, mixed data collection methods were needed to capture the passing moments that easily become invisible. Thus, the data were obtained from photographic diaries (459 photographs) taken by the research participants themselves. Additionally, stimulated recall discussions, structured questionnaires and participant observation notes were used to complement the photographic data. A tool for analysing the activities revealed in the data was created on the assumption that a family is an active unit that accommodates the current situation in which it is embedded. Everyday life activities were analysed emphasizing social, modal and spatial dimensions. Important daily moments were placed on a continuum: 'for me', 'for immediate others' and 'with immediate others'. They portrayed everyday routines and exceptions to them. The data matrix was developed as part of this study. The spatial dimensions formed seven units of activity settings: space for friendship, food, resting, childhood, caring, space to learn and an orderly space. Attention was also paid to the accommodative nature of activities: how women maintain traditions, adapt to Finnish life or re-create new activity patterns. The women's narrations revealed the importance of everyday life. The transnational chain of women across generations and countries, comprising daughters, mothers and grandmothers, was important. The women showed the need for information technology in their transnational lives. They had an active relationship to religion; either its denial or its importance was obvious. Arranging one's life in Finnish society was also central to their narrations. The analysis exposed everyday activities, showed the importance of social networks and the uniqueness of each woman and family, and revealed everyday life in a structured way. The method of analysis that evolved in this study, together with the research findings, is of potential use to professionals, allowing the targeting of interventions to improve the everyday lives of immigrants.
Abstract:
The aim of the study was to survey and develop reading-circle activities that make use of a modern learning environment and, through this, to promote pupils' opportunities to participate in a reading circle emphasizing aesthetic, experiential reading, as well as to strengthen their interest in using information and communication technology. The research data were collected during the 1997-1998 school year through questionnaires and interviews and by observing contact-teaching sessions and the e-mail communication of the virtual groups. In addition, existing documents and records were used. The study population consisted of pupils in grades 4-6 from six primary schools in Espoo who participated in Matilda, as well as four tutors, a coordinator and some class teachers from the Matilda schools. The research questions focused mainly on the participants' experiences of studying in a modern learning environment, their relationship to information and communication technology, and what new things related to reading and literature they reported having learned and experienced. Since no comparable experiments combining reading-circle activities with modern information and communication technology had previously been carried out, at least in Finland, a qualitative approach was chosen as the research methodology. The research approach followed the principles of case study and development research. Although the study was qualitative in its methods, the data were also treated quantitatively, but generalizations were made only towards the case. The results showed that studying was a positive experience for the pupils. They particularly liked the independent work of distance study and the contact-teaching days. Their attitudes towards information and communication technology became more positive and their skills as ICT users developed. The pupils gained positive reading experiences, their reading range broadened, and some of them reported changes in and a deepening of their reading style. The pupils found the most problematic aspects to be the project's confusing tutoring system, the tight schedule, and problems related to the computers and the network. Keywords: reading circle, teaching of literature, tri-literacy, literacy, reading, literature, aesthetic reading, modern learning environment, open and flexible learning environment, telematics, modern information technology
Abstract:
The Department of Forest Resource Management at the University of Helsinki carried out the so-called SIMO project in 2004-2007 to develop a new-generation planning system for forest management. The project parties are the organisations that carry out most Finnish forest planning in state-, industry- and privately owned forests. The aim of this study was to find out the needs and requirements for the new forest planning system and to clarify how the parties see the targets and processes of today's forest planning. Representatives responsible for forest planning in each organisation were interviewed one by one. According to the study, the stand-based system for managing and treating forests will continue in the future. Because of variable data acquisition methods with different accuracy and sources, and the development of single-tree interpretation, more and more forest data are collected without field work. The benefits of using more specific forest data also call for the use of information units smaller than the tree stand. In Finland the traditional way to arrange forest planning computation is divided into two elements. After the forest data have been updated to the present situation, each stand unit's growth is simulated under different alternative treatment schedules. After simulation, optimisation selects one treatment schedule for every stand so that the management program satisfies the owner's goals in the best possible way. This arrangement will be maintained in the future system. The parties' requirements to add multi-criteria problem solving, group decision support methods, and heuristic and spatial optimisation to the system make the programming work more challenging. Generally, the new system is expected to be adjustable and transparent. Strict documentation and free source code help to bring these expectations into effect. Variable growth models and treatment schedules with different source information, accuracy, methods and processing speeds are expected to work easily in the system. Possibilities to calibrate models regionally and to set local, time-varying parameters are also required. In the future the forest planning system will be integrated into comprehensive data management systems together with geographic, economic and work supervision information. This requires a modular way of implementing the system and the use of a simple data transmission interface between modules and with other systems. No major differences in the parties' views of the system requirements were noticed in this study; rather, the interviews completed the full picture from slightly different angles. In the organisations forest management is considered quite inflexible, and it only draws the strategic lines. It does not yet have a role in operative activity, although the need for and benefits of team-level forest planning are acknowledged. The demands and opportunities of variable forest data, new planning goals and the development of information technology are recognized, and the party organisations want to keep up with this development. One example is their engagement in the extensive SIMO project, which connects the whole field of forest planning in Finland.
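The two-stage arrangement described above (first simulate alternative treatment schedules for each stand, then let optimisation pick one schedule per stand against the owner's goals) can be illustrated with a minimal sketch. The stand attributes, toy growth model, schedule names and goal weights below are hypothetical placeholders, not SIMO's actual models or interfaces.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Stand:
    stand_id: str
    volume: float        # current growing stock, m3/ha (hypothetical attribute)
    age: int             # years

@dataclass
class Outcome:
    schedule: str        # name of the treatment schedule
    income: float        # harvest income over the planning period
    end_volume: float    # growing stock left at the end of the period

def simulate(stand: Stand, schedule: str) -> Outcome:
    """Toy growth-and-treatment simulator: grow the stand, then apply the
    treatment implied by the schedule. A real system would chain proper
    growth models over several sub-periods."""
    grown = stand.volume * 1.15                     # crude growth over the period
    if schedule == "no_treatment":
        return Outcome(schedule, income=0.0, end_volume=grown)
    if schedule == "thinning":
        removed = grown * 0.3
        return Outcome(schedule, income=removed * 25.0, end_volume=grown - removed)
    if schedule == "clear_cut":
        return Outcome(schedule, income=grown * 30.0, end_volume=0.0)
    raise ValueError(f"unknown schedule {schedule}")

def optimise(stands, schedules, utility: Callable[[Outcome], float]):
    """Select one simulated schedule per stand that maximises the owner's
    utility function (here independently per stand, with no global constraints)."""
    plan = {}
    for stand in stands:
        outcomes = [simulate(stand, s) for s in schedules]
        plan[stand.stand_id] = max(outcomes, key=utility)
    return plan

if __name__ == "__main__":
    stands = [Stand("S1", volume=180.0, age=70), Stand("S2", volume=90.0, age=35)]
    schedules = ["no_treatment", "thinning", "clear_cut"]
    # Owner's goals expressed as weights on income vs. remaining growing stock.
    owner_utility = lambda o: 0.7 * o.income + 0.3 * o.end_volume * 25.0
    for sid, outcome in optimise(stands, schedules, owner_utility).items():
        print(sid, outcome.schedule, round(outcome.income, 1))
```

In a full planning system the selection step would also handle forest-level goals and constraints, which is where the multi-criteria, heuristic and spatial optimisation methods requested by the parties come in.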
Abstract:
Large-scale chromosome rearrangements such as copy number variants (CNVs) and inversions encompass a considerable proportion of the genetic variation between human individuals. In a number of cases they have been closely linked with various inheritable diseases. Single-nucleotide polymorphisms (SNPs) are another large part of the genetic variance between individuals. They are also typically abundant, and measuring them is straightforward and cheap. This thesis presents computational means of using SNPs to detect the presence of inversions and deletions, a particular variety of CNVs. Technically, the inversion-detection algorithm detects the suppressed recombination rate between inverted and non-inverted haplotype populations, whereas the deletion-detection algorithm uses the EM algorithm to estimate the haplotype frequencies of a window with and without a deletion haplotype. As a contribution to population biology, a coalescent simulator for simulating inversion polymorphisms has been developed. Coalescent simulation is a backward-in-time method of modelling population ancestry. Technically, the simulator also models multiple crossovers by using the Counting model as the chiasma interference model. Finally, this thesis includes an experimental section. The aforementioned methods were tested on synthetic data to evaluate their power and specificity. They were also applied to the HapMap Phase II and Phase III data sets, yielding a number of candidates for previously unknown inversions and deletions and also correctly detecting known rearrangements of these kinds.
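The deletion-detection approach builds on EM estimation of haplotype frequencies within a SNP window. As a rough illustration of that underlying technique (not the thesis's actual algorithm, which additionally models a deletion haplotype), the sketch below estimates two-SNP haplotype frequencies from unphased genotypes with EM; only the double heterozygotes are phase-ambiguous and need the E-step.

```python
import numpy as np

def em_haplotype_freqs(genotypes, n_iter=200, tol=1e-10):
    """Estimate two-SNP haplotype frequencies from unphased genotypes.

    genotypes: iterable of (g1, g2) pairs, each the number of alternate
    alleles (0, 1 or 2) at SNP 1 and SNP 2 for one individual.
    Returns frequencies of haplotypes in the order 00, 01, 10, 11.
    """
    geno = [tuple(g) for g in genotypes]
    n = len(geno)
    freqs = np.full(4, 0.25)                      # uniform start: p00, p01, p10, p11
    allele_pairs = {0: (0, 0), 1: (0, 1), 2: (1, 1)}

    for _ in range(n_iter):
        counts = np.zeros(4)
        for g1, g2 in geno:
            if g1 == 1 and g2 == 1:
                # E-step: a double heterozygote is either 00/11 or 01/10;
                # split it in proportion to the current frequency products.
                w_0011 = freqs[0] * freqs[3]
                w_0110 = freqs[1] * freqs[2]
                total = (w_0011 + w_0110) or 1.0
                counts[[0, 3]] += w_0011 / total
                counts[[1, 2]] += w_0110 / total
            else:
                # At most one SNP is heterozygous, so the phase is unambiguous.
                a1, a2 = allele_pairs[g1], allele_pairs[g2]
                counts[2 * a1[0] + a2[0]] += 1
                counts[2 * a1[1] + a2[1]] += 1
        new_freqs = counts / (2 * n)              # M-step
        if np.max(np.abs(new_freqs - freqs)) < tol:
            freqs = new_freqs
            break
        freqs = new_freqs
    return freqs

if __name__ == "__main__":
    # Small synthetic example dominated by the 00 and 11 haplotypes.
    data = [(0, 0), (2, 2), (1, 1), (1, 1), (0, 0), (2, 2), (1, 0), (2, 1)]
    print(em_haplotype_freqs(data).round(3))
```

In the thesis setting the same idea is applied to a window of SNPs, and the fit of the frequency model is compared with and without an additional deletion haplotype.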
Abstract:
Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
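The covering relation used to organize such routing tables can be illustrated with a small sketch: a filter is a conjunction of per-attribute interval constraints, and filter F1 covers F2 when every event matched by F2 is also matched by F1, so F2 need not be propagated further upstream. The data model below is a simplified assumption for illustration, not the thesis's actual data structures.

```python
from dataclasses import dataclass, field

# A filter is a conjunction of per-attribute closed intervals:
# an event matches if every constrained attribute falls inside its interval.
Filter = dict  # attribute name -> (low, high)

def matches(flt: Filter, event: dict) -> bool:
    return all(attr in event and lo <= event[attr] <= hi
               for attr, (lo, hi) in flt.items())

def covers(f1: Filter, f2: Filter) -> bool:
    """f1 covers f2 if every event matching f2 also matches f1. With interval
    constraints this holds when every constraint of f1 also appears in f2
    and f2's interval lies inside f1's."""
    return all(attr in f2 and f1_lo <= f2[attr][0] and f2[attr][1] <= f1_hi
               for attr, (f1_lo, f1_hi) in f1.items())

@dataclass
class RoutingTable:
    filters: list = field(default_factory=list)   # (filter, subscriber) pairs

    def add(self, flt: Filter, subscriber: str) -> bool:
        """Install a filter; return True if it must also be forwarded to
        neighbouring routers, i.e. it is not covered by an existing filter."""
        forward = not any(covers(f, flt) for f, _ in self.filters)
        self.filters.append((flt, subscriber))
        return forward

    def route(self, event: dict):
        """Return the subscribers whose filters match the event."""
        return {sub for f, sub in self.filters if matches(f, event)}

if __name__ == "__main__":
    table = RoutingTable()
    print(table.add({"temp": (0, 100)}, "A"))   # True: forward upstream
    print(table.add({"temp": (20, 30)}, "B"))   # False: covered by A's filter
    print(table.route({"temp": 25}))            # {'A', 'B'}
```

Filter merging, as studied in the thesis, goes a step further: several filters are combined into a broader one so that fewer entries need to be propagated and matched.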
Abstract:
Sensor networks represent an attractive tool to observe the physical world. Networks of tiny sensors can be used to detect a fire in a forest, to monitor the level of pollution in a river, or to check on the structural integrity of a bridge. Application-specific deployments of static-sensor networks have been widely investigated. Commonly, these networks involve a centralized data-collection point and no sharing of data outside the organization that owns it. Although this approach can accommodate many application scenarios, it significantly deviates from the pervasive computing vision of ubiquitous sensing where user applications seamlessly access anytime, anywhere data produced by sensors embedded in the surroundings. With the ubiquity and ever-increasing capabilities of mobile devices, urban environments can help give substance to the ubiquitous sensing vision through Urbanets, spontaneously created urban networks. Urbanets consist of mobile multi-sensor devices, such as smart phones and vehicular systems, public sensor networks deployed by municipalities, and individual sensors incorporated in buildings, roads, or daily artifacts. My thesis is that "multi-sensor mobile devices can be successfully programmed to become the underpinning elements of an open, infrastructure-less, distributed sensing platform that can bring sensor data out of their traditional closed-loop networks into everyday urban applications". Urbanets can support a variety of services ranging from emergency and surveillance to tourist guidance and entertainment. For instance, cars can be used to provide traffic information services that alert drivers to upcoming traffic jams, and phones to provide shopping recommender services that inform users of special offers at the mall. Urbanets cannot be programmed using traditional distributed computing models, which assume underlying networks with functionally homogeneous nodes, stable configurations, and known delays. Conversely, Urbanets have functionally heterogeneous nodes, volatile configurations, and unknown delays. Instead, solutions developed for sensor networks and mobile ad hoc networks can be leveraged to provide novel architectures that address Urbanet-specific requirements, while providing useful abstractions that hide the network complexity from the programmer. This dissertation presents two middleware architectures that can support mobile sensing applications in Urbanets. Contory offers a declarative programming model that views Urbanets as a distributed sensor database and exposes an SQL-like interface to developers. Context-aware Migratory Services provides a client-server paradigm in which services are capable of migrating to different nodes in the network in order to maintain a continuous and semantically correct interaction with clients. Compared to previous approaches to supporting mobile sensing urban applications, our architectures are entirely distributed and do not assume constant availability of Internet connectivity. In addition, they allow on-demand collection of sensor data with the accuracy and at the frequency required by each application. These architectures have been implemented in Java and tested on smart phones. They have proved successful in supporting several prototype applications, and experimental results obtained in ad hoc networks of phones have demonstrated their feasibility with reasonable performance in terms of latency, memory, and energy consumption.
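Contory's declarative programming model treats the surrounding Urbanet as a distributed sensor database queried through an SQL-like interface. The sketch below only mimics that style of interaction: the query syntax, class names and sensor sources are illustrative assumptions, not Contory's actual API or grammar.

```python
import random
import re
import time

class LocalSensors:
    """Stand-in for on-device sensing; a real node could also pull readings
    from nearby devices over an ad hoc network or from infrastructure sensors."""
    def read(self, sensor_type: str) -> float:
        return {"temperature": random.uniform(15, 30),
                "noise": random.uniform(30, 90)}.get(sensor_type, 0.0)

class ContextQueryEngine:
    """Executes a tiny SQL-like context query of the (made-up) form:
    SELECT <sensor> FROM urbanet WHERE <attr> < <value> FRESHNESS <seconds>"""
    QUERY_RE = re.compile(
        r"SELECT (\w+) FROM urbanet(?: WHERE (\w+) < ([\d.]+))?(?: FRESHNESS (\d+))?",
        re.IGNORECASE)

    def __init__(self, sensors: LocalSensors):
        self.sensors = sensors

    def execute(self, query: str):
        m = self.QUERY_RE.fullmatch(query.strip())
        if not m:
            raise ValueError(f"cannot parse query: {query}")
        sensor, where_attr, where_val, freshness = m.groups()
        # FRESHNESS would bound the age of cached readings in a real system;
        # here every reading is taken on demand and just reported back.
        value = self.sensors.read(sensor)
        result = {"sensor": sensor, "value": value, "timestamp": time.time(),
                  "freshness_s": int(freshness) if freshness else None}
        # Apply the WHERE clause as a simple threshold predicate.
        if where_attr and where_attr == sensor and value >= float(where_val):
            return []            # reading does not satisfy the predicate
        return [result]

if __name__ == "__main__":
    engine = ContextQueryEngine(LocalSensors())
    print(engine.execute(
        "SELECT temperature FROM urbanet WHERE temperature < 25 FRESHNESS 60"))
```

The migratory-services model addresses the complementary case: instead of pulling data to the query, the service itself moves to a better-placed node so that its interaction with the client continues uninterrupted.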
Abstract:
In recent years, XML has been widely adopted as a universal format for structured data. A variety of XML-based systems have emerged, most prominently SOAP for Web services, XMPP for instant messaging, and RSS and Atom for content syndication. This popularity is helped by the excellent support for XML processing in many programming languages and by the variety of XML-based technologies for more complex needs of applications. Concurrently with this rise of XML, there has also been a qualitative expansion of the Internet's scope. Namely, mobile devices are becoming capable enough to be full-fledged members of various distributed systems. Such devices are battery-powered, their network connections are based on wireless technologies, and their processing capabilities are typically much lower than those of stationary computers. This dissertation presents work performed to try to reconcile these two developments. XML as a highly redundant text-based format is not obviously suitable for mobile devices that need to avoid extraneous processing and communication. Furthermore, the protocols and systems commonly used in XML messaging are often designed for fixed networks and may make assumptions that do not hold in wireless environments. This work identifies four areas of improvement in XML messaging systems: the programming interfaces to the system itself and to XML processing, the serialization format used for the messages, and the protocol used to transmit the messages. We show a complete system that improves the overall performance of XML messaging through consideration of these areas. The work is centered on actually implementing the proposals in a form usable on real mobile devices. The experimentation is performed on actual devices and real networks using the messaging system implemented as a part of this work. The experimentation is extensive and, due to using several different devices, also provides a glimpse of what the performance of these systems may look like in the future.
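One of the improvement areas identified above, the programming interface to XML processing, is commonly addressed on constrained devices with event- or pull-style parsing that avoids building a full document tree in memory. The sketch below illustrates that general idea with Python's standard iterparse; it is not the dissertation's own interface, and the message structure is a made-up example.

```python
import io
import xml.etree.ElementTree as ET

# A small SOAP-like message; on a mobile device such messages arrive over a
# wireless link and should be processed without materializing a full DOM tree.
MESSAGE = b"""<env:Envelope xmlns:env="http://www.w3.org/2003/05/soap-envelope">
  <env:Body>
    <readings>
      <reading sensor="temperature" value="21.5"/>
      <reading sensor="humidity" value="58"/>
    </readings>
  </env:Body>
</env:Envelope>"""

def stream_readings(xml_bytes: bytes):
    """Pull readings out of the message incrementally, discarding each element
    as soon as it has been handled to keep memory use bounded."""
    for event, elem in ET.iterparse(io.BytesIO(xml_bytes), events=("end",)):
        if elem.tag == "reading":
            yield elem.get("sensor"), float(elem.get("value"))
            elem.clear()          # drop the processed subtree

if __name__ == "__main__":
    for sensor, value in stream_readings(MESSAGE):
        print(sensor, value)
    # The textual redundancy that makes XML verbose also compresses well,
    # which is one reason alternative serialization formats are of interest
    # for messaging over slow wireless links.
    import zlib
    print(len(MESSAGE), "bytes raw,", len(zlib.compress(MESSAGE)), "bytes deflated")
```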