Abstract:
Sensor networks represent an attractive tool to observe the physical world. Networks of tiny sensors can be used to detect a fire in a forest, to monitor the level of pollution in a river, or to check on the structural integrity of a bridge. Application-specific deployments of static-sensor networks have been widely investigated. Commonly, these networks involve a centralized data-collection point and no sharing of data outside the organization that owns it. Although this approach can accommodate many application scenarios, it significantly deviates from the pervasive computing vision of ubiquitous sensing, where user applications seamlessly access, anytime and anywhere, data produced by sensors embedded in the surroundings. With the ubiquity and ever-increasing capabilities of mobile devices, urban environments can help give substance to the ubiquitous sensing vision through Urbanets, spontaneously created urban networks. Urbanets consist of mobile multi-sensor devices, such as smart phones and vehicular systems, public sensor networks deployed by municipalities, and individual sensors incorporated in buildings, roads, or daily artifacts. My thesis is that "multi-sensor mobile devices can be successfully programmed to become the underpinning elements of an open, infrastructure-less, distributed sensing platform that can bring sensor data out of their traditional closed-loop networks into everyday urban applications". Urbanets can support a variety of services ranging from emergency and surveillance to tourist guidance and entertainment. For instance, cars can be used to provide traffic information services that alert drivers to upcoming traffic jams, and phones to provide shopping recommender services that inform users of special offers at the mall. Urbanets cannot be programmed using traditional distributed computing models, which assume underlying networks with functionally homogeneous nodes, stable configurations, and known delays. In contrast, Urbanets have functionally heterogeneous nodes, volatile configurations, and unknown delays. Instead, solutions developed for sensor networks and mobile ad hoc networks can be leveraged to provide novel architectures that address Urbanet-specific requirements, while providing useful abstractions that hide the network complexity from the programmer. This dissertation presents two middleware architectures that can support mobile sensing applications in Urbanets. Contory offers a declarative programming model that views Urbanets as a distributed sensor database and exposes an SQL-like interface to developers. Context-aware Migratory Services provides a client-server paradigm in which services are capable of migrating to different nodes in the network in order to maintain a continuous and semantically correct interaction with clients. Compared to previous approaches to supporting mobile sensing urban applications, our architectures are entirely distributed and do not assume constant availability of Internet connectivity. In addition, they allow on-demand collection of sensor data with the accuracy and at the frequency required by each application. These architectures have been implemented in Java and tested on smart phones. They have proved successful in supporting several prototype applications, and experimental results obtained in ad hoc networks of phones have demonstrated their feasibility with reasonable performance in terms of latency, memory, and energy consumption.
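To make the declarative programming model concrete, the sketch below shows the kind of SQL-like, on-demand sensor query the abstract describes. It is purely illustrative: the ContoryEngine interface, the query keywords, and the sampling clauses are assumptions made for this example, not the middleware's actual API.

```java
// Illustrative only: Contory's actual API is not reproduced here. This sketch assumes
// a hypothetical ContoryEngine interface to show the style of an SQL-like, on-demand
// sensor query with an explicit sampling period and lifetime, as described above.
import java.util.List;
import java.util.Map;

public class ContoryStyleQueryExample {

    /** Hypothetical query handle; names and signatures are assumptions, not Contory's API. */
    interface ContoryEngine {
        List<Map<String, Object>> execute(String declarativeQuery);
    }

    public static void main(String[] args) {
        // Declarative, SQL-like request: collect temperature readings from nearby nodes,
        // sampled every 30 seconds for 10 minutes, only where the value exceeds a threshold.
        String query =
            "SELECT nodeId, temperature "
          + "FROM sensors "
          + "WHERE temperature > 30 "
          + "SAMPLE PERIOD 30s "
          + "LIFETIME 10min";

        System.out.println("Query that a Contory-style middleware could accept:");
        System.out.println(query);
        // A real deployment would hand the query to the middleware, e.g.:
        // List<Map<String, Object>> readings = engine.execute(query);
    }
}
```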
Abstract:
Mobile RFID services for the Internet of Things can be created by using RFID as an enabling technology in mobile devices. Humans, devices, and things are the content providers and users of these services. Mobile RFID services can either be provided on mobile devices as stand-alone services or be combined with end-to-end systems. When different service scenarios are considered, there is more than one possible architectural solution in the network, mobile, and back-end server areas. By combining these solutions judiciously and applying software architecture and engineering principles, a combined solution can be formulated for specific application use cases. This thesis illustrates these ideas and shows how the solutions can be applied to real-world use case scenarios; a case study provides further evidence.
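The stand-alone versus end-to-end distinction can be illustrated with a small sketch. Everything below is hypothetical: the TagReader interface, the tag identifier, and the back-end URL are illustrative assumptions rather than an architecture taken from the thesis.

```java
// A minimal sketch of the two service styles mentioned above: a stand-alone service that
// resolves an RFID tag locally on the device, and an end-to-end service that forwards the
// tag identifier to a back-end server. All names (TagReader, LOCAL_CATALOGUE, backendUrl)
// are illustrative assumptions, not an API from the thesis.
import java.util.Map;

public class MobileRfidServiceSketch {

    /** Hypothetical abstraction over the phone's RFID/NFC reader. */
    interface TagReader {
        String readTagId();
    }

    // Stand-alone: a small catalogue shipped with the application.
    static final Map<String, String> LOCAL_CATALOGUE =
        Map.of("urn:epc:id:sgtin:0614141.107346.2017", "Sample product, local description");

    static String resolveLocally(String tagId) {
        return LOCAL_CATALOGUE.getOrDefault(tagId, "unknown tag");
    }

    // End-to-end: the tag identifier would be sent to a back-end resolver over the network.
    static String resolveViaBackend(String tagId) {
        String backendUrl = "https://backend.example.org/resolve?tag=" + tagId; // placeholder
        return "would query " + backendUrl;
    }

    public static void main(String[] args) {
        TagReader reader = () -> "urn:epc:id:sgtin:0614141.107346.2017"; // stubbed read
        String tagId = reader.readTagId();
        System.out.println("Stand-alone: " + resolveLocally(tagId));
        System.out.println("End-to-end:  " + resolveViaBackend(tagId));
    }
}
```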
Abstract:
This thesis is an exploratory case study that aims to understand the attitudes affecting the adoption of mobile self-services. The study used a demo mobile self-service with which consumers could make address changes; the service was branded with a large and trusted Finnish brand. The theoretical framework consisted of adoption theories of technology, adoption theories of self-service, and literature concerning mobile services. The reviewed adoption theories of both technology and self-service have their foundation in innovation diffusion theory (IDT) or in the theory of reasoned action and the theory of planned behaviour (TRA/TPB). Based on the reviewed theories, an initial framework was created. The empirical data were collected through three computer-aided group interview sessions with a total of 32 respondents. The data analysis started from the premises of the initial framework; based on the empirical data, the framework was continually reviewed and altered, and the data were recoded accordingly. The result of this thesis is a list of attitudinal factors that affect the adoption of a mobile self-service either positively or negatively. The factors found to affect attitudes towards adoption positively were that the service was time- and place-independent and saved time; most, but not all, respondents also had a positive attitude towards adoption owing to ease of use and mental compatibility with the service. Factors that affected adoption negatively were lack of technical compatibility, the perceived risk of high costs, and the risk of malicious software. The identified factors were triangulated with respect to existing literature and general attitudes towards mobile services.
Abstract:
Migration within the European Union (EU) has increased since the Union was established. Community pharmacies provide open access to health care services and can be the first, most frequently used or even the only contact with a nation's health care system among mobile community residents. In some of the mass-migration areas in Southern Europe, most of the customers may be mobile citizens of foreign background. This has not always been taken into consideration in the development of community pharmacy services. Mobile patients have been on the EU's health policy agenda, but they have seldom been mentioned in the context of community pharmacies. In most of the EU member states, governments control the specific legislation concerning community pharmacies, and there is no harmonised pharmaceutical policy or consistent minimum standards for community pharmacy services in the EU. The aim of this study was to understand medication use, the role of community pharmacies and the symptom mitigation process of mobile community residents. Finns living in Spain were used as an example to examine how community pharmacies in an EU member state meet the needs of mobile community residents. The data were collected by a survey in 2002 (response rate 53%, n = 533) and by five focus group discussions in 2006 (n = 30). A large number (70%) of the respondents had moved to Spain for health reasons and suffered from chronic morbidity. Community pharmacies had an important role in the healthcare of mobile community residents, and the respondents were mostly satisfied with these services. However, several medication safety risks related to community pharmacy practices were identified: 1) Availability of prescription medicines without prescription (e.g., antibiotics, sleeping pills, Viagra®, asthma medications, cardiovascular medicines, psoriasis medicines and analgesics); 2) Irrational use of medicines (e.g., 41% of antibiotic users had bought their antibiotics without a prescription, and the most common reasons for antibiotic self-medication were symptomatic common colds and sore throats); 3) Language barriers between patients and pharmacy professionals; 4) Lack of medication counselling; 5) Unqualified pharmacy personnel providing pharmacotherapy. A fifth of the respondents reported experiencing problems during pharmacy visits in Spain, and the lack of a common language was the source of most of these problems. The findings of this study indicate that regulations and their enforcement can play a crucial role in actually assuring the rational and safe use of medicines. These results can be used in the development of pharmaceutical and healthcare policies in the EU. It is important to define consistent minimum standards for community pharmacy services in the EU; the increasing number of mobile community residents could then access safe and high-quality health care services, including community pharmacy services, in every member state within the EU.
Abstract:
This thesis examines the professionalism of Finnish television subtitlers, their translation process, and the effects of digital subtitling software on the subtitling process from the perspective of professional subtitlers. The digitalisation of Finnish television has also caused upheavals in the subtitling field, as the video material to be subtitled is now delivered to translation agencies and subtitlers in digital form. The theoretical part covers translation and subtitling research and training in Finland, professional skill and professionalism, and translation aids. Subtitling is presented as a specialised form of translation; it should be noted, however, that translation is only one phase of the subtitling process. The theoretical part concludes with a discussion of the everyday work and current professional field of Finnish television subtitlers: subtitlers work under a wide variety of employment terms, and quality criteria may have to be re-evaluated. The empirical part begins by noting that Finnish television subtitlers have been interviewed surprisingly little and, drawing on Jääskeläinen's ideas, that much in the subtitling field remains unstudied, particularly the Finnish subtitling process. The subjects of the study are translators who produce television subtitles professionally. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specialising in subtitling; using both multiple-choice and open questions, it surveyed their professionalism, working methods, translation and subtitling process, professional pride and identity, time management, and the digital subtitling software they use. The study revealed that nearly a third of the respondents have a neutral or even negative view of their profession; what these subtitlers have in common is that all of them have less than five years of experience in the field. The majority of respondents, however, are proud to work as professionals of the Finnish language. In the questionnaire, the subtitling process was divided into a preview phase, a translation phase, a timing phase and a checking phase. The subtitlers were asked, among other things, to estimate the total duration of their subtitling process; the durations varied greatly, and at least part of the variation correlated with experience. A good half of the respondents have acquired digital subtitling software of their own, while some still do the timing at the translation agency, partly because of the high cost of the software. Digital software has changed the subtitling process and working practices, as videocassette recorders and television sets have been replaced by the computer alone. It is now possible to work remotely from distant locations, to translate and time in alternation, or to pre-time first and then translate. Digital technology has thus enabled changes in the subtitling process and alternative working methods, but not all of these methods necessarily benefit the subtitler. The traditional subtitling process (preview, marking subtitle divisions in the script, translating and composing the subtitles, corrections and a final check) still appears to be the most efficient. Although working practices differ, the overall impression is that, after the initial stumbles of digitalisation, subtitlers' work has become more efficient.
Abstract:
Despite the existence of antibiotics, infectious diseases remain among the leading causes of death in the world. Staphylococci cause many infections of varying severity, although they can also exist peacefully in many parts of the human body. Most often Staphylococcus aureus colonises the nose, and that colonisation is considered to be a risk factor for the spread of this bacterium. S. aureus is considered the most important Staphylococcus species. It poses a challenge to the field of medicine, and one of the most problematic aspects is the drastic increase of methicillin-resistant S. aureus (MRSA) strains in hospitals and the community worldwide, including Finland. In addition, most clinical coagulase-negative staphylococcus (CNS) isolates express resistance to methicillin. Methicillin resistance in S. aureus is caused by the mecA gene, which encodes an extra penicillin-binding protein (PBP) 2a. The mecA gene is found in a mobile genomic island called the staphylococcal cassette chromosome mec (SCCmec). The SCCmec consists of the mec gene complex and the cassette chromosome recombinase (ccr) gene complex. The areas of the SCCmec element outside the ccr and mec complexes are known as the junkyard (J) regions. So far, eight types of SCCmec (SCCmec I to SCCmec VIII) and a number of variants have been described. The SCCmec island is an acquired element in S. aureus, and it appears that CNS might serve as a reservoir of SCCmec, providing S. aureus with resistance elements. SCCmec is known to exist only in staphylococci. The aim of the present study was to investigate the horizontal transfer of SCCmec between S. aureus and CNS. One specific aim was to study whether or not some methicillin-sensitive S. aureus (MSSA) strains are more inclined than others to receive the SCCmec. This was done by comparing the genetic background of clinical MSSA isolates collected in the health care facilities of the Helsinki and Uusimaa Hospital District in 2001 to representatives of the epidemic MRSA (EMRSA) genotypes encountered in Finland during 1992-2004. The majority of the clinical MSSA strains were related to the EMRSA strains. This finding suggests that horizontal transfer of SCCmec from unknown donor(s) to several MSSA background genotypes has occurred in Finland. The molecular characteristics of representative clinical methicillin-resistant S. epidermidis (MRSE) isolates recovered in Finnish hospitals between 1990 and 1998 were also studied, examining their genetic relatedness to each other and to internationally recognised MRSE clones, in order to ascertain the common traits of the SCCmec elements in MRSE and MRSA. The clinical MRSE strains were genetically related to each other; eleven PFGE types were associated with sequence type ST2, which has been identified worldwide. A single MRSE strain may possess two SCCmec types, III and IV, which were also recognised among the MRSA strains. Moreover, six months after the onset of an outbreak of MRSA possessing SCCmec type V in a long-term care facility (LTCF) in Northern Finland in 2003, the SCCmec elements of nasally carried methicillin-resistant staphylococci were studied. Among the residents of the LTCF, nasal carriage of MR-CNS was common, with extreme diversity of SCCmec types, and MRSE was the most prevalent CNS species. Horizontal transfer of SCCmec elements is suggested by the sharing of SCCmec type V between MRSA and MRSE in the same person. Additionally, the SCCmec element of clinical human S. sciuri isolates was studied; some of the SCCmec regions were present in S. sciuri, and the pls gene was common in these isolates. This finding supports the hypothesis of genetic exchange occurring between staphylococcal species. Evaluation of the epidemiology of methicillin-resistant staphylococcal colonisation is necessary in order to understand the apparent emergence of these strains and to develop appropriate control strategies. SCCmec typing is essential for understanding the emergence of MRSA strains from CNS, considering that MR-CNS may represent the gene pool for the continuous creation of new SCCmec types from which MRSA might originate.
Abstract:
The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feeding to milking. The increase in automation is a consequence of increasing farm sizes, the demand for more efficient production and the growth of labour costs. As the level of automation increases, the time that the cattle keeper uses for monitoring animals often decreases. This has created a need for systems for automatically monitoring the health of farm animals. The popularity of milking robots also offers a new and unique possibility to monitor animals in a single confined space up to four times daily. Lameness is a crucial welfare issue in the modern dairy industry. Limb disorders cause serious welfare, health and economic problems, especially in loose housing of cattle. Lameness causes losses in milk production and leads to early culling of animals. These costs could be reduced with early identification and treatment. At present, only a few methods for automatically detecting lameness have been developed, and the most common methods used for lameness detection and assessment are various visual locomotion scoring systems. The problem with locomotion scoring is that it requires experience to be conducted properly, is labour-intensive as an on-farm method, and yields subjective results. A four-balance system for measuring the leg load distribution of dairy cows during milking in order to detect lameness was developed and set up at the University of Helsinki research farm Suitia. The leg weights of 73 cows were successfully recorded during almost 10,000 robotic milkings over a period of 5 months. The cows were locomotion scored weekly, and the lame cows were inspected clinically for hoof lesions. Unsuccessful measurements, caused by cows standing outside the balances, were removed from the data with a special algorithm, and the mean leg loads and the number of kicks during milking were calculated. In order to develop an expert system to automatically detect lameness cases, a model was needed. A probabilistic neural network (PNN) classifier model was chosen for the task. The data were divided into two parts, and 5,074 measurements from 37 cows were used to train the model. The model was evaluated for its ability to detect lameness in the validation dataset, which had 4,868 measurements from 36 cows. The model was able to classify 96% of the measurements correctly as sound or lame, and 100% of the lameness cases in the validation data were identified. The proportion of measurements causing false alarms was 1.1%. The developed model has the potential to be used for on-farm decision support and can be used in a real-time lameness monitoring system.
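To make the classification step concrete, the sketch below implements a generic probabilistic neural network (PNN) decision rule: each class's score is an average of Gaussian kernels centred on its training patterns, and the sample is assigned to the higher-scoring class. The feature layout (four mean leg-load shares plus a kick count), the toy training patterns and the smoothing parameter are illustrative assumptions, not the data or parameters used in the thesis.

```java
// A minimal sketch of a PNN classifier of the kind described above. Features and
// training patterns below are invented for illustration only.
import java.util.List;

public class PnnLamenessSketch {

    /** Gaussian (Parzen) kernel between a sample and one training pattern. */
    static double kernel(double[] x, double[] pattern, double sigma) {
        double sq = 0.0;
        for (int i = 0; i < x.length; i++) {
            double d = x[i] - pattern[i];
            sq += d * d;
        }
        return Math.exp(-sq / (2.0 * sigma * sigma));
    }

    /** Average kernel activation over all training patterns of one class. */
    static double classActivation(double[] x, List<double[]> patterns, double sigma) {
        double sum = 0.0;
        for (double[] p : patterns) {
            sum += kernel(x, p, sigma);
        }
        return sum / patterns.size();
    }

    public static void main(String[] args) {
        double sigma = 0.15; // smoothing width, would be tuned on training data

        // Toy patterns: {LF, RF, LH, RH leg-load share, kicks per milking}, scaled to [0, 1].
        List<double[]> sound = List.of(
            new double[] {0.26, 0.25, 0.24, 0.25, 0.0},
            new double[] {0.24, 0.26, 0.25, 0.25, 0.1});
        List<double[]> lame = List.of(
            new double[] {0.31, 0.30, 0.28, 0.11, 0.6},   // one hind leg clearly unloaded
            new double[] {0.12, 0.30, 0.29, 0.29, 0.5});  // one front leg clearly unloaded

        double[] newMilking = {0.13, 0.29, 0.29, 0.29, 0.4};
        double pSound = classActivation(newMilking, sound, sigma);
        double pLame = classActivation(newMilking, lame, sigma);
        System.out.println(pLame > pSound ? "classified as lame" : "classified as sound");
    }
}
```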
Abstract:
In recent years, XML has been widely adopted as a universal format for structured data. A variety of XML-based systems have emerged, most prominently SOAP for Web services, XMPP for instant messaging, and RSS and Atom for content syndication. This popularity is helped by the excellent support for XML processing in many programming languages and by the variety of XML-based technologies for the more complex needs of applications. Concurrently with this rise of XML, there has also been a qualitative expansion of the Internet's scope. Namely, mobile devices are becoming capable enough to be full-fledged members of various distributed systems. Such devices are battery-powered, their network connections are based on wireless technologies, and their processing capabilities are typically much lower than those of stationary computers. This dissertation presents work that aims to reconcile these two developments. XML, as a highly redundant text-based format, is not obviously suitable for mobile devices that need to avoid extraneous processing and communication. Furthermore, the protocols and systems commonly used in XML messaging are often designed for fixed networks and may make assumptions that do not hold in wireless environments. This work identifies four areas of improvement in XML messaging systems: the programming interfaces to the system itself and to XML processing, the serialization format used for the messages, and the protocol used to transmit the messages. We show a complete system that improves the overall performance of XML messaging through consideration of these areas. The work is centered on actually implementing the proposals in a form usable on real mobile devices. The experimentation is performed on actual devices and real networks using the messaging system implemented as a part of this work. The experimentation is extensive and, because several different devices are used, also provides a glimpse of what the performance of these systems may look like in the future.
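The serialization format and messaging system developed in the dissertation are not reproduced here, but the processing-interface concern can be illustrated with the standard Java StAX pull API: the message is read as a stream of events instead of being materialized as a full document tree, which is the kind of trade-off that matters on memory- and battery-constrained devices. The message content below is an invented example.

```java
// Minimal sketch: pull parsing of a small XML message with the standard StAX API,
// avoiding construction of a full in-memory document tree.
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;

public class PullParsingExample {

    public static void main(String[] args) throws XMLStreamException {
        String message =
            "<reading><sensor>temperature</sensor><value>21.5</value></reading>";

        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new StringReader(message));

        // Pull events one at a time; only the currently needed element is materialized.
        while (reader.hasNext()) {
            reader.next();
            if (reader.isStartElement() && "value".equals(reader.getLocalName())) {
                System.out.println("value = " + reader.getElementText());
            }
        }
        reader.close();
    }
}
```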
Abstract:
The Ajax approach has outgrown its origin as shorthand for "Asynchronous JavaScript + XML". Three years after its naming, Ajax has been widely adopted by web applications, and there is consequently growing interest in using those applications on mobile devices. This thesis evaluates the presentational capability and measures the performance of five mobile browsers on the Apple iPhone and the Nokia N95 and N800. Performance is benchmarked through user-experienced response times measured with a stopwatch. Twelve Ajax toolkit examples and eight production-quality applications are targeted, all except one in their real environments; in total, over 1,750 observations are analyzed and included in the appendix. Communication delays are not considered; the network connection type is WLAN. The results indicate that the initial loading time of an Ajax application can often exceed 20 seconds. Content reordering may be used to partially overcome this limitation. Proper testing is the key to success: the selected browsers are capable of presenting Ajax applications if their differing implementations are accounted for, perhaps with the help of a suitable toolkit.
Abstract:
The mobile phone has, as a device, taken the world by storm in the past decade; from only 136 million phones globally in 1996, it is now estimated that by the end of 2008 roughly half of the world's population will own a mobile phone. Over the years, the capabilities of the phones as well as the networks have increased tremendously, reaching the point where the devices are better described as miniature computers than simply mobile phones. The mobile industry is currently undertaking several initiatives to develop new generations of mobile network technologies, technologies that to a large extent focus on offering ever-increasing data rates. This thesis seeks to answer the question of whether the future mobile networks in development and the future mobile services are in sync: taking a forward-looking timeframe of five to eight years, will there be services that need the high-performance new networks being planned? The question is especially pertinent in light of the slower-than-expected takeoff of 3G data services. Current and future mobile services are analyzed from two viewpoints: first, by looking at the gradual, evolutionary development of the services and, second, by seeking to identify potentially revolutionary new mobile services. With information on both current and future mobile networks as well as services, a mapping of network capabilities to service requirements is performed to identify which services will work in which networks. Based on the analysis, it is far from certain whether the new mobile networks, especially those planned for deployment after HSPA, will be needed as soon as they are currently roadmapped. The true service-based demand for "beyond HSPA" technologies may lie many years in the future, or may indeed never materialize, owing to the increasing deployment of local area wireless broadband technologies.
Abstract:
Free and Open Source Software (FOSS) has gained increased interest in the computer software industry, but assessing its quality remains a challenge. FOSS development is frequently carried out by globally distributed development teams, and all stages of development are publicly visible. Several product- and process-level quality factors can be measured using the public data. This thesis presents a theoretical background for software quality and metrics and their application in a FOSS environment. The information available from FOSS projects in three information spaces is presented, and a quality model suitable for use in a FOSS context is constructed. The model includes both process and product quality metrics, and takes into account the tools and working methods commonly used in FOSS projects. A subset of the constructed quality model is applied to three FOSS projects, highlighting both theoretical and practical concerns in implementing automatic metric collection and analysis. The experiment shows that useful quality information can be extracted from the vast amount of data available. In particular, projects vary in their growth rate, complexity, modularity and team structure.
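The quality model constructed in the thesis is not reproduced here, but the idea of automatic metric collection from publicly available project data can be sketched with a small tool that walks a locally checked-out source tree and reports one simple product metric, non-blank lines of code per Java file. The command-line invocation and the choice of metric are assumptions made for this illustration.

```java
// Minimal sketch of automatic collection of one product-size metric from a local checkout.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class LocMetricCollector {

    public static void main(String[] args) throws IOException {
        // Path to a checked-out project tree; defaults to the current directory.
        Path root = Path.of(args.length > 0 ? args[0] : ".");

        try (Stream<Path> files = Files.walk(root)) {
            files.filter(p -> p.toString().endsWith(".java"))
                 .forEach(LocMetricCollector::report);
        }
    }

    private static void report(Path file) {
        try (Stream<String> lines = Files.lines(file)) {
            // Count non-blank lines as a rough size/complexity proxy.
            long loc = lines.filter(l -> !l.isBlank()).count();
            System.out.println(loc + "\t" + file);
        } catch (IOException e) {
            System.err.println("skipped " + file + ": " + e.getMessage());
        }
    }
}
```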
Abstract:
Information visualization is the process of constructing a visual presentation of abstract quantitative data. The characteristics of visual perception enable humans to recognize patterns, trends and anomalies inherent in the data with little effort in a visual display; such properties of the data are likely to be missed in a purely text-based presentation. Visualizations are therefore widely used in contemporary business decision support systems. Visual user interfaces called dashboards are tools for reporting the status of a company and its business environment to facilitate business intelligence (BI) and performance management activities. In this study, we review research on the principles of human visual perception and information visualization as well as the application of visualization in a business decision support system. A review of current BI software products reveals that the visualizations included in them are often quite ineffective in communicating important information. Based on the principles of visual perception and information visualization, we summarize a set of design guidelines for creating effective visual reporting interfaces.