920 results for Web 2.0 Applications in Enterprises
Abstract:
Structured abstract: Purpose: LibraryThing is a Web 2.0 tool that allows users to catalogue books using data drawn from sources such as Amazon and the Library of Congress, and offers facilities such as tagging and interest groups. This study evaluates whether LibraryThing is a valuable tool for libraries to use for promotional and user engagement purposes. Methodology: This study used a sequential mixed-methods three-phase design: (1) the identification of LibraryThing features for user engagement or promotional purposes, (2) exploratory semi-structured interviews, and (3) a questionnaire. Findings: Several uses of LibraryThing for promotional and user engagement purposes were identified. The most popular reason libraries used LibraryThing was to promote the library or library stock, with most respondents using it specifically to highlight collections of books. Monitoring of patron usage was low and many respondents had not received any feedback. LibraryThing was commonly reported as being easy to use, remotely accessible, and low cost, whilst its main drawbacks were the 200-book limit for free accounts and its being a third-party site. The majority of respondents felt LibraryThing was a useful tool for libraries. Practical implications: LibraryThing has most value as a promotional tool for libraries. Libraries should actively monitor patron usage of their LibraryThing account or request user feedback to ensure that LibraryThing provides a truly valuable service for their library. Originality: There is little research on the value of LibraryThing for libraries, or on librarians' perceptions of LibraryThing as a Web 2.0 tool.
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples which, although they have fairly indistinguishable features in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity for the establishment of complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis. Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large datasets.
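The abstract does not give implementation details, but the core of an extreme learning machine is small enough to sketch. Below is a minimal complex-valued ELM in Python/NumPy, assuming amplitude and phase are combined into complex feature vectors as A·exp(iφ); the function names (train_celm, predict_celm), the tanh activation, and the toy data are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_celm(X, T, n_hidden=100):
    # X: (n_samples, n_features) complex spectra; T: one-hot targets.
    n_features = X.shape[1]
    # Random, fixed complex input weights and biases.
    W = (rng.standard_normal((n_features, n_hidden))
         + 1j * rng.standard_normal((n_features, n_hidden)))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # fully complex hidden activations
    beta = np.linalg.pinv(H) @ T      # closed-form output weights
    return W, b, beta

def predict_celm(X, W, b, beta):
    scores = np.tanh(X @ W + b) @ beta
    return np.argmax(scores.real, axis=1)   # decide on the real part

# Toy usage: 200 spectra, 64 spectral bins, 6 powder classes.
amp = rng.random((200, 64))
phase = rng.random((200, 64)) * 2 * np.pi
X = amp * np.exp(1j * phase)              # assumed amplitude-phase encoding
y = rng.integers(0, 6, 200)
W, b, beta = train_celm(X, np.eye(6)[y])
print("train accuracy:", (predict_celm(X, W, b, beta) == y).mean())
```

The sketch shows why ELMs are fast on very large data sets: the only trained parameters are the output weights beta, obtained in closed form via a pseudo-inverse rather than by iterative optimisation.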
Abstract:
Techniques to retrieve reliable images from complicated objects are described, overcoming problems introduced by uneven surfaces, giving enhanced depth resolution and improving image contrast. The techniques are illustrated with application to THz imaging of concealed wall paintings.
Abstract:
This paper presents a novel mobile sink area allocation scheme for consumer-based mobile robotic devices, with a proven application to robotic vacuum cleaners. In the home or office environment, rooms are physically separated by walls and an automated robotic cleaner cannot make a decision about which room to move to in order to perform the cleaning task. Likewise, state-of-the-art cleaning robots do not move to other rooms without direct human interference. In a smart home monitoring system, sensor nodes may be deployed to monitor each separate room. In this work, a quad-tree-based data gathering scheme is proposed whereby the mobile sink physically moves through every room and logically links all separated sub-networks together. The proposed scheme sequentially collects data from the monitored environment and transmits the information back to a base station. Based on the sensor nodes' information, the base station can command a cleaning robot to move to a specific location in the home environment. The quad-tree-based data gathering scheme minimizes the data gathering tour length and time through the efficient allocation of data gathering areas. A calculated shortest-path data gathering tour can efficiently be allocated to the robotic cleaner to complete the cleaning task within a minimum time period. Simulation results show that the proposed scheme can effectively allocate and control the cleaning area assigned to the robot vacuum cleaner without any direct interference from the consumer. The performance of the proposed scheme is then validated with a set of practical sequential data gathering tours in a typical office/home environment.
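The abstract describes the scheme only at a high level; as a rough illustration, the sketch below partitions a square floor plan with a quad tree until each cell holds a bounded number of sensor nodes, then orders the resulting cell centres into a data-gathering tour with a nearest-neighbour heuristic. Everything here (the capacity parameter, the greedy tour, the toy coordinates) is an assumption for illustration; the paper's actual allocation and shortest-path computation may differ.

```python
import math

def quadtree_cells(nodes, x0, y0, size, capacity=4):
    """Recursively split a square region until each cell holds at most
    `capacity` sensor nodes; return the centres of non-empty cells.
    A simplified stand-in for the paper's quad-tree area allocation."""
    inside = [(x, y) for (x, y) in nodes
              if x0 <= x < x0 + size and y0 <= y < y0 + size]
    if not inside:
        return []
    if len(inside) <= capacity:
        return [(x0 + size / 2, y0 + size / 2)]
    half = size / 2
    cells = []
    for dx in (0, half):
        for dy in (0, half):
            cells += quadtree_cells(inside, x0 + dx, y0 + dy, half, capacity)
    return cells

def greedy_tour(start, stops):
    """Nearest-neighbour tour over the cell centres: a simple heuristic
    for a short data-gathering tour, not an exact shortest path."""
    tour, current, todo = [start], start, list(stops)
    while todo:
        nxt = min(todo, key=lambda p: math.dist(current, p))
        todo.remove(nxt)
        tour.append(nxt)
        current = nxt
    return tour

# Hypothetical sensor positions in a 10 m x 10 m home.
nodes = [(1, 1), (2, 9), (8, 8), (9, 1), (8.5, 1.5), (1.5, 8.5)]
stops = quadtree_cells(nodes, 0, 0, 10, capacity=2)
print(greedy_tour((0, 0), stops))
```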
Abstract:
The latest coupled configuration of the Met Office Unified Model (Global Coupled configuration 2, GC2) is presented. This paper documents the model components which make up the configuration (although the scientific description of these components is detailed elsewhere) and provides a description of the coupling between the components. The performance of GC2 in terms of its systematic errors is assessed using a variety of diagnostic techniques. The configuration is intended to be used by the Met Office and collaborating institutes across a range of timescales, with the seasonal forecast system (GloSea5) and climate projection system (HadGEM) being the initial users. In this paper GC2 is compared against the model currently used operationally in those two systems. Overall GC2 is shown to be an improvement on the configurations used currently, particularly in terms of modes of variability (e.g. mid-latitude and tropical cyclone intensities, the Madden–Julian Oscillation and El Niño Southern Oscillation). A number of outstanding errors are identified with the most significant being a considerable warm bias over the Southern Ocean and a dry precipitation bias in the Indian and West African summer monsoons. Research to address these is ongoing.
Abstract:
Background: 29 autoimmune diseases, including Rheumatoid Arthritis, gout, Crohn's Disease, and Systemic Lupus Erythematosus, affect 7.6-9.4% of the population. While effective therapy is available, many patients do not follow treatment or use medications as directed. Digital health and Web 2.0 interventions have demonstrated much promise in increasing medication and treatment adherence, but to date many Internet tools have proven disappointing. In fact, most digital interventions continue to suffer from high attrition in patient populations, are burdensome for healthcare professionals, and have relatively short life spans. Objective: Digital health tools have traditionally centered on the transformation of existing interventions (such as diaries, trackers, stage-based or cognitive behavioral therapy programs, coupons, or symptom checklists) to electronic format. Advanced digital interventions have also incorporated attributes of Web 2.0 such as social networking, text messaging, and the use of video. Despite these efforts, there has been little measurable impact on non-adherence for illnesses that require medical interventions, and research must look to other strategies or development methodologies. As a first step in investigating the feasibility of developing such a tool, the objective of the current study is to systematically rate factors of non-adherence that have been reported in past research studies. Methods: Grounded Theory, recognized as a rigorous method that facilitates the emergence of new themes through systematic analysis, data collection and coding, was used to analyze quantitative, qualitative and mixed-method studies addressing the following autoimmune diseases: Rheumatoid Arthritis, gout, Crohn's Disease, Systemic Lupus Erythematosus, and inflammatory bowel disease. Studies were only included if they contained primary data addressing the relationship with non-adherence. Results: Out of the 27 studies, four non-modifiable and 11 modifiable risk factors were discovered. Over one third of the articles identified the following risk factors as common contributors to medication non-adherence (percent of studies reporting): patients not understanding treatment (44%), side effects (41%), age (37%), dose regimen (33%), and perceived medication ineffectiveness (33%). An unanticipated finding that emerged was the need for risk stratification tools (81%) with patient-centric approaches (67%). Conclusions: This study systematically identifies and categorizes medication non-adherence risk factors in select autoimmune diseases. Findings indicate that patients' understanding of their disease and the role of medication are paramount. An unexpected finding was that the majority of research articles called for the creation of tailored, patient-centric interventions that dispel personal misconceptions about disease, pharmacotherapy, and how the body responds to treatment. To our knowledge, these interventions do not yet exist in digital format. Rather than adopting a systems-level approach, digital health programs should focus on cohorts with heterogeneous needs, and develop tailored interventions based on individual non-adherence patterns.
Abstract:
Background. The aim of this study was to evaluate Ki-67 and Bcl-2 antigen expression in colorectal polyps from women with breast cancer. Methods. A randomized, controlled study was carried out in 35 women, either with or without breast cancer, who had adenomatous colorectal polyps. The patients were divided into two groups: group A (without breast cancer; control group; n = 17) and group B (with breast cancer; study group; n = 18). Immunohistochemistry was performed on the colorectal polyps to evaluate Ki-67 and Bcl-2 antigen expression. Student's t-test and the chi-squared test were used for the statistical analysis of Ki-67 and Bcl-2 expression, respectively. Statistical significance was established as P < 0.05. Results. The mean percentage of Ki-67-stained nuclei in groups A and B was 36.25 ± 2.31 and 59.44 ± 3.34 (SEM), respectively (P < 0.0001), while the percentage of cases with cells expressing Bcl-2 in groups A and B was 23.5% and 77.8%, respectively (P < 0.001). Conclusions. In the present study, there was greater proliferative activity and greater expression of the antiapoptotic protein Bcl-2 in the colorectal polyps of women with breast cancer.
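For readers wanting to reproduce this style of analysis, the two tests named in the abstract map directly onto SciPy calls. In the sketch below the per-patient Ki-67 values are hypothetical stand-ins (only group summaries are reported in the abstract), and the contingency counts are back-calculated from the reported percentages (23.5% of n = 17 and 77.8% of n = 18).

```python
from scipy import stats

# Hypothetical per-patient Ki-67 values (% stained nuclei); the study's
# raw data are not given in the abstract.
ki67_group_a = [30.1, 38.4, 35.2, 41.0, 36.7, 33.9]
ki67_group_b = [55.3, 62.1, 58.8, 64.0, 57.2, 59.9]

# Student's t-test for the continuous Ki-67 percentages.
t_stat, p_ki67 = stats.ttest_ind(ki67_group_a, ki67_group_b)

# Chi-squared test for Bcl-2 counts (rows: group A, group B;
# columns: Bcl-2 positive, Bcl-2 negative), back-calculated from
# the reported 23.5% of 17 and 77.8% of 18.
table = [[4, 13],
         [14, 4]]
chi2, p_bcl2, dof, _ = stats.chi2_contingency(table)

print(f"Ki-67: t = {t_stat:.2f}, p = {p_ki67:.4f}")
print(f"Bcl-2: chi2 = {chi2:.2f}, p = {p_bcl2:.4f}")
```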
Abstract:
The aim of this study was to evaluate Ki-67 and Bcl-2 protein expression in the normal colorectal mucosa adjacent to adenomatous polyps in women with breast cancer. A cross-sectional, controlled study was conducted in 35 women with and without breast cancer who had adenomatous colorectal polyps. The patients were divided into two groups: Group A (a control group of women without breast cancer, n = 18) and Group B (a study group of women with breast cancer, n = 17). A sample of normal colonic mucosa was collected at a distance of 5 cm from the polypoid lesion to evaluate immunohistochemical expression of the Ki-67 and Bcl-2 proteins. Student's t-test and the chi-square test were used to analyse Ki-67 and Bcl-2 expression, respectively. Statistical significance was established at p < 0.05. The mean percentage of Ki-67-stained nuclei in Groups A and B was 25.12 ± 2.08 and 41.50 ± 1.85, respectively (p < 0.001), whereas the percentage of cases with cells expressing Bcl-2 in Groups A and B was 17.6% and 82.4%, respectively (p < 0.003). In the present study, greater proliferative activity and greater expression of the antiapoptotic protein Bcl-2 were found in the normal colorectal mucosa of women with breast cancer.
Abstract:
Combinatorial optimization problems are one of the most important types of problems in operational research. Heuristic and metaheuristic algorithms are widely applied to find good solutions. However, a common problem is that these algorithms do not guarantee that the solution coincides with the optimum and, hence, many solutions to real-world OR problems are afflicted with uncertainty about the quality of the solution. The main aim of this thesis is to investigate the usability of statistical bounds for evaluating the quality of heuristic solutions to large combinatorial problems. The contributions of this thesis are both methodological and empirical. From a methodological point of view, the usefulness of statistical bounds on p-median problems is thoroughly investigated. The statistical bounds perform well in providing informative quality assessments under appropriate parameter settings, and they outperform the commonly used Lagrangian bounds. The statistical bounds are also shown to be comparable with the deterministic bounds in quadratic assignment problems. As to the empirical research, environmental pollution has become a worldwide problem, and transportation can cause a great amount of pollution. A new method for calculating and comparing the CO2 emissions of online and brick-and-mortar retailing is proposed; it leads to the conclusion that online retailing has significantly lower CO2 emissions. Another problem is that the Swedish regional division is under revision, and the effect of borders on public service accessibility concerns both residents and politicians. The analysis shows that borders hinder the optimal location of public services and that, consequently, the highest achievable economic and social utility may not be attained.
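As an illustration of the statistical-bounds idea, the sketch below uses one classical construction (Golden-Alt-style extreme-value fitting): the objective values from independent heuristic restarts are fitted with a three-parameter Weibull distribution whose location parameter estimates the unknown optimum. The heuristic_value stub and all numbers are hypothetical, and the thesis may well use a different estimator; this only shows the general mechanics.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def heuristic_value():
    """Stand-in for one run of a p-median heuristic: returns the
    objective value of the solution it found (hypothetical; here the
    true optimum is 1000 by construction)."""
    return 1000 + rng.weibull(2.0) * 50

# Objective values from independent restarts of the heuristic.
values = np.array([heuristic_value() for _ in range(100)])

# Fit a three-parameter Weibull to the sample of solution values; the
# fitted location parameter estimates the (unknown) optimal value,
# which is the idea behind extreme-value statistical bounds.
shape, loc, scale = stats.weibull_min.fit(values)

best = values.min()
print(f"best heuristic value : {best:.1f}")
print(f"estimated optimum    : {loc:.1f}")
print(f"statistical gap bound: {best - loc:.1f}")
```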
Abstract:
INTRODUCTION With the advent of Web 2.0, social networking websites like Facebook, MySpace and LinkedIn have become hugely popular. According to Nilsen (2009), the top five social networking websites together have almost 250 million unique users globally, with the time people spend on those networks having increased by 63% between 2007 and 2008. Facebook alone saw a massive growth of 566% in number of minutes over the same period. Furthermore, their appeal is clear: they enable users to easily form persistent networks of friends with whom they can interact and share content. Users then use those networks to keep in touch with their current friends and to reconnect with old friends. However, online social network services have rapidly evolved into highly complex systems which contain a large amount of personally salient information derived from large networks of friends. Since that information varies from simple links to music, photos and videos, users not only have to deal with the huge amount of data generated by them and their friends, but also with the fact that it is composed of many different media forms. Users are presented with increasing challenges, especially as the number of friends on Facebook rises. One example of such a problem is when a user performs a simple task like finding a specific friend in a group of 100 or more friends. In that case he would most likely have to go through several pages and make several clicks until he finds the one he is looking for. Another example is a user with more than 100 friends, each of whom makes a status update or another action per day, resulting in some 10 updates per hour to keep up with. That is plausible, especially given Facebook's change of direction to rival Twitter by encouraging users to update their status as they do on Twitter. As a result, to better present the web of information connected to a user, better visualizations are essential. The visualizations used nowadays on social networking sites have not gone through major changes during their lifetimes. The sites have added more functionality and given more tools to their users, but the core of their visualization has not changed. The information is still presented in a flat way, in lists/groups of text and images, which cannot show the extra connections between pieces of information. Those extra connections can give new meaning and insights to the user, allowing him to more easily see whether a piece of content is important to him, along with the information related to it. However, showing extra connections between pieces of information while still allowing the user to easily navigate through them and get the needed information at a quick glance is difficult. The use of color coding, clusters and shapes then becomes essential to attain that objective. Taking into consideration the advances in computer hardware in the last decade and the software platforms available today, there is also the opportunity to take advantage of 3D: we are at a phase where the available hardware and software are ready for the use of 3D on the web. With the extra dimension brought by 3D, visualizations can be constructed that show the content and its related information to the user on the same screen and in a clear way, while also allowing a great deal of interactivity. Another opportunity to create better information visualizations presents itself in the form of open APIs, specifically the ones made available by the social networking sites.
Those APIs allow any developer to create their own applications or sites taking advantage of the huge amount of information there is on those networks. Specifically in this case, they open the door for the creation of new social network visualizations. Nevertheless, the third dimension is by itself not enough to create a better interface for a social networking website; there are some challenges to overcome. One of those challenges is to make the user understand what the system is doing during the interaction. Even though that is important in 2D visualizations, it becomes essential in 3D due to the extra dimension. Overcoming that challenge requires the use of the principles of animation defined by the artists at Walt Disney Studios (Johnston et al., 1995). By applying those principles in the development of the interface, the actions of the system in response to the user's inputs become clear and understandable. Furthermore, a user study needs to be performed so that the users' main goals and motivations while navigating the social network are revealed. Their goals and motivations are important in the construction of an interface that reflects the users' expectations, but they also help in the development of appropriate metaphors. Those metaphors have an important role in the interface because, if correctly chosen, they help the user understand the elements of the interface instead of making him memorize them. The last challenge is the use of 3D visualization on the web, since there have been several attempts to bring 3D to it, mainly with the various versions of VRML, which were doomed to failure due to the hardware limitations of the time. However, in the last couple of years there has been a movement to create the necessary tools to finally allow developers to use 3D in a useful way, using X3D or OpenGL but especially Flash. This thesis argues that there is a need for a better social network visualization that shows all the dimensions of the information connected to the user and that allows him to move through it. But there are several characteristics the new visualization has to possess in order for it to present a real gain in usability to Facebook's users. The first is to have the user's friends at the core of its design, and the second is to make use of the metaphor of circles of friends to separate users into groups according to the order of friendship. To achieve that, several methods have to be used, from the use of 3D to gain an extra dimension for presenting relevant information, to the use of direct manipulation to make the interface comprehensible, predictable and controllable. Moreover, animation has to be used to make all the action on the screen perceptible to the user. Additionally, with the opportunity given by 3D-enabled hardware, the Flash platform (through the use of the Flash 3D engine Papervision3D) and the Facebook platform, everything is in place to make the visualization possible. But even though it is all in place, there are challenges to overcome, like making the system's actions in 3D understandable to the user and creating correct metaphors that allow the user to understand the information and options available to him. This thesis document is divided into six chapters, with Chapter 2 reviewing the literature relevant to the work described in this thesis. Chapter 3 describes the design stage that resulted in the application presented in this thesis.
Chapter 4 covers the development stage, describing the architecture and the components that compose the application. Chapter 5 explains the usability test process and presents and analyzes the results obtained through it. Finally, Chapter 6 presents the conclusions arrived at in this thesis.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Service provisioning is a challenging research area for the design and implementation of autonomic service-oriented software systems. It includes automated QoS management for such systems and their applications. Monitoring, Diagnosis and Repair are three key features of QoS management. This work presents a self-healing Web service-based framework that manages QoS degradation at runtime. Our approach is based on proxies. Proxies act on meta-level communications and extend the HTTP envelope of the exchanged messages with QoS-related parameter values. QoS data are filtered over time and analysed using statistical functions and the Hidden Markov Model. Detected QoS degradations are handled by the proxies. We evaluated our framework using an orchestrated electronic shop application (FoodShop).
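The abstract names the Hidden Markov Model as the analysis tool without giving its structure. A minimal two-state sketch is shown below: a proxy thresholds observed response times into fast/slow symbols, and a hand-rolled Viterbi decoder labels each request as coming from a "normal" or "degraded" service state. All probabilities and the 300 ms threshold are invented for illustration; the framework's actual model and parameters are not described in the abstract.

```python
import numpy as np

# Two hidden states: 0 = normal, 1 = degraded service.
# Observations: response times discretised as 0 = fast, 1 = slow.
start = np.array([0.9, 0.1])
trans = np.array([[0.95, 0.05],    # normal tends to stay normal
                  [0.10, 0.90]])   # degradation persists once it starts
emit = np.array([[0.8, 0.2],       # normal state mostly emits "fast"
                 [0.2, 0.8]])      # degraded state mostly emits "slow"

def viterbi(obs):
    """Most likely hidden-state sequence (log-space Viterbi)."""
    n, k = len(obs), 2
    logp = np.log(start) + np.log(emit[:, obs[0]])
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        scores = logp[:, None] + np.log(trans)   # scores[i, j]: i -> j
        back[t] = scores.argmax(axis=0)
        logp = scores.max(axis=0) + np.log(emit[:, obs[t]])
    path = [int(logp.argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Response times (ms) observed by the proxy, thresholded at 300 ms.
rt = [120, 150, 140, 420, 480, 510, 460, 130, 140]
obs = [1 if ms > 300 else 0 for ms in rt]
print(viterbi(obs))   # flags the run of slow responses as "degraded"
```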
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
It is well known that glucocorticoids induce peripheral insulin resistance in rodents and humans. Here, we investigated the structural and ultrastructural modifications, as well as the proteins involved in beta-cell function and proliferation, in islets from insulin-resistant rats. Adult male Wistar rats were made insulin resistant by daily administration of dexamethasone (DEX; 1 mg/kg, i.p.) for five consecutive days, whilst control (CTL) rats received saline alone. Structural analyses showed a marked hypertrophy of DEX islets, with an increase of 1.7-fold in islet mass and of 1.6-fold in islet density compared with CTL islets (P < 0.05). Ultrastructural evaluation of the islets revealed an increased amount of secretory organelles, such as endoplasmic reticulum and Golgi apparatus, in DEX islets. Mitotic figures were observed in DEX islets at both the structural and ultrastructural levels. Beta-cell proliferation, evaluated at the immunohistochemical level using anti-PCNA (proliferating cell nuclear antigen), showed a 6.4-fold increase in pancreatic beta-cell proliferation in DEX islets compared with CTL islets (P < 0.0001). Increases in insulin receptor substrate-2 (IRS-2), phosphorylated serine-threonine kinase AKT (p-AKT) and cyclin D2 levels, and a decrease in retinoblastoma protein (pRb) levels, were observed in DEX islets compared with CTL islets (P < 0.05). Therefore, during the development of insulin resistance, the endocrine pancreas adapts itself by increasing beta-cell mass and proliferation, resulting in an amelioration of its functions. The potential mechanisms that underlie these events involve activation of the IRS-2/AKT pathway and activation of the cell cycle, mediated by cyclin D2. These adaptations permit the maintenance of glycaemia within near-physiological ranges.