965 results for Content-sensitive services


Relevance:

80.00%

Publisher:

Abstract:

Although the use of recommender systems is already widespread in several application areas, there is still a lack of studies in the accessibility research field. One attempt to apply the benefits of recommender systems to accessibility needs is Vulcanus. The Vulcanus recommender system uses similarity analysis to compare users' trails. In this way, it can take advantage of a user's past behavior and deliver personalized content and services. Vulcanus combines concepts from ubiquitous computing, such as user profiles, context awareness, trail management, and similarity analysis. It uses two different approaches for trail similarity analysis: resource patterns and category patterns. In this work we performed an asymptotic analysis, identifying the complexity of Vulcanus' algorithm. We also propose improvements based on a dynamic programming technique: the average case is improved by a bottom-up approach in which many unnecessary comparisons are skipped. The resulting Vulcanus 2.0 is presented with improvements in its average-case scenario.
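
As a rough illustration of the bottom-up dynamic-programming idea (the similarity measure, data model and all names below are assumptions for illustration, not Vulcanus' published algorithm), the sketch compares two trails by the length of their longest common subsequence of visited resources, filling the table iteratively so that each pair of trail prefixes is evaluated only once:

# Hypothetical sketch: bottom-up dynamic programming for trail similarity.
# A "trail" is modelled as a list of resource identifiers; the score is the
# longest common subsequence (LCS) length normalised by the longer trail.
def trail_similarity(trail_a, trail_b):
    n, m = len(trail_a), len(trail_b)
    if n == 0 or m == 0:
        return 0.0
    # dp[i][j] = LCS length of trail_a[:i] and trail_b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if trail_a[i - 1] == trail_b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1             # extend a common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # reuse earlier results
    return dp[n][m] / max(n, m)

# Example: two users whose trails share most visited resources.
print(trail_similarity(["library", "cafe", "lab", "gym"],
                       ["library", "lab", "gym"]))  # -> 0.75

Because every cell is derived from previously computed cells, the table is filled in O(n*m) time, avoiding the repeated sub-comparisons a naive recursive formulation would perform.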

Relevance:

40.00%

Publisher:

Abstract:

Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a 'noisy' environment such as contemporary social media, is to collect the pertinent information; be that information for a specific study, tweets which can inform emergency services or other responders to an ongoing crisis, or information which gives an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing with the passage of time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the data sets collected and analyzed are preformed, that is they are built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders.

The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders, for manual filtering and possible action. The paper suggests two solutions for this: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source. Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection.

The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions which has the potential to impact on future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, like American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate.

The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art. In this paper we describe strategies for the creation of a social media archive, specifically of tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement's presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study.

The common theme amongst these papers is that of constructing a data set, filtering it for a specific purpose, and then using the resulting information to aid future data collection. The intention is that, through the papers presented and the subsequent discussion, the panel will inform the wider research community not only about the objectives and limitations of data collection, live analytics, and filtering, but also about current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
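
As a minimal sketch of the content-analysis scoring described in the first paper (the keyword lists, weights and threshold are assumptions for illustration), each incoming tweet receives a topic-relevance and an urgency score, and only tweets above a capacity-matched threshold are surfaced to responders:

# Hypothetical content-analysis filter for incoming crisis tweets.
# Keyword lists, weights and the threshold are illustrative assumptions.
TOPIC_KEYWORDS = {"flood": 2.0, "evacuation": 2.0, "#qldfloods": 3.0, "water": 1.0}
URGENCY_CUES = {"help": 2.0, "trapped": 3.0, "urgent": 2.0, "now": 1.0}

def score_tweet(text):
    words = text.lower().split()
    topic = sum(TOPIC_KEYWORDS.get(w, 0.0) for w in words)
    urgency = sum(URGENCY_CUES.get(w, 0.0) for w in words)
    return topic + urgency

def filter_for_responders(tweets, threshold=4.0):
    # Keep only tweets scoring above the threshold matched to responder capacity.
    return [t for t in tweets if score_tweet(t) >= threshold]

tweets = ["Trapped on the roof, water rising, please help #qldfloods",
          "Lovely sunny day at the beach"]
print(filter_for_responders(tweets))  # -> only the first tweet

In practice the keyword and user lists would themselves be refined over time, feeding what is learned back into the next round of data collection.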

Relevance:

40.00%

Publisher:

Abstract:

Quantitative determinations of the hydrogen content and its profile in silicon nitride sensitive films have been carried out by the resonant nuclear reaction method. At a deposition temperature of 825 °C, hydrogen is present in an LPCVD silicon nitride sensitive film, and the hydrogen content on its surface is in the range (8-16) × 10²¹ cm⁻³, depending on the deposition process used. This is larger than the (2-3) × 10²¹ cm⁻³ in its interior, which is homogeneous. Meanwhile, we observe separate peaks for the chemical bonding configurations of Si-H and N-H bonds, indicated by the infrared absorption bands Si-O (1106 cm⁻¹), N-H (1200 cm⁻¹), Si-H3 (2258 cm⁻¹) and N-H2 (3349 cm⁻¹), respectively. The poorer linear range of the ISFET is caused by the presence of oxygen on the surface of the silicon nitride sensitive film. The existence of Si-H, N-H and N-Si chemical bonding configurations on its surface is favourable for its pH response.

Relevance:

40.00%

Publisher:

Abstract:

Aflatoxin B1 (AFB1), ochratoxin A (OTA) and fumonisin B1 (FB1) are important mycotoxins in terms of human exposure via food, their toxicity, and the regulatory limits that exist worldwide. Mixtures of toxins are frequently present in foods; however, because of the complications of determining their combined toxicity, legal limits of exposure are set for single compounds, based on long-standing toxicological techniques. High content analysis (HCA) may be a useful tool to determine the total toxicity of complex mixtures of mycotoxins. Endpoints including cell number (CN), nuclear intensity (NI), nuclear area (NA), plasma membrane permeability (PMP), mitochondrial membrane potential (MMP) and mitochondrial mass (MM) were compared to the conventional 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and neutral red (NR) endpoints in MDBK cells. Individual concentrations of each mycotoxin (OTA 3 mg/ml, FB1 8 mg/ml and AFB1 1.28 mg/ml) revealed no cytotoxicity with MTT or NR, but HCA showed significant cytotoxic effects of up to 41.6% (p < 0.001) and 10.1% (p < 0.05) for OTA and AFB1, respectively. The tertiary mixture (OTA 3 mg/ml, FB1 8 mg/ml and AFB1 1.28 mg/ml) showed up to 37.3% and 49.8% more cytotoxicity with HCA than with MTT and NR, respectively, while binary combinations of OTA (3 mg/ml) and FB1 (8 mg/ml) revealed synergistic interactions using HCA (MMP, MM and NI endpoints) that were not detected using MTT or NR. HCA is a highly novel and sensitive tool that could substantially help determine future regulatory limits for single and combined toxins present in food, ensuring that legislation is based on the true risks of human exposure.

Relevance:

40.00%

Publisher:

Abstract:

Background: Health care organisations face pressure from many sources to deliver care and services that meet the highest performance standards and to account for that performance. These pressures come from different actors, such as users of the health care system and policy makers. Given the central place nurses occupy in the delivery of health services, there is growing interest in implementing interventions aimed at measuring and improving the performance of nursing services. In these processes, however, organisations are often confronted with differing and conflicting views of performance and with various approaches to measuring it. Objectives: This exploratory qualitative study aims to explore the conceptions of performance held by members of the management team involved in delivering nursing services and by staff nurses, and to examine the extent to which the conceptions of the two groups of actors coincide or conflict. Methodology: Semi-structured interviews were conducted with five members of the management team and three nurses. A content analysis was carried out both to identify the range of conceptions and to determine which were most prominent in the discourse. The frame of reference guiding this analysis was an adaptation of Donabedian's conceptual model comprising three dimensions: structure, process and outcomes (Unruh & Wan, 2004). Results: The analysis of the data collected from the management team members revealed ten distinct but interrelated conceptions of performance, emphasising elements of nursing care processes and patient outcomes. For the nurses, nine conceptions were identified, with the emphasis placed mainly on elements concerning the adequacy of nursing human resources and nursing care processes. Certain similarities and differences were identified between the conceptions of these two groups of actors. Conclusion: This study provides a better understanding of the conceptions of performance held by the actors involved in delivering nursing services. The integrative model resulting from the combination of these different conceptions offers a useful framework for guiding the development of performance measurement tools directly related to nursing care and for responding to the demand for accountability with respect to these services.

Relevance:

40.00%

Publisher:

Abstract:

Automatic indexing and retrieval of digital data poses major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system, Dynamic REtrieval Analysis and semantic metadata Management (DREAM), designed to automatically and intelligently index huge repositories of special-effects video clips based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase to support the storage, indexing and retrieval of large data sets of special-effects video clips as an exemplar application domain. This paper provides its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes. © 2009 Published by Elsevier B.V.
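
As a rough illustration of concept-based rather than keyword-based retrieval over an ontology (the class hierarchy, clip identifiers and annotations below are assumptions for illustration, not the DREAM ontology network), a query for a concept also returns clips annotated with any of its sub-concepts:

# Hypothetical sketch of ontology-driven retrieval: a query concept matches
# clips annotated with that concept or any of its descendants in the hierarchy.
SUBCLASS_OF = {                       # child -> parent (illustrative fragment)
    "fire": "pyrotechnic_effect",
    "smoke": "pyrotechnic_effect",
    "pyrotechnic_effect": "special_effect",
    "water_splash": "special_effect",
}

CLIP_ANNOTATIONS = {                  # clip id -> semantic concepts (illustrative)
    "clip_017": {"fire"},
    "clip_042": {"smoke"},
    "clip_108": {"water_splash"},
}

def descendants(concept):
    # Collect the concept and everything below it in the subclass hierarchy.
    found = {concept}
    changed = True
    while changed:
        changed = False
        for child, parent in SUBCLASS_OF.items():
            if parent in found and child not in found:
                found.add(child)
                changed = True
    return found

def retrieve(concept):
    wanted = descendants(concept)
    return [clip for clip, tags in CLIP_ANNOTATIONS.items() if tags & wanted]

print(retrieve("pyrotechnic_effect"))  # -> ['clip_017', 'clip_042']

A keyword search for the literal string "pyrotechnic" would miss clips tagged only with "fire" or "smoke"; traversing the ontology recovers them.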

Relevance:

40.00%

Publisher:

Abstract:

Background: Personalised nutrition (PN) may provide major health benefits to consumers. A potential barrier to the uptake of PN is consumers' reluctance to disclose sensitive information upon which PN is based. This study adopts the privacy calculus to explore how PN service attributes contribute to consumers' privacy risk and personalisation benefit perceptions. Methods: Sixteen focus groups (n = 124) were held in 8 EU countries and discussed 9 PN services that differed in terms of personal information, communication channel, service provider, advice justification, scope, frequency, and customer lock-in. Transcripts were content analysed. Results: The personal information that underpinned PN contributed to both privacy risk perception and personalisation benefit perception. Disclosing information face-to-face mitigated the perception of privacy risk and amplified the perception of personalisation benefit. PN provided by a qualified expert and justified by scientific evidence increased participants' value perception. Enhancing convenience, offering regular face-to-face support, and employing customer lock-in strategies were perceived as beneficial. Conclusion: This study suggests that to encourage consumer adoption, PN has to account for face-to-face communication, expert advice providers, support, a lifestyle-change focus, and customised offers. The results provide an initial insight into the service attributes that influence consumer adoption of PN.

Relevance:

40.00%

Publisher:

Abstract:

Obtaining high-quality content in an e-learning portal is critical to maximise the learning experience. For e-learning portals the content specialists are the publishers. Typically, publishers are nominated by portal administrators to make their content available to instructors. Instructors subsequently customise the portal by selecting content according to their requirements. The choice of content is limited to that provided by the publishers, making the system rigid because instructors do not have access to an exhaustive range of content. We propose a system based on XML Web services, which publishers can adopt in adherence with a number of emerging Web standards, including SOAP and UDDI, to disseminate their content. Portals can leverage this system to preview and subsequently acquire the content that best suits the requirements determined by instructors.
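
A hedged sketch of how a portal might preview a publisher's catalogue over SOAP is given below; the WSDL location, the operation names (ListContent, GetPreview) and the returned fields are hypothetical, and zeep is used simply as one common Python SOAP client:

# Hypothetical portal-side client: the publisher exposes its content as a SOAP
# Web service; the WSDL URL, operations and fields are illustrative assumptions.
from zeep import Client

client = Client("https://publisher.example.org/content?wsdl")  # hypothetical WSDL

# Fetch the catalogue for a subject, then preview each item so an instructor
# can decide whether it suits the course before the portal acquires it.
catalogue = client.service.ListContent(subject="databases")
for item in catalogue:
    preview = client.service.GetPreview(contentId=item.id)
    print(item.title, "-", preview.summary)

In a full deployment the publisher would also register the service in a UDDI directory so that portals could discover it without prior arrangement.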

Relevance:

40.00%

Publisher:

Abstract:

Background The quality of support provided to people with disability who show challenging behaviour could be influenced by the quality of the behaviour support plans (BSPs) on which staff rely for direction. This study investigated the content validity of the Behaviour Support Plan Quality Evaluation tool (BSP-QEII), originally developed to guide the development of BSPs for children in school settings, and evaluated its application for use in accommodation and day-support services for adults with intellectual disability.

Method A three-round Delphi study involving a purposive sample of experienced behaviour support practitioners (n = 30) was conducted over an 8-week period. The analyses included deductive content analysis and descriptive statistics.

Results The 12 quality domains of the BSP-QEII were affirmed as valid for application in adult accommodation and day-support service settings. Two additional quality domains were suggested, relating to the provision of detailed background on the client and the need for plans to reflect contemporary service philosophy. Furthermore, the results suggest that some issues previously identified in the literature as being important for inclusion in BSPs might not currently be a priority for practitioners. These included: the importance of specifying replacement or alternative behaviours to be taught, descriptions of teaching strategies to be used, reinforcers, and the specification of objective goals against which to evaluate the success of the intervention programme.

Conclusions The BSP-QEII provides a potentially useful framework to guide and evaluate the development of BSPs in services for adults with intellectual disability. Further research is warranted to investigate why practitioners potentially give greater attention to some areas of intervention practice than others, even where research has demonstrated that these other areas of practice could be important to achieving quality outcomes.

Relevance:

40.00%

Publisher:

Abstract:

Governments have traditionally censored drug-related information, both in traditional media and, in recent years, in online media. We explore Internet content regulation from a drug-policy perspective by describing the likely impacts of censoring drug websites and the parallel growth of hidden Internet services. Australia proposes a compulsory Internet filtering regime that would block websites that 'depict, express or otherwise deal with matters of… drug misuse or addiction' and/or 'promote, incite or instruct in matters of crime'. In this article, we present findings from a mixed-methods study of online drug discussion. Our research found that websites dealing with drugs, which would likely be blocked by the filter, in fact contributed positively to harm reduction. Such sites helped people access more comprehensive and relevant information than was available elsewhere. Blocking these websites would likely drive drug discussion underground at a time when corporate-controlled 'walled gardens' (e.g. Facebook) and proprietary operating systems on mobile devices may also limit open drug discussion. At the same time, hidden Internet services, such as Silk Road, have emerged that are not affected by Internet filtering. The inability of any government to regulate Tor websites and the crypto-currency Bitcoin poses a unique challenge to drug prohibition policies.

Relevance:

40.00%

Publisher:

Abstract:

The development of broadband Internet connections has fostered new audiovisual media services and opened new possibilities for accessing broadcasts. The Internet retransmission case of TVCatchup before the CJEU was the first case concerning new technologies in the light of Art. 3.1 of the Information Society Directive. On the other side of the Atlantic, the Aereo case reached the U.S. Supreme Court and challenged the interpretation of public performance rights. In both cases the recipients of the services could receive broadcast programs in a way alternative to traditional broadcasting channels, including terrestrial broadcasting or cable transmission. The Aereo case raised the debate on the possible impact of the interpretation of copyright law in the context of the development of new technologies, particularly cloud-based services. It is interesting to see whether any similar problems occur in the EU. The "umbrella" in the title refers to Art. 8 WCT, which covers digital and Internet transmission and constitutes the background for the EU and the U.S. legal solutions. The article argues that no international standard for the qualification of the discussed services exists.

Relevance:

40.00%

Publisher:

Abstract:

Future Internet architectures aim to reformulate the way content and services are requested so as to make requests location-independent. Information-Centric Networking is a new network paradigm that tries to achieve this goal by having content objects identified and requested by name instead of by address. In this paper, we extend the Information-Centric Networking architecture to support services, so that they too can be requested and invoked by name. We present NextServe, a service framework with a human-readable, self-explanatory naming scheme. NextServe is inspired by the object-oriented programming paradigm and is applicable to real-world scenarios.
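
As a minimal sketch of requesting a service by a human-readable, hierarchical name (the name syntax, the in-process registry and the example service are assumptions for illustration, not the actual NextServe API), a name is resolved by longest-prefix match and the bound handler is invoked:

# Hypothetical sketch: services registered and invoked by hierarchical,
# human-readable names instead of host addresses (ICN-style request by name).
SERVICE_REGISTRY = {}   # name prefix -> handler (illustrative in-process registry)

def register(name, handler):
    SERVICE_REGISTRY[name] = handler

def invoke(name, *args):
    # Longest-prefix match on the service name, mirroring name-based forwarding.
    for prefix in sorted(SERVICE_REGISTRY, key=len, reverse=True):
        if name == prefix or name.startswith(prefix + "/"):
            return SERVICE_REGISTRY[prefix](name, *args)
    raise LookupError(f"no service registered for {name}")

# Example: a weather service addressed purely by name.
register("/ch/unibe/weather/today",
         lambda name, city: f"forecast for {city} (requested as {name})")

print(invoke("/ch/unibe/weather/today", "Bern"))

The caller never learns which host answers the request; the name alone identifies the service, which is the location independence the architecture aims for.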

Relevance:

40.00%

Publisher:

Abstract:

With the aim of analyzing their protective function against chilling-induced injury, the pools of glutathione and its precursors, cysteine (Cys) and γ-glutamyl-Cys, were increased in the chilling-sensitive maize (Zea mays) inbred line Penjalinan using a combination of two herbicide safeners. Compared with the controls, the greatest increase in the pool size of the three thiols was detected in the shoots and roots when both safeners were applied at a concentration of 5 µM. This combination increased the relative protection from chilling from 50% to 75%. Interestingly, this increase in the total glutathione (TG) level was accompanied by a rise in glutathione reductase (GR; EC 1.6.4.2) activity. When the most effective safener combination was applied simultaneously with increasing concentrations of buthionine sulfoximine, a specific inhibitor of glutathione synthesis, the total γ-glutamyl-Cys and TG contents and the GR activity decreased to very low levels and relative protection was lowered from 75% to 44%. During chilling, the ratio of reduced to oxidized thiols first decreased independently of the treatments, but increased again to the initial value in safener-treated seedlings after 7 d at 5 °C. Taken together, the results show a linear relationship between TG and GR and a biphasic relationship between relative protection and GR or TG, demonstrating the relevance of glutathione levels in protecting maize against chilling-induced injury.