949 results for user generated content


Relevance: 30.00%

Abstract:

We propose a fibre-based approach for the generation of optical frequency combs (OFCs) aimed at the calibration of astronomical spectrographs in the low- and medium-resolution range. The approach comprises two steps: in the first, an appropriate state of optical pulses is generated; in the second, that state is moulded into the desired OFC. More precisely, the first step is realised by injecting two continuous-wave (CW) lasers into a conventional single-mode fibre, while the second step generates a broad OFC using the optical solitons produced in the first step as the initial condition. We investigate the conversion of the bichromatic input wave produced by the two CW lasers into a train of optical solitons, which occurs in the first-step fibre. In particular, we are interested in the soliton content of the pulses created in this fibre. To this end, we study different initial conditions (a single cosine hump, an Akhmediev breather, and a deeply modulated bichromatic wave) by means of soliton radiation beat analysis and compare the results to draw conclusions about the soliton content of the state generated in the first step. For a deeply modulated bichromatic wave, we observed the formation of a collective soliton crystal at low input powers and the appearance of separated solitons at high input powers. An intermediate state showing features of both the soliton crystal and the separated solitons turned out to be most suitable for generating OFCs for the calibration of astronomical spectrographs.

Relevance: 30.00%

Abstract:

A framework that aims to make the best use of mobile network resources for video applications is presented in this paper. The main contribution of the work is a QoE-driven optimization method that can maintain a desired trade-off between fairness and efficiency when allocating resources, in terms of data rates, to video streaming users in LTE networks. The method controls the user satisfaction level from the point of view of service continuity and applies appropriate QoE metrics (Pause Intensity and its variations) to determine scheduling strategies, in combination with the mechanisms used for adaptive video streaming such as 3GP/MPEG-DASH. The superiority of the proposed algorithms is demonstrated, showing how the resources of a mobile network can be optimally utilized using quantifiable QoE measurements. The approach can also find the best match between demand and supply in the process of network resource distribution.
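As an illustrative aside (not the paper's Pause-Intensity method), the fairness-efficiency trade-off in rate allocation is often formalized with α-fair utilities, where a single parameter interpolates between throughput maximization and egalitarian sharing. The sketch below assumes a single shared link of fixed capacity and hypothetical per-user weights; the closed form follows from the Lagrangian of the α-fair objective Σ wᵢ·rᵢ^(1−α)/(1−α) subject to Σ rᵢ = C.

```python
def alpha_fair_rates(weights, capacity, alpha):
    """Closed-form alpha-fair rate split on one shared link (a generic
    sketch, not the paper's scheduler). Small alpha skews rates toward
    high-weight users (efficiency); large alpha equalizes them (fairness).
    Stationarity gives w_i * r_i**(-alpha) = lambda, so r_i is
    proportional to w_i**(1/alpha)."""
    if alpha <= 0:
        raise ValueError("alpha must be positive")
    scaled = [w ** (1.0 / alpha) for w in weights]
    total = sum(scaled)
    return [capacity * s / total for s in scaled]
```

With `alpha=1` this reduces to proportional-fair (weighted) sharing; as `alpha` grows, the allocation approaches an equal split regardless of weights.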

Relevance: 30.00%

Abstract:

This thesis addressed the problem of risk analysis in mental healthcare, with respect to the GRiST project at Aston University. That project provides a risk-screening tool based on the knowledge of 46 experts, captured as mind maps that describe relationships between risks and patterns of behavioural cues. Mind mapping, though, fails to impose control over content and is not considered a formal knowledge representation. In contrast, this thesis treated GRiST's mind maps as a rich knowledge base in need of refinement; that process drew on existing techniques for designing databases and knowledge bases. Identifying well-defined mind map concepts, though, was hindered by spelling mistakes, and by ambiguity and lack of coverage in the tools used for researching words. A novel use of the Edit Distance overcame those problems by assessing similarities between mind map texts, and between spelling mistakes and suggested corrections. The algorithm further identified stems, the shortest text string found in related word-forms. As opposed to existing approaches' reliance on built-in linguistic knowledge, this thesis devised a novel, more flexible text-based technique. An additional tool, Correspondence Analysis, found patterns in word usage that allowed machines to determine likely intended meanings for ambiguous words. Correspondence Analysis further produced clusters of related concepts, which in turn drove the automatic generation of novel mind maps. Such maps underpinned adjuncts to the mind mapping software used by GRiST; one such new facility generated novel mind maps to reflect the collected expert knowledge on any specified concept. Mind maps from GRiST are stored as XML, which suggested storing them in an XML database. In fact, the entire approach here is "XML-centric", in that all stages rely on XML as far as possible. An XML-based query language allows users to retrieve information from the mind map knowledge base.
The approach, it was concluded, will prove valuable to mind mapping in general, and to detecting patterns in any type of digital information.
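The Edit Distance the thesis relies on is standardly computed with the Wagner-Fischer dynamic programme; a minimal sketch (an illustration, not GRiST's actual code) is:

```python
def edit_distance(a, b):
    """Levenshtein distance between strings a and b via the
    Wagner-Fischer dynamic programme, kept to two rolling rows.
    prev[j] holds the distance between a[:i-1] and b[:j]."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]
```

Ranking candidate corrections by this distance against a misspelled mind-map term is the basic mechanism behind spelling-suggestion schemes like the one the thesis describes.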

Relevance: 30.00%

Abstract:

The purpose of this study was to determine whether an experimental context-based delivery format for mathematics would be more effective than a traditional model for increasing the mathematics performance of at-risk students in a public high school of choice, as evidenced by significant gains in achievement on the standards-based Mathematics subtest of the FCAT and in final academic grades in Algebra I. The guiding rationale for this approach is captured in the Secretary's Commission on Achieving Necessary Skills (SCANS) report of 1992 that resulted in school-to-work initiatives (United States Department of Labor). The charge for educational reform has also been codified at the state level in the Educational Accountability Act of 1971 (Florida Statutes, 1995) and at the national level in the No Child Left Behind Act of 2001. A particular focus of educational reform is low-performing, at-risk students.
This dissertation explored the effects of a context-based curricular reform designed to enhance the delivery of Algebra I content, using a research design consisting of two delivery models: a traditional content-based course and a thematically structured, content-based course. In this case, the thematic element was business education, as many advocates in career education assert that this format engages students who are often otherwise disinterested in mathematics in a relevant, SCANS-skills setting. The subjects in each supplementary course were ninth-grade students who were both low performers in eighth-grade mathematics and had not passed the eighth-grade administration of the standards-based FCAT Mathematics subtest. The sample size was limited to two groups of 25 students and two teachers. The site for this study was a public charter school. Student-generated performance data were analyzed using descriptive statistics.
Results indicated that, contrary to widely held beliefs, contextual presentation of content did not produce significant gains in either academic performance or test performance for the experimental treatment group. Further, results indicated no meaningful difference in performance between the two groups.

Relevance: 30.00%

Abstract:

The rapid growth of the Internet and advances in Web technologies have given users access to large amounts of on-line music data, including music acoustic signals, lyrics, style/mood labels, and user-assigned tags. This progress has made music listening more fun, but has raised the issue of how to organize this data and, more generally, how computer programs can assist users in their music experience. An important subject in computer-aided music listening is music retrieval, i.e., efficiently helping users locate the music they are looking for. Traditionally, songs were organized in a hierarchical structure such as genre->artist->album->track to facilitate navigation. However, the intentions of users are often hard to capture in such a simply organized structure: users may want to listen to music of a particular mood, style or topic, and/or songs similar to some given music samples. This motivated us to work on a user-centric music retrieval system to improve users' satisfaction. Traditional music information retrieval research was mainly concerned with classification, clustering, identification, and similarity search over acoustic music data, by way of feature extraction algorithms and machine learning techniques. More recently, music information retrieval research has focused on utilizing other types of data, such as lyrics, user-access patterns, and user-defined tags, and on targeting non-genre categories for classification, such as mood labels and styles. This dissertation focused on investigating and developing effective data mining techniques for (1) organizing and annotating music data with styles, moods and user-assigned tags; (2) performing effective analysis of music data with features from diverse information sources; and (3) recommending songs to users utilizing both content features and user-access patterns.
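A toy sketch of the dissertation's third theme, blending content features with user-access patterns for recommendation, might look as follows. The feature vectors, play logs, and the blending weight `w` are hypothetical illustrations, not the dissertation's actual system.

```python
import math
from collections import defaultdict

def cosine(u, v):
    # cosine similarity between two content-feature vectors
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(seed, features, play_logs, k=2, w=0.5):
    """Rank songs by a blend of content similarity to `seed` and
    co-occurrence with it in user listening histories (a hypothetical
    hybrid scoring rule for illustration only)."""
    co = defaultdict(int)
    for history in play_logs:
        if seed in history:
            for song in history:
                if song != seed:
                    co[song] += 1
    max_co = max(co.values(), default=1)
    scores = {}
    for song, vec in features.items():
        if song == seed:
            continue
        scores[song] = (w * cosine(features[seed], vec)
                        + (1 - w) * co[song] / max_co)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Real systems replace the toy vectors with extracted acoustic/lyric features and the co-occurrence count with learned collaborative-filtering weights, but the two-signal blend is the same idea.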

Relevance: 30.00%

Abstract:

Elemental and isotopic composition of leaves of the seagrass Thalassia testudinum was highly variable across the 10,000 km² and 8 years of this study. The data reported herein expand the ranges of carbon:nitrogen (C:N) and carbon:phosphorus (C:P) ratios and of δ13C and δ15N values reported for this species worldwide: 13.2–38.6 for C:N and 411–2,041 for C:P. The 981 determinations in this study generated a range of −13.5‰ to −5.2‰ for δ13C and −4.3‰ to 9.4‰ for δ15N. The elemental and isotope ratios displayed marked seasonality, and the seasonal patterns could be described with a simple sine wave model; C:N, C:P, δ13C, and δ15N values all had maxima in the summer and minima in the winter. Spatial patterns in the summer maxima of these quantities suggest there are large differences in the relative availability of N and P across the study area, and that there are differences in the processing and the isotopic composition of C and N. This work calls into question the interpretation of studies of nutrient cycling and food webs in estuaries based on a few samples collected at one time, since we document natural variability greater than the signal often used to infer changes in the structure or function of ecosystems. The data and patterns presented in this paper make it clear that there is no threshold δ15N value for marine plants that can be used as an unambiguous indicator of human sewage pollution without a thorough understanding of local temporal and spatial variability.
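A sine-wave seasonal model of the kind described above, y = m + A·sin(2πt/period + φ), can be fitted by ordinary least squares after linearising the phase term into sine and cosine components. The sketch below is a generic illustration (not the authors' code) and assumes evenly sampled, noise-free data for the test.

```python
import math

def fit_seasonal(ts, ys, period=365.0):
    """Least-squares fit of y = m + A*sin(w*t + phi), linearised as
    m + b*sin(w*t) + c*cos(w*t) with w = 2*pi/period, then converted
    back via A = hypot(b, c), phi = atan2(c, b)."""
    w = 2 * math.pi / period
    X = [[1.0, math.sin(w * t), math.cos(w * t)] for t in ts]
    # normal equations (X^T X) beta = X^T y, solved by Gaussian elimination
    A = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    v = [sum(r[i] * y for r, y in zip(X, ys)) for i in range(3)]
    for i in range(3):                       # forward elimination
        p = A[i][i]
        for k in range(i + 1, 3):
            f = A[k][i] / p
            A[k] = [a - f * b for a, b in zip(A[k], A[i])]
            v[k] -= f * v[i]
    beta = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                      # back substitution
        beta[i] = (v[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, 3))) / A[i][i]
    m, b, c = beta
    return m, math.hypot(b, c), math.atan2(c, b)   # mean, amplitude, phase
```

The fitted maximum falls where w·t + φ = π/2, which is how summer maxima and winter minima would be located from such a fit.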

Relevance: 30.00%

Abstract:

The design of interfaces to facilitate user search has become critical for search engines, e-commerce sites, and intranets. This study investigated the use of targeted instructional hints to improve search, measuring their quantitative effects on users' performance and satisfaction. The effects of syntactic, semantic and exemplar search hints on user behavior were evaluated in an empirical investigation using naturalistic scenarios. Combining the three search-hint components, each with two levels of intensity, in a factorial design generated eight search engine interfaces. Eighty participants took part in the study, each completing six realistic search tasks. Results revealed that the inclusion of search hints improved user effectiveness, efficiency and confidence when using the search interfaces, but with complex interactions that require specific guidelines for search interface designers. These design guidelines will allow search designers to create more effective interfaces for a variety of search applications.
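The full factorial crossing behind the eight interfaces is simply 2³ combinations of three two-level factors, which can be enumerated directly; the component and level names below are placeholders for illustration.

```python
from itertools import product

# Three hint components, each at two intensity levels, crossed in a
# full factorial design: 2**3 = 8 interface conditions.
components = ["syntactic", "semantic", "exemplar"]
levels = ["low", "high"]

interfaces = list(product(levels, repeat=len(components)))
for cell in interfaces:
    # each cell is one experimental interface condition
    print(dict(zip(components, cell)))
```

This is the standard way a factorial design's condition list is generated before assigning participants to cells.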

Relevance: 30.00%

Abstract:

Drilling a transect of holes across the Costa Rica forearc during ODP Leg 170 demonstrated the margin wedge to be of continental, non-accretionary origin, intersected by permeable thrust faults. Pore waters from four drillholes, two of which penetrated the décollement zone and reached the underthrust lower-plate sedimentary sequence of the Cocos Plate, were examined for boron contents and boron isotopic signatures. The combined results show dilution of the uppermost sedimentary cover of the forearc, with boron contents lower than half of present-day seawater values. Pore-fluid "refreshening" suggests that gas hydrate water has mixed with the sediment interstitial water, without profoundly affecting the δ11B values. Fault-related flux of a deeply generated fluid is inferred from high B concentrations in the interval beneath the décollement, released from the underthrust sequence with incipient burial. First-order fluid-budget calculations over a cross-section of the Costa Rica forearc, based on the boron fluid profiles, indicate no significant fluid transfer from the lower to the upper plate, at least within the frontal 40 km studied. Expelled lower-plate pore water, estimated at 0.26–0.44 km³ per km of trench, is conducted efficiently along and just beneath the décollement zone, indicating effective shear-enhanced compaction. In the upper-plate forearc wedge, dewatering occurs as diffuse transport as well as channelled flow; a volume of approximately 2 km³ per km of trench is expelled due to compaction and, to a lesser extent, lateral shortening. Pore-water chemistry is influenced by gas hydrate instability, so it remains unknown whether deep processes such as mineral dehydration or hydrocarbon formation play a considerable role towards the hinterland.

Relevance: 30.00%

Abstract:

Today's media environment changes structurally and functionally every day, forcing media outlets to rethink how they operate and to reinvent their uses and communication schemes. Hence the importance of this mixed quantitative and qualitative study, which drew on social-media monitoring and content analysis to detect how media respond to being immersed in a world that has moved from a print culture to a screen culture. In this world, new social practices emerge that create devices tailored to each person, combining text, audio and video into new media that generate other forms of interaction and reshape the professional practice of the journalist (Comunicador Social Periodista).

Relevance: 30.00%

Abstract:

Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned due to problems such as zapping, saturation and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence for that theoretical development. The investigation analyzes the recall generated by four non-conventional advertising formats in a real environment: short programme (branded content), television sponsorship, and internal and external telepromotion, versus the more conventional spot. The methodology integrated secondary data with primary data from ad-hoc computer-assisted telephone interviews (CATI) with a sample of 2,000 individuals, aged 16 to 65, representative of the total television audience. Our findings show that non-conventional advertising formats are more effective at the cognitive level, generating higher levels of both unaided and aided recall than the spot in all the formats analyzed.

Relevance: 30.00%

Abstract:

Metagenomic studies use high-throughput sequence data to investigate microbial communities in situ. However, considerable challenges remain in the analysis of these data, particularly with regard to speed and reliable analysis of microbial species as opposed to higher level taxa such as phyla. We here present Genometa, a computationally undemanding graphical user interface program that enables identification of bacterial species and gene content from datasets generated by inexpensive high-throughput short read sequencing technologies. Our approach was first verified on two simulated metagenomic short read datasets, detecting 100% and 94% of the bacterial species included with few false positives or false negatives. Subsequent comparative benchmarking analysis against three popular metagenomic algorithms on an Illumina human gut dataset revealed Genometa to attribute the most reads to bacteria at species level (i.e. including all strains of that species) and demonstrate similar or better accuracy than the other programs. Lastly, speed was demonstrated to be many times that of BLAST due to the use of modern short read aligners. Our method is highly accurate if bacteria in the sample are represented by genomes in the reference sequence but cannot find species absent from the reference. This method is one of the most user-friendly and resource efficient approaches and is thus feasible for rapidly analysing millions of short reads on a personal computer.

Relevance: 30.00%

Abstract:

Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files, and during this time it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the session; highly secure authentication methods must therefore be used. We posit that each of us is unique in our use of computer systems, and it is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operations, and large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in the web logs generated in response to those actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4,000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis.
A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous-monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
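A minimal sketch of the n-gram idea, here a bigram model with add-one smoothing over hypothetical action names (an illustration, not the Intruder Detector implementation), is:

```python
import math
from collections import defaultdict

class ActionNgramModel:
    """Bigram model over user-action sequences: higher per-action
    log-probability means the session looks more like the training
    behavior; low scores flag deviations from "normal behavior"."""
    def __init__(self):
        self.bigrams = defaultdict(int)
        self.unigrams = defaultdict(int)
        self.vocab = set()

    def train(self, sessions):
        for actions in sessions:
            padded = ["<s>"] + actions          # start-of-session marker
            for prev, curr in zip(padded, padded[1:]):
                self.bigrams[(prev, curr)] += 1
                self.unigrams[prev] += 1
                self.vocab.update((prev, curr))

    def logprob(self, actions):
        # add-one smoothing keeps unseen transitions finite
        V = len(self.vocab) or 1
        padded = ["<s>"] + actions
        lp = 0.0
        for prev, curr in zip(padded, padded[1:]):
            p = (self.bigrams[(prev, curr)] + 1) / (self.unigrams[prev] + V)
            lp += math.log(p)
        return lp / len(actions)                # per-action score
```

A deployment would train one model per user (or per role) and raise an alert when a live session's per-action score falls below a calibrated threshold.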

Relevance: 30.00%

Abstract:

Modern software application testing, such as the testing of software driven by graphical user interfaces (GUIs) or leveraging event-driven architectures in general, requires paying careful attention to context. Model-based testing (MBT) approaches first acquire a model of an application, then use the model to construct test cases covering relevant contexts. A major shortcoming of state-of-the-art automated model-based testing is that many test cases proposed by the model are not actually executable. These "infeasible" test cases threaten the integrity of the entire model-based suite, and any coverage of contexts the suite aims to provide. In this research, I develop and evaluate a novel approach for classifying the feasibility of test cases. I identify a set of pertinent features for the classifier, and develop novel methods for extracting these features from the outputs of MBT tools. I use a supervised logistic regression approach to obtain a model of test case feasibility from a randomly selected training suite of test cases. I evaluate this approach with a set of experiments. The outcomes of this investigation are as follows: I confirm that infeasibility is prevalent in MBT, even for test suites designed to cover a relatively small number of unique contexts. I confirm that the frequency of infeasibility varies widely across applications. I develop and train a binary classifier for feasibility with average overall error, false positive, and false negative rates under 5%. I find that unique event IDs are key features of the feasibility classifier, while model-specific event types are not. I construct three types of features from the event IDs associated with test cases, and evaluate the relative effectiveness of each within the classifier.
To support this study, I also develop a number of tools and infrastructure components for scalable execution of automated jobs, which use state-of-the-art container and continuous integration technologies to enable parallel test execution and the persistence of all experimental artifacts.
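Logistic regression over binary event-ID indicator features can be sketched as follows; this is a generic batch-gradient-descent illustration with toy data, not the classifier trained in the study.

```python
import math

def train_logreg(X, y, lr=0.5, epochs=2000):
    """Plain batch gradient descent on the logistic (cross-entropy)
    loss. Each row of X might be, e.g., binary indicators of which
    event IDs appear in a test case; y is 1 for feasible, 0 for not."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid
            err = p - yi                        # gradient of loss w.r.t. z
            for j, xj in enumerate(xi):
                gw[j] += err * xj
            gb += err
        w = [wj - lr * gwj / len(X) for wj, gwj in zip(w, gw)]
        b -= lr * gb / len(X)
    return w, b

def predict(w, b, xi):
    # decision threshold at p = 0.5, i.e. z = 0
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
```

In practice a library implementation with regularization would be used; the point of the sketch is only the shape of the feature-to-label mapping.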

Relevance: 30.00%

Abstract:

Conventional concrete (CC) suffers from numerous problems, such as corrosion of the steel reinforcement and the low strength of concrete structures; consequently, most structures built with CC require frequent maintenance. Ultra-high-performance fibre-reinforced concrete (UHPFRC) can be designed to eliminate some of CC's characteristic weaknesses. UHPFRC is defined worldwide as a concrete with superior mechanical properties, ductility and durability. Classic UHPFRC contains 800 to 1000 kg/m³ of cement, 25 to 35% silica fume (SF), 0 to 40% quartz powder (QP) and 110 to 140% quartz sand (QS) by mass of cement, plus steel fibres to improve ductility and tensile strength. The large quantities of cement used to produce UHPFRC affect not only production costs and the consumption of natural resources such as limestone, clay, coal and electricity, but also harm the environment through substantial greenhouse-gas emissions, including carbon dioxide (CO₂). Moreover, the particle-size distribution of cement leaves microscopic voids that can be filled with finer materials such as SF; filling them with SF alone, however, requires 25 to 30% SF by mass of cement, which is costly since SF is a limited resource, and SF significantly reduces the workability of UHPFRC because of its high Blaine specific surface area. Using QP and QS is likewise expensive and consumes significant natural resources.
Indeed, QP and QS are considered obstacles to the large-scale use of UHPFRC in the concrete market because they fail to meet environmental requirements; an Environment Canada report states that quartz causes immediate and long-term environmental damage through its biological effects. UHPFRC is usually sold as a pre-packaged product, which limits design modifications by the user, and it is normally transported over long distances, unlike the constituents of CC; this also adds greenhouse-gas emissions and raises the cost of the final product. There is therefore a need to develop locally available materials with similar functions to partially or fully replace the silica fume, quartz sand or quartz powder, and thus reduce the cement content of UHPFRC, while achieving comparable or better properties. Large quantities of waste glass cannot be recycled because of their brittleness, colour, or high recycling costs; most waste glass goes to landfill, which is undesirable since glass is non-biodegradable and thus environmentally unfriendly. In recent years, studies have investigated the use of waste glass as an alternative supplementary cementitious material (SCM) or as ultrafine aggregate in concrete, depending on its particle-size distribution and chemical composition. This thesis presents a new type of ecological ultra-high-performance glass concrete (UHPGC) developed at the Université de Sherbrooke. The concretes were designed using waste glass of various particle sizes and granular optimisation of the granular and cementitious matrices.
UHPGC can be designed with reduced quantities of cement (400 to 800 kg/m³), SF (50 to 220 kg/m³), QP (0 to 400 kg/m³) and QS (0 to 1200 kg/m³), while incorporating various glass-waste products: glass sand (GS, 0 to 1200 kg/m³) with a mean diameter (d50) of 275 µm, a large quantity of glass powder (GP, 200 to 700 kg/m³) with a d50 of 11 µm, and a moderate content of fine glass powder (FGP, 50 to 200 kg/m³) with a d50 of 3.8 µm. UHPGC also contains steel fibres (to increase tensile strength and improve ductility), superplasticizer (10 to 60 kg/m³), and a water-to-binder ratio (W/B) as low as that of UHPFRC. Replacing the cement and SF particles with smooth, non-absorbent glass particles improves the rheology of UHPGC. In addition, using FGP in place of SF reduces the net total specific surface area of the SF-FGP blend; since the net specific surface area of the particles decreases, less water is needed to lubricate the particle surfaces, giving a higher slump at the same W/B. Using waste glass in concrete also lowers the cumulative heat of hydration, which helps minimise potential shrinkage cracking. Depending on its composition and the curing temperature, UHPGC can reach compressive strengths of 130 to 230 MPa, flexural strengths above 20 MPa, tensile strengths above 10 MPa, and an elastic modulus above 40 GPa. The mechanical performance of UHPGC is enhanced by the reactivity of the amorphous glass, the granulometric optimisation, and the densification of the mixtures. The glass-waste products in UHPGC behave pozzolanically and react with the portlandite generated by cement hydration.
This is not the case for the quartz sand or quartz powder in classic UHPFRC, which react only at the elevated temperature of 400 °C. Adding waste glass improves the densification of the interface between particles, and the high stiffness of the glass particles increases the elastic modulus of the concrete. UHPGC also has very good durability: its capillary porosity is very low, it is extremely resistant to chloride-ion penetration (≈8 coulombs), its abrasion resistance (volume-loss index) is below 1.3, and it shows virtually no freeze-thaw deterioration, even after 1000 cycles. After laboratory evaluation, the UHPGC was scaled up in an industrial concrete mixer and validated in the field through the construction of two footbridges; its superior mechanical properties allowed the bridge sections to be about 60% smaller than sections made of CC. UHPGC offers several economic and environmental advantages. It reduces the production cost and carbon footprint of structures relative to classic UHPFRC by using locally available materials; it cuts the CO₂ emissions associated with cement clinker production (50% cement replacement) and uses natural resources efficiently. Moreover, producing UHPGC reduces the quantities of waste glass stockpiled or landfilled, which cause environmental problems, and could save millions of dollars that would otherwise be spent treating this waste. Finally, it offers construction companies an alternative for producing UHPFRC-class concrete at lower cost.

Relevance: 30.00%

Abstract:

Over the last decade, the success of social networks has significantly reshaped how people consume information. Recommendation of content based on user profiles is well received. However, as users become dominantly mobile, little has been done to consider the impacts of the wireless environment, especially capacity constraints and the changing channel. In this dissertation, we investigate a centralized wireless content delivery system, aiming to optimize overall user experience given the capacity constraints of the wireless networks, by deciding what content to deliver, when, and how. We propose a scheduling framework that incorporates content-based reward and deliverability. Our approach exploits the broadcast nature of wireless communication and the social nature of content through multicasting and precaching. Results indicate that this novel joint optimization approach outperforms existing layered systems that separate recommendation and delivery, especially when the wireless network is operating at maximum capacity. Utilizing a limited number of transmission modes, we significantly reduce the complexity of the optimization. We also introduce the design of a hybrid system that handles transmissions for both system-recommended content ('push') and active user requests ('pull'). Further, we extend the joint optimization framework to a wireless infrastructure with multiple base stations. The problem becomes much harder in that there are many more system configurations, including but not limited to power allocation and how resources are shared among the base stations ('out-of-band', in which base stations transmit with dedicated spectrum resources and thus without interference; and 'in-band', in which they share the spectrum and must mitigate interference). We propose a scalable two-phase scheduling framework: 1) each base station obtains delivery decisions and resource allocation individually; 2) the system consolidates the decisions and allocations, reducing redundant transmissions.
Additionally, if the social network applications could provide the predictions of how the social contents disseminate, the wireless networks could schedule the transmissions accordingly and significantly improve the dissemination performance by reducing the delivery delay. We propose a novel method utilizing: 1) hybrid systems to handle active disseminating requests; and 2) predictions of dissemination dynamics from the social network applications. This method could mitigate the performance degradation for content dissemination due to wireless delivery delay. Results indicate that our proposed system design is both efficient and easy to implement.