905 results for Web Mining, Data Mining, User Topic Model, Web User Profiles
Abstract:
The report presents the results of CTD measurements carried out in the Bellingshausen Sea, an area where CTD measurements are rare. The main part of the report consists of a brief description of the CTD data acquisition and processing routines, the vertical profiles of temperature, salinity and density, and plots of the distribution of these properties along the hydrographic sections. The final part of the report deals with the notably similar structure of the vertical density distribution at different locations when presented as a function of a non-dimensional vertical coordinate. It is pointed out that such a distribution could be an asymptotic limit of stationary mixing along neutral surfaces.
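One plausible reading of that presentation (my reconstruction; the report defines its own scaling) is that each profile is rescaled so depth and density both run from 0 to 1, allowing profiles from different stations to be overlaid directly:

```latex
\zeta = \frac{z - z_{\min}}{z_{\max} - z_{\min}},
\qquad
\hat{\rho}(\zeta) = \frac{\rho(z) - \rho_{\min}}{\rho_{\max} - \rho_{\min}}
```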
Abstract:
Over 300 surface sediment samples from the Central and South Atlantic Ocean and the Caribbean Sea were investigated for the preservation state of the aragonitic test of Limacina inflata. Results are displayed in spatial distribution maps and are plotted against cross-sections of vertical water mass configurations, illustrating the relationship between preservation state, saturation state of the overlying waters, and overall water mass distribution. The microscopic investigation of L. inflata (adults) yielded the Limacina dissolution index (LDX) and revealed three regional dissolution patterns. In the western Atlantic Ocean, sedimentary preservation states correspond to saturation states in the overlying waters. Poor preservation is found within intermediate water masses of southern origin (i.e. Antarctic Intermediate Water (AAIW) and Upper Circumpolar Deep Water (UCDW)), which are distinctly aragonite-corrosive, whereas good preservation is observed within the surface waters above and within the upper North Atlantic Deep Water (UNADW) beneath the AAIW. In the eastern Atlantic Ocean, in particular along the African continental margin, the LDX fails in most cases (i.e. fewer than 10 tests of L. inflata per sample were found). This is most probably due to extensive "metabolic" aragonite dissolution at the sediment-water interface combined with a reduced abundance of L. inflata in the surface waters. In the Caribbean Sea, a more complex preservation pattern is observed because of the interaction between different water masses, which invade the Caribbean basins through several channels, and the varying input of bank-derived fine aragonite and magnesian calcite material. The solubility of aragonite increases with increasing pressure, but aragonite dissolution in the sediments does not simply increase with water depth. Worse preservation is found at intermediate water depths, following an S-shaped curve. As a result, two aragonite lysoclines are observed, one above the other. In four depth transects, we show that the western Atlantic and Caribbean LDX records resemble surficial calcium carbonate data and δ13C and carbonate ion concentration profiles in the water column. Moreover, preservation of L. inflata within AAIW and UCDW improves significantly to the north, whereas carbonate corrosiveness diminishes due to increased mixing of AAIW and UNADW. The close relationship between LDX values and aragonite contents in the sediments shows much promise for the quantification of aragonite loss under the influence of different water masses. LDX failure and uncertainties may be attributed to (1) aragonite dissolution due to bottom water corrosiveness, (2) aragonite dissolution due to additional CO2 release into the bottom water by the degradation of organic matter, driven by an enhanced supply of organic matter into the sediment, (3) variations in the distribution of L. inflata and hence a lack of supply into the sediment, (4) dilution of the sediments and hence a lack of tests of L. inflata, or (5) redeposition of sediment particles.
Abstract:
The chromodomain is 40-50 amino acids in length and is conserved in a wide range of chromosomal and regulatory proteins involved in chromatin remodeling. Chromodomain-containing proteins can be classified into families based on their broader characteristics, in particular the presence of other types of domains, which correlate with different subclasses of the chromodomains themselves. Hidden Markov model (HMM)-generated profiles of different subclasses of chromodomains were used here to identify sequences encoding chromodomain-containing proteins in the mouse transcriptome and genome. A total of 36 different loci encoding proteins containing chromodomains, including 17 novel loci, were identified. Six of these loci (including three apparent pseudogenes, a novel HP1 ortholog, and two novel Msl-3 transcription factor-like proteins) are not present in the human genome, whereas the human genome contains four loci (two CDY orthologs and two apparent CDY pseudogenes) that are not present in mouse. A number of these loci exhibit alternative splicing to produce different isoforms, including 43 novel variants, some of which lack the chromodomain. The likely functions of these proteins are discussed in relation to the known functions of other chromodomain-containing proteins within the same family.
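A minimal sketch of the profile-search step as it is commonly run with the HMMER suite (the tool choice, file names and options are assumptions; the abstract does not state the exact pipeline):

```python
import subprocess

# Build a profile HMM from a curated alignment of one chromodomain
# subclass, then search it against a mouse protein set.
# (Illustrative file names; HMMER must be installed.)
subprocess.run(
    ["hmmbuild", "chromo_subclass.hmm", "chromo_subclass_alignment.sto"],
    check=True,
)
subprocess.run(
    ["hmmsearch", "--tblout", "chromo_hits.tbl",
     "chromo_subclass.hmm", "mouse_proteins.fasta"],
    check=True,
)
# chromo_hits.tbl then lists candidate chromodomain-containing loci,
# one subclass profile at a time.
```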
Abstract:
In modern magnetic resonance imaging (MRI), both patients and radiologists are exposed to strong, nonuniform static magnetic fields inside or outside the scanner, where body movement may induce electric currents in tissues that could be harmful. This paper presents theoretical investigations into the spatial distribution of induced E-fields in a human model moving at various positions around the magnet. The numerical calculations are based on an efficient, quasistatic, finite-difference scheme and an anatomically realistic, full-body, male model. 3D field profiles from an actively shielded 4 T magnet system are used, and the body model is projected through the field profile with normalized velocity. The simulation shows that it is possible to induce E-fields/currents near the level of physiological significance under some circumstances, and it provides insight into the spatial characteristics of the induced fields. The results can readily be extrapolated for safety evaluation at a variety of field strengths and motion velocities.
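For orientation, the quasistatic relations underlying such calculations can be written as follows (standard physics in my notation, not copied from the paper): motion with velocity v through the nonuniform static field B0 yields an effective time-varying field, which induces E-fields via Faraday's law and currents through the tissue conductivity σ:

```latex
\frac{\partial \mathbf{B}}{\partial t} \approx (\mathbf{v}\cdot\nabla)\,\mathbf{B}_0(\mathbf{r}),
\qquad
\nabla\times\mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t},
\qquad
\mathbf{J} = \sigma\,\mathbf{E}
```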
Abstract:
High-performance liquid chromatographic methods are developed for the simultaneous determination of various salicylates, their p-hydroxy isomers and nicotinic acid esters. The method is sensitive enough to detect trace amounts (~µmol/L) of the product generated from cross-reactivity between the drugs and the vehicle. The developed method also allows analysis of various topical products containing salicylate and nicotinate esters in their formulations. Applying this method, the degradation profiles of salicylates, nicotinates, p-hydroxy benzoate, o-methoxy benzoate and aspirin prodrugs in alkaline media are determined. The profile for alkyl salicylate degradation is found to be first order (A → B) when the alcoholic radical is similar to that of the ester. In an alcohol having a radical different from that of the ester function, the degradation is found to proceed through competitive transesterification and hydrolysis. The intermediates are identified following synthesis and isolation. The rate and extent of transesterification depend on the proportion of alcohol present in the system. Equations are presented to model the time profiles of reactant and product concentrations. The reactions are base-catalysed and the predominant pathway involves a concerted solvent attack upon the salicylate anion. Competitive hydrolysis of both ester components also follows this mechanism at moderate pH values, but rates increase under strongly alkaline conditions as direct hydroxide attack becomes significant. In contrast, transesterification is independent of base concentration once full ionization is accomplished. The competitive hydrolysis is modelled using equations involving the dielectric constant of the medium. A range of other esters are also shown to undergo base-catalysed transesterification. In non-alcoholic solution, phenyl salicylate undergoes a concentration-dependent oligomerisation which yields salsalate among the products. Competitive transesterification and hydrolysis also occur in products for topical use which have vehicles based upon alcohol, glycol or glycol polymers. Such reactions may compromise stability assessments, pharmaceutical integrity and delivery profiles.
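As a worked illustration of the simplest reported profile (my notation, not the paper's fitted equations): first-order degradation A → B decays exponentially, and when transesterification (rate constant k_t) and hydrolysis (k_h) compete as parallel pseudo-first-order pathways, the product split is set by the ratio of the rate constants:

```latex
\frac{d[A]}{dt} = -(k_t + k_h)[A]
\;\Longrightarrow\;
[A](t) = [A]_0\,e^{-(k_t+k_h)t},
\qquad
[B](t) = \frac{k_t}{k_t+k_h}\,[A]_0\left(1 - e^{-(k_t+k_h)t}\right)
```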
Abstract:
Traditional content-based filtering methods usually utilize text extraction and classification techniques for building user profiles as well as representations of contents, i.e. item profiles. These methods have some disadvantages, e.g. a mismatch between user profile terms and item profile terms, leading to low performance. Some of the disadvantages can be overcome by incorporating a common ontology which enables representing both the users' and the items' profiles with concepts taken from the same vocabulary. We propose a new content-based method for filtering and ranking the relevancy of items for users, which utilizes a hierarchical ontology. The method measures the similarity of the user's profile to the items' profiles, considering the existence of mutual concepts in the two profiles, as well as the existence of "related" concepts, according to their position in the ontology. The proposed filtering algorithm computes the similarity between the users' profiles and the items' profiles, and rank-orders the relevant items according to their relevancy to each user. The method is being implemented in ePaper, a personalized electronic newspaper project, utilizing a hierarchical ontology designed specifically for the classification of news items. It can, however, be utilized in other domains and extended to other ontologies.
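A minimal sketch of how such an ontology-aware similarity might be scored (the distance-decay weighting, all names, and the networkx dependency are my assumptions; the paper's actual measure may differ):

```python
import networkx as nx  # assumed dependency for the concept hierarchy

def ontology_similarity(user_concepts, item_concepts, ontology, decay=0.5):
    """Score an item profile against a user profile over a concept hierarchy.

    Mutual concepts count fully; "related" concepts contribute a weight
    that decays with their distance in the ontology."""
    if not user_concepts or not item_concepts:
        return 0.0
    undirected = ontology.to_undirected()
    score = 0.0
    for u in user_concepts:
        for i in item_concepts:
            if u == i:
                score += 1.0  # mutual concept
            else:
                try:
                    d = nx.shortest_path_length(undirected, u, i)
                    score += decay ** d  # related concept, discounted
                except (nx.NetworkXNoPath, nx.NodeNotFound):
                    pass
    # Normalize by profile sizes so longer profiles are not favoured.
    return score / (len(user_concepts) * len(item_concepts))

# Toy hierarchy: football and tennis are siblings under sports.
ont = nx.DiGraph([("news", "sports"), ("sports", "football"),
                  ("sports", "tennis")])
print(ontology_similarity({"football"}, {"tennis"}, ont))  # 0.25
```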
Abstract:
People recommenders are a widespread feature of social networking sites and educational social learning platforms alike. However, when these systems are used to extend learners’ Personal Learning Networks, they often fall short of providing recommendations of learning value to their users. This paper proposes a design of a people recommender based on content-based user profiles, and a matching method based on dissimilarity therein. It presents the results of an experiment conducted with curators of the content curation site Scoop.it!, where curators rated personalized recommendations for contacts. The study showed that matching dissimilarity of interpretations of shared interests is more successful in providing positive experiences of breakdown for the curator than is matching on similarity. The main conclusion of this paper is that people recommenders should aim to trigger constructive experiences of breakdown for their users, as the prospect and potential of such experiences encourage learners to connect to their recommended peers.
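A minimal sketch of dissimilarity-based matching over content-based profiles, under my own assumptions about the representation (topic sets plus term-frequency vectors; not the paper's actual method): candidates must share an interest with the user, and among those, the least similar interpretations rank first.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity of two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend_dissimilar(user_topics, user_terms, candidates, k=5):
    """Rank candidate peers who share a topic with the user but
    interpret it differently (lowest term-level similarity first).

    `candidates` maps id -> (topic set, term Counter)."""
    shared = [(cid, terms) for cid, (topics, terms) in candidates.items()
              if topics & user_topics]      # must share an interest
    ranked = sorted(shared, key=lambda ct: cosine(user_terms, ct[1]))
    return [cid for cid, _ in ranked[:k]]   # most dissimilar first
```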
Abstract:
Even though the use of recommender systems is already widespread in several application areas, there is still a lack of studies in the accessibility research field. One attempt to apply the benefits of recommender systems to accessibility needs is Vulcanus. The Vulcanus recommender system uses similarity analysis to compare users' trails; in this way, it can take advantage of users' past behavior to distribute personalized content and services. Vulcanus combines concepts from ubiquitous computing, such as user profiles, context awareness, trails management, and similarity analysis, and uses two different approaches for trail similarity analysis: resource patterns and category patterns. In this work we performed an asymptotic analysis, identifying the complexity of Vulcanus' algorithm. Furthermore, we propose improvements achieved through dynamic programming: using a bottom-up approach, many unnecessary comparisons can be skipped, and Vulcanus 2.0 is presented with improvements in its average-case scenario.
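The abstract does not specify the trail-comparison algorithm; the sketch below shows a generic bottom-up dynamic program (longest common subsequence over trail events, normalized) of the kind such a similarity analysis could use. The function name, the LCS choice, and the normalization are all my assumptions.

```python
def trail_similarity(trail_a, trail_b):
    """Bottom-up DP: longest common subsequence of two trails,
    normalized to [0, 1]. Each trail is a sequence of visited
    resources (or categories)."""
    m, n = len(trail_a), len(trail_b)
    # dp[i][j] = LCS length of trail_a[:i] and trail_b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if trail_a[i - 1] == trail_b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n] / max(m, n) if max(m, n) else 1.0

# e.g. trail_similarity(["home", "library", "cafe"],
#                       ["home", "cafe"])  -> 2/3
```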
Abstract:
Orthobunyaviruses are the largest genus within the Bunyaviridae family, with over 170 named viruses classified into 18 serogroups (Elliott and Blakqori, 2001; Plyusnin et al., 2012). Orthobunyaviruses are transmitted by arthropods and have a tripartite negative-sense RNA genome, which encodes 4 structural proteins and 2 non-structural proteins. The non-structural protein NSs is the primary virulence factor of orthobunyaviruses and a potent antagonist of the type I interferon (IFN) response. However, sequencing studies have identified pathogenic viruses that lack the NSs protein (Mohamed et al., 2009; Gauci et al., 2010). The work presented in this thesis describes the molecular and biological characterisation of divergent orthobunyaviruses. Data on plaque morphology, growth kinetics, protein profiles, sensitivity to IFN and activation of the type I IFN system are presented for viruses in the Anopheles A, Anopheles B, Capim, Gamboa, Guama, Minatitlan, Nyando, Tete and Turlock serogroups. These are complemented with complete genome sequencing and phylogenetic analysis. Low activation of IFN by Tete serogroup viruses, which naturally lack an NSs protein, was further investigated by the development of a reverse genetics system for Batama virus (BMAV). Recombinant viruses with mutations in the amino terminus of the virus nucleocapsid protein showed higher activation of type I IFN in vitro, and the data suggest that the low levels of IFN are due to lower activation rather than active antagonism. The anti-orthobunyavirus activity of the IFN-stimulated genes IFI44, IFITMs and human and ovine BST2 was also studied, revealing that activity varies not only within the orthobunyavirus genus and virus serogroups but also within virus species. Furthermore, there was evidence of active antagonism of the type I IFN response and ISGs by non-NSs viruses. In summary, the results show that pathogenicity in man and antagonism of the type I IFN response in vitro cannot be predicted by the presence, or absence, of an NSs ORF. They also highlight problems in orthobunyavirus classification, with discordance between classical antigen-based data and phylogenetic analysis.
Abstract:
Over the last decade, the success of social networks has significantly reshaped how people consume information. Recommendation of content based on user profiles is well received. However, as users become dominantly mobile, little has been done to consider the impacts of the wireless environment, especially the capacity constraints and the changing channel. In this dissertation, we investigate a centralized wireless content delivery system, aiming to optimize overall user experience given the capacity constraints of the wireless networks, by deciding what contents to deliver, when and how. We propose a scheduling framework that incorporates content-based reward and deliverability. Our approach utilizes the broadcast nature of wireless communication and the social nature of content, by multicasting and precaching. Results indicate this novel joint optimization approach outperforms existing layered systems that separate recommendation and delivery, especially when the wireless network is operating at maximum capacity. By utilizing a limited number of transmission modes, we significantly reduce the complexity of the optimization. We also introduce the design of a hybrid system that handles transmissions for both system-recommended contents ('push') and active user requests ('pull'). Further, we extend the joint optimization framework to a wireless infrastructure with multiple base stations. The problem becomes much harder in that there are many more system configurations, including but not limited to power allocation and how resources are shared among the base stations ('out-of-band', in which base stations transmit with dedicated spectrum resources and thus without interference; and 'in-band', in which they share the spectrum and need to mitigate interference). We propose a scalable two-phase scheduling framework: 1) each base station obtains delivery decisions and resource allocation individually; 2) the system consolidates the decisions and allocations, reducing redundant transmissions. Additionally, if social network applications could provide predictions of how social contents disseminate, the wireless networks could schedule the transmissions accordingly and significantly improve dissemination performance by reducing the delivery delay. We propose a novel method utilizing: 1) hybrid systems to handle active disseminating requests; and 2) predictions of dissemination dynamics from the social network applications. This method can mitigate the performance degradation for content dissemination due to wireless delivery delay. Results indicate that our proposed system design is both efficient and easy to implement.
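As a toy illustration of the reward-versus-capacity trade-off such a scheduler navigates (the dissertation formulates a full joint optimization; the greedy rule and all names below are mine):

```python
def schedule_multicast(contents, capacity):
    """Greedy sketch: choose contents to multicast within a capacity
    budget, scoring each by total reward across interested users per
    unit of bandwidth. One multicast serves all interested users.

    `contents` maps content id -> (size, [per-user rewards])."""
    scored = sorted(contents.items(),
                    key=lambda kv: sum(kv[1][1]) / kv[1][0],
                    reverse=True)
    plan, used = [], 0
    for cid, (size, rewards) in scored:
        if used + size <= capacity:
            plan.append(cid)
            used += size
    return plan

# e.g. schedule_multicast({"a": (4, [1, 1, 1]), "b": (2, [3])}, 5)
# picks "b" first (reward/size 1.5), then fits nothing else but "b".
```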
Abstract:
Seventh grade students share personal characteristics that are analyzed in this paper in terms of the teachers' performance of the proper mediation for the students' learning. The paper is based on the results of an investigation carried out with the model of teachers' profiles at this level (Alfaro et al., 2008a). We conclude that the biopsychosocial development and maturational stage of children in transition from preteen to teenager must be mastered by the faculty at this level, as well as the socioeconomic context of the family and the social contexts of their students.
Abstract:
A recent integral-field spectroscopic (IFS) survey, the MASSIVE survey (Ma et al. 2014), observed the 116 most massive (MK < −25.3 mag, stellar mass M∗ > 10^11.6 M⊙) early-type galaxies (ETGs) within 108 Mpc, out to radii as large as 40 kpc, corresponding to ∼2-3 effective radii (Re). One of the major findings of the MASSIVE survey is that the galaxy sample is split nearly equally among three groups showing three different velocity dispersion profiles σ(R) outside a radius of ∼5 kpc (falling, flat and rising with radius). The purpose of this thesis is to model the kinematic profiles of six ETGs included in the MASSIVE survey and representative of the three observed σ(R) shapes, with the aim of investigating their dynamical structure. Models for the chosen galaxies are built using the numerical code JASMINE (Posacki, Pellegrini, and Ciotti 2013). The code produces models of axisymmetric galaxies, based on the solution of the Jeans equations for a multicomponent gravitational potential (supermassive black hole, stars and dark matter halo). With the aim of obtaining a good agreement between the kinematics derived from the Jeans equations and the observed σ and rotation velocity V of MASSIVE (Veale et al. 2016, 2018), I derived constraints on the dark matter distribution and orbital anisotropy. This work suggests a trend of the dark matter amount and distribution with the shape of the velocity dispersion profile in the outer regions: the models of galaxies with flat or rising velocity dispersion profiles have higher dark matter fractions fDM both within 1 Re and within 5 Re. Orbital anisotropy alone cannot account for the different observed trends of σ(R) and has a minor effect compared to variations of the mass profile. Galaxies with similar stellar mass M∗ that show different velocity dispersion profiles (from falling to rising) are successfully modelled with a variation of the halo mass Mh.
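For reference, the axisymmetric Jeans equations that such codes solve take the standard form below, with ν the stellar density, Φ the total gravitational potential, and overbars denoting velocity moments (the specific closure assumptions made in JASMINE may differ):

```latex
\frac{\partial(\nu \overline{v_R^2})}{\partial R}
+ \frac{\partial(\nu \overline{v_R v_z})}{\partial z}
+ \nu \left( \frac{\overline{v_R^2} - \overline{v_\phi^2}}{R}
+ \frac{\partial \Phi}{\partial R} \right) = 0,
\qquad
\frac{\partial(\nu \overline{v_R v_z})}{\partial R}
+ \frac{\partial(\nu \overline{v_z^2})}{\partial z}
+ \frac{\nu \overline{v_R v_z}}{R}
+ \nu \frac{\partial \Phi}{\partial z} = 0
```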
Abstract:
Since the majority of the world's population lives in cities, and this share is expected to increase in the coming years, one of the biggest research challenges is determining the risk deriving from the high temperatures experienced in urban areas, together with improving responses to climate-related disasters, for example by introducing into the urban context vegetation or built infrastructures that can improve air quality. In this work, we investigate how different setups of the boundary and initial conditions imposed on an urban canyon generate different patterns of pollutant dispersion. To do so, we exploit the low computational cost of Reynolds-Averaged Navier-Stokes (RANS) simulations to reproduce the dynamics of an infinite array of two-dimensional square urban canyons. A pollutant is released at street level to mimic the presence of traffic. RANS simulations are run using the k-ε closure model, and vertical profiles of significant variables of the urban canyon, namely the velocity, the turbulent kinetic energy, and the concentration, are presented. This is done using the open-source software OpenFOAM, modifying the standard solver simpleFoam to include the concentration equation and the temperature, the latter by introducing a buoyancy term in the governing equations. The results of the simulation are validated against experimental results and Large-Eddy Simulation (LES) products from previous works, showing that the simulation is able to reproduce all the quantities under examination with satisfactory accuracy. Moreover, this comparison shows that although LES is known to be more accurate, albeit more expensive, RANS simulations represent a reliable tool when a smaller computational cost is required. Overall, this work exploits the low computational cost of RANS simulations to produce multiple scenarios useful for evaluating how the dispersion of a pollutant changes with modifications of key variables, such as the temperature.
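The added concentration equation is presumably a steady advection-diffusion transport equation for a passive scalar with a turbulent diffusivity, of the standard form below (my reconstruction; Sc_t is the turbulent Schmidt number and S_c the street-level source term; the exact OpenFOAM implementation may differ):

```latex
\frac{\partial (\bar{u}_j \bar{c})}{\partial x_j}
= \frac{\partial}{\partial x_j}\!\left[\left(D + \frac{\nu_t}{Sc_t}\right)
\frac{\partial \bar{c}}{\partial x_j}\right] + S_c
```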
Abstract:
Semantic Web Mining aims at combining the two fast-developing research areas Semantic Web and Web Mining. This survey analyzes the convergence of trends from both areas: an increasing number of researchers are working on improving the results of Web Mining by exploiting semantic structures in the Web, and they make use of Web Mining techniques for building the Semantic Web. Last but not least, these techniques can be used for mining the Semantic Web itself. The Semantic Web is the second-generation WWW, enriched by machine-processable information which supports the user in their tasks. Given the enormous size even of today's Web, it is impossible to manually enrich all of these resources. Therefore, automated schemes for learning the relevant information are increasingly being used. Web Mining aims at discovering insights about the meaning of Web resources and their usage. Given the primarily syntactical nature of the data being mined, the discovery of meaning is impossible based on these data alone. Therefore, formalizations of the semantics of Web sites and navigation behavior are becoming more and more common. Furthermore, mining the Semantic Web itself is another upcoming application. We argue that the two areas Web Mining and Semantic Web need each other to fulfill their goals, but that the full potential of this convergence is not yet realized. This paper gives an overview of where the two areas meet today, and sketches ways in which a closer integration could be profitable.