881 results for data generation
Abstract:
Personalized recommender systems aim to assist users in retrieving and accessing interesting items by automatically acquiring user preferences from historical data and matching items with those preferences. In the last decade, recommendation services have gained great attention due to the problem of information overload. However, despite recent advances in personalization techniques, several critical issues in modern recommender systems have not been well studied. These issues include: (1) understanding the accessing patterns of users (i.e., how to effectively model users' accessing behaviors); (2) understanding the relations between users and other objects (i.e., how to comprehensively assess the complex correlations between users and entities in recommender systems); and (3) understanding the interest change of users (i.e., how to adaptively capture users' preference drift over time). To meet the needs of users in modern recommender systems, it is imperative to provide solutions that address the aforementioned issues and to apply those solutions to real-world applications.

The major goal of this dissertation is to provide integrated recommendation approaches to tackle the challenges of the current generation of recommender systems. In particular, three user-oriented aspects of recommendation techniques were studied: understanding accessing patterns, understanding complex relations and understanding temporal dynamics. To this end, we made three research contributions. First, we presented various personalized user profiling algorithms to capture the click behaviors of users at both coarse and fine granularities; second, we proposed graph-based recommendation models to describe the complex correlations in a recommender system; third, we studied temporal recommendation approaches in order to capture the preference changes of users, by considering both long-term and short-term user profiles.
In addition, a versatile recommendation framework was proposed, in which the proposed recommendation techniques were seamlessly integrated. Different evaluation criteria were implemented in this framework for evaluating recommendation techniques in real-world recommendation applications.

In summary, the frequent changes of user interests and item repositories lead to a series of user-centric challenges that are not well addressed in the current generation of recommender systems. My work proposed reasonable solutions to these challenges and provided insights on how to address them using a simple yet effective recommendation framework.
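The graph-based recommendation idea above can be illustrated with a random walk with restart (personalized PageRank) on a tiny user-item bipartite graph. This is a generic sketch under assumed inputs, not the dissertation's actual model; the graph, damping factor, and node names are all invented for illustration.

```python
# Sketch: personalized PageRank (random walk with restart) over a toy
# user-item bipartite graph, scoring items for one user. All inputs are
# illustrative assumptions, not the dissertation's model or data.

def personalized_pagerank(edges, source, alpha=0.85, iters=50):
    """Random walk with restart from `source`; returns node -> score."""
    nodes = {n for e in edges for n in e}
    out = {n: [] for n in nodes}
    for u, v in edges:                 # treat interaction edges as undirected
        out[u].append(v)
        out[v].append(u)
    rank = {n: (1.0 if n == source else 0.0) for n in nodes}
    for _ in range(iters):
        nxt = {n: ((1 - alpha) if n == source else 0.0) for n in nodes}
        for n in nodes:
            share = alpha * rank[n] / len(out[n])
            for m in out[n]:
                nxt[m] += share
        rank = nxt
    return rank

# Users u1 and u2 both clicked item i1; i2 was clicked only by u2.
edges = [("u1", "i1"), ("u2", "i1"), ("u2", "i2")]
scores = personalized_pagerank(edges, "u1")
# Rank the items u1 has not interacted with; i2 is reachable via i1.
recs = sorted((i for i in scores if i.startswith("i") and ("u1", i) not in edges),
              key=lambda i: -scores[i])
```

The walk propagates preference mass through shared items, which is the intuition behind using graph structure to capture indirect user-item correlations.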
Abstract:
Recently, wireless network technology has grown at such a pace that scientific research has become a practical reality in a very short time span. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. The researcher considers the ability to guarantee reliable communications at high data rates, together with high efficiency in spectrum usage, to be among the most important characteristics of future 4G mobile systems. On mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable. The results show that a good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. The dissertation proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) for the satellite link, increasing the time diversity. This technique will allow for an increase of the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation approaches the use of Cooperative Satellite Communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. The issue of Cooperative Satellite Communications is addressed through a new algorithm that forwards the received data from the fixed node to the mobile node. This algorithm is very efficient because it does not allow unnecessary transmissions and is based on signal-to-noise ratio (SNR) measurements.
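The SNR-gated forwarding idea can be sketched as a simple threshold decision: the fixed node relays a packet to the mobile node only when the measured link SNR clears a threshold, avoiding transmissions likely to fail. The threshold and power values below are illustrative assumptions, not the dissertation's algorithm parameters.

```python
# Sketch: forward-only-above-threshold relay decision based on SNR.
# Threshold and power values are invented for illustration.
import math

def snr_db(signal_power, noise_power):
    """SNR in decibels from linear signal and noise powers."""
    return 10 * math.log10(signal_power / noise_power)

def should_forward(signal_power, noise_power, threshold_db=10.0):
    """Relay the packet only if the link SNR meets the threshold."""
    return snr_db(signal_power, noise_power) >= threshold_db

# Strong link (SNR = 20 dB) is relayed; weak link (SNR ~ 3 dB) is not.
decisions = [should_forward(100.0, 1.0), should_forward(2.0, 1.0)]
```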
Abstract:
Next-generation sequencing (NGS) technologies have enabled us to determine phytoplankton community compositions at high resolution. However, few studies have adopted this approach to assess the responses of natural phytoplankton communities to environmental change. Here, we report the impact of different CO2 levels on spring diatoms in the Oyashio region of the western North Pacific as estimated by NGS of the diatom-specific rbcL gene (DNA), which encodes the large subunit of RubisCO. We also examined the abundance and composition of rbcL transcripts (cDNA) in diatoms to assess their physiological responses to changing CO2 levels. A short-term (3-day) incubation experiment was carried out on-deck using surface Oyashio waters under different pCO2 levels (180, 350, 750, and 1000 µatm) in May 2011. During the incubation, the transcript abundance of the diatom-specific rbcL gene decreased with an increase in seawater pCO2 levels. These results suggest that the CO2 fixation capacity of diatoms decreased rapidly under elevated CO2 levels. In the high CO2 treatments (750 and 1000 µatm), the diversity of the diatom-specific rbcL gene and its transcripts decreased relative to the control treatment (350 µatm), as did the contributions of Chaetocerataceae, Thalassiosiraceae, and Fragilariaceae to the total population, while the contributions of Bacillariaceae increased. In the low CO2 treatment, contributions of Bacillariaceae also increased together with other eukaryotes. These results suggest that changes in CO2 levels can alter the community composition of spring diatoms in the Oyashio region. Overall, the NGS technology provided us with a deeper understanding of the response of diatoms to changes in CO2 levels in terms of their community composition, diversity, and photosynthetic physiology.
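Diversity comparisons of this kind typically rest on standard indices computed from read counts. A minimal sketch of the Shannon index H' on hypothetical rbcL amplicon counts per diatom family (the counts are invented, not the study's data):

```python
# Sketch: Shannon diversity index H' = -sum(p_i * ln p_i) over taxa.
# The read counts below are made-up illustrations.
import math

def shannon_index(counts):
    """Shannon index from a list of per-taxon read counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

control = [40, 30, 20, 10]      # four families, fairly even community
high_co2 = [85, 5, 5, 5]        # one family dominates (e.g. Bacillariaceae)
h_control = shannon_index(control)
h_high = shannon_index(high_co2)
# A more even community yields a higher H', matching a diversity decrease
# when one family comes to dominate under elevated CO2.
```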
Abstract:
The main focus of this research is to design and develop a high performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four-bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. Firstly, a detailed kinematic analysis of the actuator is presented and kinematic performance is evaluated through MATLAB simulations. A dynamic equation of the actuator is derived using the Lagrangian dynamic formulation. A SIMULINK control model of the actuator is developed using the dynamic equation. In addition, Bond Graph methodology is presented for the dynamic simulation. The Bond Graph model comprises individual component modeling of the actuator along with control. The required torque was simulated using the Bond Graph model. Results indicate that high acceleration (around 20 g) can be achieved with modest (3 N·m or less) torque input. A practical prototype of the actuator is designed using SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10 g at the middle point of the travel length, when the end effector travels the stroke length (around 1 m). The actuator is primarily designed to operate in standalone condition and later to be used in the 3RPR parallel robot. A DC motor is used to operate the actuator. A quadrature encoder is attached to the DC motor to control the end effector. The associated control scheme of the actuator is analyzed and integrated with the physical prototype. From standalone experimentation of the actuator, around 17 g acceleration was achieved by the end effector (stroke length was 0.2 m to 0.78 m).
Results indicate that the developed dynamic model is in good agreement with the experimental results. Finally, a Design of Experiments (DOE) based statistical approach is also introduced to identify the parametric combination that yields the greatest performance. Data are collected using the Bond Graph model. This approach is helpful in designing the actuator without much complexity.
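The geometric leverage of a rhombus (scissor) linkage can be illustrated with a minimal kinematic sketch: for a rhombus of side L driven at joint angle theta, the end-effector displacement along the stroke axis is the diagonal x = 2L cos(theta), and differentiating twice shows how modest joint motion maps to large end-effector acceleration as the rhombus flattens. The bar length and motion profile below are invented placeholders, not the thesis design values.

```python
# Sketch: end-effector kinematics of an idealised rhombus linkage.
# x(theta) = 2L cos(theta)  ->  x'' = -2L (alpha sin(theta) + omega^2 cos(theta))
# L, omega, and the angles are illustrative assumptions.
import math

L = 0.5                                  # bar length in metres (assumed)

def x(theta):
    """End-effector displacement along the stroke axis."""
    return 2 * L * math.cos(theta)

def accel(theta, omega, alpha):
    """Second time derivative of x for joint rate omega, joint accel alpha."""
    return -2 * L * (alpha * math.sin(theta) + omega**2 * math.cos(theta))

# At constant joint speed (alpha = 0), the acceleration magnitude grows as
# the rhombus flattens (theta -> 0), since cos(theta) -> 1.
a_mid = abs(accel(math.radians(60), omega=10.0, alpha=0.0))
a_flat = abs(accel(math.radians(10), omega=10.0, alpha=0.0))
```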
Abstract:
Article Accepted Date: 29 May 2014 Acknowledgements The authors gratefully acknowledge the support of the Cognitive Science Society for the organisation of the Workshop on Production of Referring Expressions: Bridging the Gap between Cognitive and Computational Approaches to Reference, from which this special issue originated. Funding Emiel Krahmer and Albert Gatt thank The Netherlands Organisation for Scientific Research (NWO) for VICI grant Bridging the Gap between Computational Linguistics and Psycholinguistics: The Case of Referring Expressions (grant number 277-70-007).
Abstract:
The principal aim of this paper is to examine the criteria assisting in the selection of biomass for energy generation in Brazil. To this end, the paper adopts case study and survey research methods to collect information from four biomass energy case companies and solicits opinions from experts. The data gathered are analysed in line with a wide range of related data, including selection criteria for biomass and their importance, energy policies in Brazil, the availability of biomass feedstock in Brazil and its characteristics, as well as the status quo of biomass-based energy in Brazil. The findings of the paper demonstrate that there are ten main criteria in biomass selection for energy generation in Brazil. They comprise geographical conditions, availability of biomass feedstock, demand satisfaction, feedstock costs and oil prices, energy content of biomass feedstock, business and economic growth, CO2 emissions of biomass end-products, effects on soil, water and biodiversity, job creation and local community support, as well as conversion technologies. Furthermore, the research also found that these main criteria cannot be grouped on the basis of sustainability criteria, nor ranked by their importance, as there are correlations between the criteria, such as cause-and-effect relationships, as well as some overlapping areas. Consequently, when selecting biomass, a more comprehensive consideration is advisable.
Abstract:
This paper analyses the impact of stimulating staff creativity and idea generation on the likelihood of innovation. Using data for over 3,000 firms, obtained from the Irish Community Innovation Survey 2008-10, we examine the impact of six creativity generating stimuli on product, process, organisational, and marketing innovation. Our results indicate that the stimuli impact the four forms of innovation in different ways. For instance, brainstorming and multidisciplinary teams are found to stimulate all forms of innovation, rotation of employees is found to stimulate organisational innovation, while financial and non-financial incentives are found to have no effect on any form of innovation. We also find that the co-introduction of two or more stimuli increases the likelihood of innovation more than implementing stimuli in isolation. These results have important implications for management decisions in that they suggest that firms should target their creative efforts towards specific innovation outcomes.
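Likelihood-of-innovation analyses of this kind are typically binary-outcome regressions. A toy sketch with a hand-rolled logistic regression on synthetic data (the data-generating process and coefficients are invented; this is not the CIS microdata or the paper's specification):

```python
# Sketch: logistic regression of an innovation indicator on two stimulus
# dummies, fitted by plain gradient ascent. All data are synthetic.
import math
import random

random.seed(0)
data = []
for _ in range(400):
    b = random.randint(0, 1)             # brainstorming dummy (assumed)
    t = random.randint(0, 1)             # multidisciplinary-teams dummy
    logit = -1.0 + 1.2 * b + 0.8 * t     # invented "true" positive effects
    y = 1.0 if random.random() < 1 / (1 + math.exp(-logit)) else 0.0
    data.append((b, t, y))

w = [0.0, 0.0, 0.0]                      # intercept, b-effect, t-effect
for _ in range(2000):
    g = [0.0, 0.0, 0.0]
    for b, t, y in data:
        p = 1 / (1 + math.exp(-(w[0] + w[1] * b + w[2] * t)))
        for j, xj in enumerate((1.0, b, t)):
            g[j] += (y - p) * xj         # log-likelihood gradient
    for j in range(3):
        w[j] += 0.05 * g[j] / len(data)
# w[1] and w[2] recover positive stimulus effects on the log-odds.
```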
Abstract:
As the Web evolves unexpectedly fast, information grows explosively. Useful resources become more and more difficult to find because of their dynamic and unstructured characteristics. A vertical search engine is designed and implemented for a specific domain. Instead of processing the giant volume of miscellaneous information distributed across the Web, a vertical search engine aims to identify relevant information in specific domains or topics and ultimately provides users with up-to-date information, highly focused insights and actionable knowledge representation. As mobile devices become more popular, the nature of search is changing. Acquiring information on a mobile device poses unique requirements on traditional search engines, which will potentially change every feature they used to have. To summarize, users strongly expect search engines that can satisfy their individual information needs, adapt to their current situation, and present highly personalized search results. In my research, the next-generation vertical search engine aims to utilize and enrich existing domain information to close the loop of the vertical search engine system, mutually facilitating knowledge discovery, actionable information extraction, and user-interest modeling and recommendation. I investigate three problems in which domain taxonomy plays an important role: taxonomy generation using a vertical search engine, actionable information extraction based on domain taxonomy, and the use of ensemble taxonomy to capture users' interests. As the underlying theory, ultrametrics, dendrograms, and hierarchical clustering are discussed in depth. Methods for taxonomy generation based on my research on hierarchical clustering are developed. The related vertical search engine techniques are applied in practice in the disaster management domain. In particular, three disaster information management systems were developed as real use cases of my research work.
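The taxonomy-generation step relies on hierarchical clustering, whose merge heights form an ultrametric (a dendrogram). A minimal single-linkage sketch over a toy term-distance matrix (the terms and distances are invented for illustration, not the dissertation's data):

```python
# Sketch: single-linkage agglomerative clustering; the sequence of merge
# heights defines a dendrogram / ultrametric over the terms.

def single_linkage(dist):
    """dist: {(a, b): d} with keys sorted. Returns [(height, members)]."""
    clusters = {frozenset([n]) for pair in dist for n in pair}

    def d(c1, c2):
        return min(dist[tuple(sorted((a, b)))] for a in c1 for b in c2)

    merges = []
    while len(clusters) > 1:
        c1, c2 = min(((x, y) for x in clusters for y in clusters if x != y),
                     key=lambda p: d(*p))
        merges.append((d(c1, c2), sorted(c1 | c2)))
        clusters = (clusters - {c1, c2}) | {c1 | c2}
    return merges

# Invented pairwise distances between domain terms:
dist = {("flood", "hurricane"): 2.0,
        ("flood", "sensor"): 9.0,
        ("hurricane", "sensor"): 8.0}
merges = single_linkage(dist)
# First merge joins the two closest terms at height 2.0; the final merge
# height (8.0) is the single-linkage (min) distance to the last cluster.
```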
Abstract:
Halo white dwarfs remain one of the least studied stellar populations in the Milky Way because of their faint luminosities. Recent work has uncovered a population of hot white dwarfs which are thought to be remnants of low-mass Population II stars. This thesis uses optical data from the Next Generation Virgo Cluster Survey (NGVS) and ultraviolet data from the GALEX Ultraviolet Virgo Cluster Survey (GUViCS) to select candidates which may belong to this population of recently formed halo white dwarfs. A colour selection was used to separate white dwarfs from QSOs and main-sequence stars. Photometric distances are calculated using model colour-absolute magnitude relations. Proper motions are calculated by using the difference in positions between objects from the Sloan Digital Sky Survey and the NGVS. The proper motions are combined with the calculated photometric distances to calculate tangential velocities, as well as approximate Galactic space velocities. White dwarf candidates are characterized as belonging to either the disk or the halo using a variety of methods, including calculated scale heights (z > 1 kpc), tangential velocities (vt > 200 km/s), and their location in (V, U) space. The 20 halo white dwarf candidates which were selected using Galactic space velocities are analyzed, and their colours and temperatures suggest that these objects represent some of the youngest white dwarfs in the Galactic halo.
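The tangential-velocity cut follows from the standard relation v_t = 4.74 µ d, with proper motion µ in arcsec/yr and distance d in parsecs (4.74 converts AU/yr to km/s). A sketch with an invented candidate:

```python
# Sketch: tangential velocity from proper motion and photometric distance.
# The candidate's mu and distance are invented for illustration.

def tangential_velocity(mu_arcsec_per_yr, distance_pc):
    """v_t in km/s; the factor 4.74 converts AU/yr to km/s."""
    return 4.74 * mu_arcsec_per_yr * distance_pc

# A hypothetical candidate at 500 pc moving 0.1 arcsec/yr:
vt = tangential_velocity(0.1, 500.0)
is_halo_candidate = vt > 200.0          # the thesis's vt criterion
```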
Abstract:
Bioscience subjects require a significant amount of training in laboratory techniques to produce highly skilled science graduates. Many techniques which are currently used in diagnostic, research and industrial laboratories require expensive single-user equipment; examples include next generation sequencing, quantitative PCR, mass spectrometry and other analytical techniques. The cost of the machines and reagents and limited access frequently preclude undergraduate students from using such cutting-edge techniques. In addition to cost and availability, the time taken for analytical runs on equipment such as High Performance Liquid Chromatography (HPLC) does not necessarily fit with the limitations of timetabling. Understanding the theory underlying these techniques without the accompanying practical classes can be unexciting for students. One alternative to wet laboratory provision is to use virtual simulations of such practicals, which enable students to see the machines and interact with them to generate data. The Faculty of Science and Technology at the University of Westminster has provided all second and third year undergraduate students with iPads so that these students all have access to a mobile device to assist with learning. We have purchased licences from Labster to access a range of virtual laboratory simulations. These virtual laboratories are fully equipped and require student responses to multiple-choice questions in order to progress through the experiment. In a pilot study to look at the feasibility of the Labster virtual laboratory simulations with the iPad devices, second year Biological Science students (n=36) worked through the Labster HPLC simulation on iPads. The virtual HPLC simulation enabled students to optimise the conditions for the separation of drugs. Answers to multiple-choice questions were necessary to progress through the simulation; these focussed on the underlying principles of the HPLC technique.
Following the virtual laboratory simulation, students went to a real HPLC in the analytical suite in order to separate aspirin, caffeine and paracetamol. In a survey, 100% of students (n=36) in this cohort agreed that the Labster virtual simulation had helped them to understand HPLC. In free text responses, one student commented that "The terminology is very clear and I enjoyed using Labster very much". One member of staff commented that "there was a very good knowledge interaction with the virtual practical".
Abstract:
Multi-frequency eddy current measurements are employed in estimating pressure tube (PT) to calandria tube (CT) gap in CANDU fuel channels, a critical inspection activity required to ensure fitness for service of fuel channels. In this thesis, a comprehensive characterization of eddy current gap data is laid out, in order to extract further information on fuel channel condition, and to identify generalized applications for multi-frequency eddy current data. A surface profiling technique, generalizable to multiple probe and conductive material configurations has been developed. This technique has allowed for identification of various pressure tube artefacts, has been independently validated (using ultrasonic measurements), and has been deployed and commissioned at Ontario Power Generation. Dodd and Deeds solutions to the electromagnetic boundary value problem associated with the PT to CT gap probe configuration were experimentally validated for amplitude response to changes in gap. Using the validated Dodd and Deeds solutions, principal components analysis (PCA) has been employed to identify independence and redundancies in multi-frequency eddy current data. This has allowed for an enhanced visualization of factors affecting gap measurement. Results of the PCA of simulation data are consistent with the skin depth equation, and are validated against PCA of physical experiments. Finally, compressed data acquisition has been realized, allowing faster data acquisition for multi-frequency eddy current systems with hardware limitations, and is generalizable to other applications where real time acquisition of large data sets is prohibitive.
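The PCA step for identifying independence and redundancy across frequencies can be sketched with a plain SVD on mean-centred channel data: highly correlated frequencies collapse onto one principal component, with redundancy showing up as small trailing singular values. The synthetic "channels" below are invented, not eddy current measurements.

```python
# Sketch: PCA via SVD on synthetic multi-channel data. Two channels carry
# the same underlying "gap" signal, so one component dominates.
import numpy as np

rng = np.random.default_rng(1)
gap = rng.random(200)                        # underlying gap-like signal
X = np.column_stack([gap,                    # channel 1
                     2.0 * gap + 0.5,        # channel 2: redundant copy
                     gap + 0.01 * rng.random(200)])  # channel 3: near copy
Xc = X - X.mean(axis=0)                      # mean-centre each channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)              # variance explained per PC
# explained[0] close to 1: the three channels are essentially one factor,
# mirroring how PCA exposes redundant frequencies in the gap data.
```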
Abstract:
Bitumen extraction from surface-mined oil sands results in the production of large volumes of Fluid Fine Tailings (FFT). Through Directive 085, the Province of Alberta has signaled that oil sands operators must improve and accelerate the methods by which they deal with FFT production, storage and treatment. This thesis aims to develop an enhanced method to forecast FFT production based on specific ore characteristics. A mass relationship and mathematical model that modifies the Forecasting Tailings Model (FTM) by using fines and clay boundaries, the two main indicators of FFT accumulation, has been developed. The modified FTM has been applied to representative block model data from an operating oil sands mining venture. An attempt has been made to identify order-of-magnitude associated tailings treatment costs, and to improve financial performance by not processing materials whose ultimate ore processing and tailings storage and treatment costs exceed the value of the bitumen they produce. The results on the real case study show a 53% reduction in total tailings accumulation over the mine life by selectively processing only lower tailings generating materials, eliminating the 15% of total mined ore materials with the highest potential fluid fines inventory. This significant result will help assess the impact of Directive 082 on the economic and environmental performance of mining projects, towards their sustainable development.
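The fines-based forecasting idea can be sketched as a stylised mass relationship: FFT volume per processed tonne scales with the ore's fines content, so a cutoff on fines fraction screens out high-tailings blocks. The linear coefficient and cutoff below are invented placeholders, not the modified FTM's calibrated values.

```python
# Sketch: hypothetical fines-driven FFT forecast with block screening.
# The coefficient k and the fines cutoff are illustrative assumptions.

def fft_volume_m3(ore_tonnes, fines_fraction, k=0.9):
    """Hypothetical: FFT volume proportional to fines mass (k m^3/tonne)."""
    return k * ore_tonnes * fines_fraction

# (tonnes, fines fraction) per block -- invented block model data:
blocks = [(1000.0, 0.10), (1000.0, 0.45), (1000.0, 0.20)]
cutoff = 0.30                    # assumed fines screening threshold
processed = [b for b in blocks if b[1] <= cutoff]
fft_total = sum(fft_volume_m3(t, f) for t, f in processed)
# Screening out the high-fines block sharply reduces forecast FFT volume.
```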
Abstract:
Advocates of Big Data assert that we are in the midst of an epistemological revolution, promising the displacement of the modernist methodological hegemony of causal analysis and theory generation. It is alleged that the growing ‘deluge’ of digitally generated data, and the development of computational algorithms to analyse them, has enabled new inductive ways of accessing everyday relational interactions through their ‘datafication’. This paper critically engages with these discourses of Big Data and complexity, particularly as they operate in the discipline of International Relations, where it is alleged that Big Data approaches have the potential for developing self-governing societal capacities for resilience and adaptation through the real-time reflexive awareness and management of risks and problems as they arise. The epistemological and ontological assumptions underpinning Big Data are then analysed to suggest that critical and posthumanist approaches have come of age through these discourses, enabling process-based and relational understandings to be translated into policy and governance practices. The paper thus raises some questions for the development of critical approaches to new posthuman forms of governance and knowledge production.
Abstract:
This research paper presents a five-step algorithm to generate tool paths for machining Free form / Irregular Contoured Surface(s) (FICS) by adopting the STEP-NC (AP-238) format. In the first step, a parametrized CAD model with FICS is created or imported in the UG-NX6.0 CAD package. The second step recognizes the features and calculates a Closeness Index (CI) by comparing them with B-Spline / Bezier surfaces. The third step utilizes the CI and extracts the necessary data to formulate the blending functions for the identified features. In the fourth step, Z-level 5-axis tool paths are generated using flat and ball end mill cutters. Finally, in the fifth step, the tool paths are integrated with the STEP-NC format and validated. All these steps are discussed and explained through a validated industrial component.
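The second step's comparison against Bezier surfaces rests on evaluating a patch from its control net via the Bernstein basis. A minimal sketch with an invented 3x3 control grid of heights (not the paper's component geometry):

```python
# Sketch: tensor-product Bezier patch evaluation (Bernstein basis).
# The control heights are invented; Z-level tool paths would then slice
# the resulting height field at fixed z values.
from math import comb

def bernstein(n, i, t):
    """Bernstein polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_height(ctrl, u, v):
    """z(u, v) for an (n+1) x (m+1) grid of control heights."""
    n, m = len(ctrl) - 1, len(ctrl[0]) - 1
    return sum(bernstein(n, i, u) * bernstein(m, j, v) * ctrl[i][j]
               for i in range(n + 1) for j in range(m + 1))

ctrl = [[0.0, 1.0, 0.0],
        [1.0, 2.0, 1.0],
        [0.0, 1.0, 0.0]]            # a gentle bump (invented)
z_centre = bezier_height(ctrl, 0.5, 0.5)
z_corner = bezier_height(ctrl, 0.0, 0.0)
```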
Abstract:
Here, we describe gene expression compositional assignment (GECA), a powerful, yet simple method based on compositional statistics that can validate the transfer of prior knowledge, such as gene lists, into independent data sets, platforms and technologies. Transcriptional profiling has been used to derive gene lists that stratify patients into prognostic molecular subgroups and assess biomarker performance in the pre-clinical setting. Archived public data sets are an invaluable resource for subsequent in silico validation, though their use can lead to data integration issues. We show that GECA can be used without the need for normalising expression levels between data sets and can outperform rank-based correlation methods. To validate GECA, we demonstrate its success in the cross-platform transfer of gene lists in different domains including: bladder cancer staging, tumour site of origin and mislabelled cell lines. We also show its effectiveness in transferring an epithelial ovarian cancer prognostic gene signature across technologies, from a microarray to a next-generation sequencing setting. In a final case study, we predict the tumour site of origin and histopathology of epithelial ovarian cancer cell lines. In particular, we identify and validate the commonly-used cell line OVCAR-5 as non-ovarian, being gastrointestinal in origin. GECA is available as an open-source R package.
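The compositional viewpoint underlying GECA treats an expression profile as relative parts, which is what makes cross-platform comparison possible without normalising absolute levels. A generic sketch (not GECA itself) of the centred log-ratio (clr) transform and the Aitchison distance, which is invariant to overall scale:

```python
# Sketch: clr transform and Aitchison distance between compositions.
# Generic compositional statistics, not the GECA method; profiles invented.
import math

def clr(parts):
    """Centred log-ratio: log of each part over the geometric mean."""
    g = math.exp(sum(math.log(p) for p in parts) / len(parts))
    return [math.log(p / g) for p in parts]

def aitchison(x, y):
    """Euclidean distance in clr space."""
    return math.dist(clr(x), clr(y))

a = [10.0, 20.0, 70.0]
b = [1.0, 2.0, 7.0]        # same composition at a different scale
c = [70.0, 20.0, 10.0]     # genuinely different composition
d_ab = aitchison(a, b)     # ~0: platform-level scaling is invisible
d_ac = aitchison(a, c)
```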