756 results for Grid-based clustering approach
Abstract:
This thesis investigates how web search evaluation can be improved using historical interaction data. Modern search engines combine offline and online evaluation approaches in a sequence of steps that a tested change needs to pass through to be accepted as an improvement and subsequently deployed. We refer to such a sequence of steps as an evaluation pipeline, and consider it to contain three sequential steps: an offline evaluation step, an online evaluation scheduling step, and an online evaluation step. We show that historical user interaction data can aid in improving the accuracy or efficiency of each of these steps, and that, as a result, the overall efficiency of the entire evaluation pipeline is increased. Firstly, we investigate how user interaction data can be used to build accurate offline evaluation methods for query auto-completion mechanisms. We propose a family of offline evaluation metrics for query auto-completion that represent the effort the user has to spend in order to submit their query. The parameters of our proposed metrics are trained against a set of user interactions recorded in the search engine’s query logs. From our experimental study, we observe that our proposed metrics are significantly more correlated with an online user satisfaction indicator than the metrics proposed in the existing literature. Hence, fewer changes will pass the offline evaluation step only to be rejected after the online evaluation step, which allows a higher efficiency of the entire evaluation pipeline. Secondly, we state the problem of the optimised scheduling of online experiments. We tackle this problem by considering a greedy scheduler that prioritises the evaluation queue according to the predicted likelihood of success of a particular experiment.
This predictor is trained on a set of online experiments, and uses a diverse set of features to represent an online experiment. Our study demonstrates that a higher number of successful experiments per unit of time can be achieved by deploying such a scheduler on the second step of the evaluation pipeline. Consequently, we argue that the efficiency of the evaluation pipeline can be increased. Next, to improve the efficiency of the online evaluation step, we propose the Generalised Team Draft interleaving framework. Generalised Team Draft considers both the interleaving policy (how often a particular combination of results is shown) and click scoring (how important each click is) as parameters in a data-driven optimisation of the interleaving sensitivity. Further, Generalised Team Draft is applicable beyond domains with a list-based representation of results, e.g. in domains with a grid-based representation, such as image search. Our study using datasets of interleaving experiments performed in both document and image search domains demonstrates that Generalised Team Draft achieves the highest sensitivity. A higher sensitivity indicates that the interleaving experiments can be deployed for a shorter period of time or use a smaller sample of users. Importantly, Generalised Team Draft optimises the interleaving parameters w.r.t. historical interaction data recorded in the interleaving experiments. Finally, we propose to apply sequential testing methods to reduce the mean deployment time of the interleaving experiments. We adapt two sequential tests for interleaving experimentation, and demonstrate that a significant decrease in experiment duration can be achieved by using them. The highest efficiency is achieved by the sequential tests that adjust their stopping thresholds using historical interaction data recorded in diagnostic experiments.
Our further experimental study demonstrates that cumulative gains in online experimentation efficiency can be achieved by combining the interleaving sensitivity optimisation approaches, including Generalised Team Draft, with the sequential testing approaches. Overall, the central contributions of this thesis are the proposed approaches to improve the accuracy or efficiency of the steps of the evaluation pipeline: the offline evaluation frameworks for query auto-completion, an approach for the optimised scheduling of online experiments, a general framework for efficient online interleaving evaluation, and a sequential testing approach for online search evaluation. The experiments in this thesis are based on massive real-life datasets obtained from Yandex, a leading commercial search engine, and demonstrate the potential of the proposed approaches to improve the efficiency of the evaluation pipeline.
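To make the interleaving step described above concrete, here is a minimal sketch of classic Team Draft interleaving, the base scheme that Generalised Team Draft extends with learned interleaving policies and click weights. The function names and the uniform click scoring are illustrative assumptions, not the thesis's actual implementation.

```python
import random

def team_draft_interleave(ranking_a, ranking_b, rng=random):
    """Classic Team Draft: teams A and B alternately pick their
    highest-ranked result not yet shown; a coin flip per round
    decides which team picks first."""
    interleaved, teams = [], []
    seen = set()
    total = len(set(ranking_a) | set(ranking_b))
    while len(seen) < total:
        order = [("A", ranking_a), ("B", ranking_b)]
        if rng.random() < 0.5:
            order.reverse()
        for team, ranking in order:
            for doc in ranking:
                if doc not in seen:
                    interleaved.append(doc)
                    teams.append(team)
                    seen.add(doc)
                    break
    return interleaved, teams

def credit_clicks(teams, clicked_positions):
    """Uniform click scoring: each click credits the team that
    contributed the clicked slot (Generalised Team Draft instead
    learns per-click weights from historical data)."""
    wins = {"A": 0, "B": 0}
    for pos in clicked_positions:
        wins[teams[pos]] += 1
    return wins
```

The ranking whose team collects more click credit over many impressions is declared the winner of the experiment.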
Abstract:
Macro and micro-economic perspectives are combined in an economic growth model. An agent-based modeling approach is used to develop an overlapping-generations framework where endogenous growth is supported by workers who decide to study depending on their relative (skilled and unskilled) individual satisfaction. The micro perspective is based on individual satisfaction: a utility function computed from the variation of relative income in both space and time. The macro perspective emerges from micro decisions and, as in other growth models of this type, concerns an important allocative social decision: the share of the working population that is engaged in producing ideas (skilled workers). Simulations show that production and satisfaction levels are higher when the evolution of income measured in both space and time is equally weighted.
Abstract:
The main objective of physics-based modeling of power converter components is to design the whole converter with respect to physical and operational constraints. Therefore, all the elements and components of the energy conversion system are modeled numerically and combined to obtain a behavioral model of the whole system. Previously proposed high-frequency (HF) models of power converters are based on circuit models that are only related to the parasitic inner parameters of the power devices and the connections between the components. This dissertation aims to obtain appropriate physics-based models for power conversion systems which can not only represent the steady-state behavior of the components, but also predict their high-frequency characteristics. The developed physics-based model represents the physical device with a high level of accuracy in predicting its operating condition. The proposed physics-based model enables us to accurately develop components such as effective EMI filters, switching algorithms and circuit topologies [7]. One application of the developed modeling technique is the design of new sets of topologies for high-frequency, high-efficiency converters for variable speed drives. The main advantage of the modeling method presented in this dissertation is the practical design of an inverter for high-power applications with the ability to overcome the blocking voltage limitations of available power semiconductor devices. Another advantage is the selection of the best matching topology, with an inherent reduction of switching losses, which can be utilized to improve the overall efficiency. The physics-based modeling approach in this dissertation makes it possible to design any power electronic conversion system to meet electromagnetic standards and design constraints.
This includes physical characteristics such as decreasing the size and weight of the package, optimized interactions with neighboring components, and higher power density. In addition, the electromagnetic behaviors and signatures can be evaluated, including the study of conducted and radiated EMI interactions as well as the design of attenuation measures and enclosures.
Abstract:
Cities are small-scale complex socio-ecological systems that host around 60% of the world population. Ecosystem Services (ES) provided by urban ecosystems offer multiple benefits necessary to cope with present and future urban challenges. These ES include microclimate regulation and runoff control, as well as opportunities for mental and physical recreation, affecting citizens’ health and wellbeing. Balancing urban development, land-take containment, climate adaptation and the availability of Urban Green Areas and their related benefits can improve the quality of life of the inhabitants, the economic performance of the city, and social justice and cohesion. This work starts by analysing the current literature on Ecosystem Services (ES), Green and Blue Infrastructure (GBI) and Nature-based Solutions (NBS) and their integration within current European and international sustainability policies. Then, the thesis focuses on the role of ES, GBI and NBS in urban sustainability and resilience, setting the basis for the core methodological and conceptual approach of this work. The developed ES-based conceptual approach provides guidance on how to map and assess ES, to better inform policy making and to give ES their proper value within the urban context. The proposed interdisciplinary approach navigates the topic of mapping and assessing ES benefits in terms of regulating services, with a focus on climate mitigation and adaptation, and cultural services, to enhance wellbeing and justice in urban areas. Lastly, this thesis proposes a trans-disciplinary and participatory approach to build resilience over time around all relevant urban ES. The two case studies presented in this dissertation, the city of Bologna and the city of Barcelona, have been used to implement, tailor and test the proposed conceptual framework, yielding valuable inputs for planning, policy and science.
Abstract:
Context. Cluster properties can be more distinctly studied in pairs of clusters, where we expect the effects of interactions to be strong. Aims. We here discuss the properties of the double cluster Abell 1758 at a redshift z ≈ 0.279. These clusters show strong evidence for merging. Methods. We analyse the optical properties of the North and South clusters of Abell 1758 based on deep imaging obtained with the Canada-France-Hawaii Telescope (CFHT) archive Megaprime/Megacam camera in the g' and r' bands, covering a total region of about 1.05 × 1.16 deg², or 16.1 × 17.6 Mpc². Our X-ray analysis is based on archive XMM-Newton images. Numerical simulations were performed using an N-body algorithm to treat the dark-matter component, a semi-analytical galaxy-formation model for the evolution of the galaxies, and a grid-based hydrodynamic code with a piecewise parabolic method (PPM) scheme for the dynamics of the intra-cluster medium. We computed galaxy luminosity functions (GLFs) and 2D temperature and metallicity maps of the X-ray gas, which we then compared to the results of our numerical simulations. Results. The GLFs of Abell 1758 North are well fit by Schechter functions in the g' and r' bands, but with a small excess of bright galaxies, particularly in the r' band; their faint-end slopes are similar in both bands. In contrast, the GLFs of Abell 1758 South are not well fit by Schechter functions: excesses of bright galaxies are seen in both bands; the faint end of the GLF is not very well defined in g'. The GLF computed from our numerical simulations assuming a halo mass-luminosity relation agrees with those derived from the observations. From the X-ray analysis, the most striking features are structures in the metal distribution. We found two elongated regions of high metallicity in Abell 1758 North with two peaks towards the centre. In contrast, Abell 1758 South shows a deficit of metals in its central regions.
Comparing observational results to those derived from numerical simulations, we could mimic the most prominent features present in the metallicity map and propose an explanation for the dynamical history of the cluster. We found in particular that in the metal-rich elongated regions of the North cluster, winds had been more efficient than ram-pressure stripping in transporting metal-enriched gas to the outskirts. Conclusions. We confirm the merging structure of the North and South clusters, both at optical and X-ray wavelengths.
Abstract:
This paper develops an interactive approach for exploratory spatial data analysis. Measures of attribute similarity and spatial proximity are combined in a clustering model to support the identification of patterns in spatial information. Relationships between the developed clustering approach, spatial data mining and choropleth display are discussed. Analysis of property crime rates in Brisbane, Australia is presented. A surprising finding in this research is that there are substantial inconsistencies in standard choropleth display options found in two widely used commercial geographical information systems, both in terms of definition and performance. The comparative results demonstrate the usefulness and appeal of the developed approach in a geographical information system environment for exploratory spatial data analysis.
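The clustering model above combines attribute similarity with spatial proximity; one plausible realisation is sketched below, mixing the two distances with a weight inside single-linkage agglomeration. The weight alpha, the point encoding (x, y, attribute value) and the linkage choice are assumptions for illustration, not the paper's actual model.

```python
import math

def combined_distance(p, q, alpha=0.5):
    """Weighted mix of attribute dissimilarity and spatial distance.
    p and q are (x, y, value) tuples; alpha trades attribute
    similarity against spatial proximity."""
    spatial = math.hypot(p[0] - q[0], p[1] - q[1])
    attribute = abs(p[2] - q[2])
    return alpha * attribute + (1 - alpha) * spatial

def single_link_clusters(points, k, alpha=0.5):
    """Agglomerative single-linkage on the combined distance,
    merging the closest pair of clusters until k remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = min(combined_distance(points[i], points[j], alpha)
                        for i in clusters[a] for j in clusters[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        clusters[a] += clusters.pop(b)
    return clusters
```

With alpha near 1 the grouping follows attribute values (as in a choropleth class scheme); with alpha near 0 it degenerates to purely spatial regionalisation.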
Abstract:
Urine is an ideal source of materials in the search for potential disease-related biomarkers, as it is produced by the affected tissues and can be easily obtained by noninvasive methods. A 2-DE-based proteomic approach was used to better understand the molecular mechanisms of injury induced by fluoride (F(-)) and to define potential biomarkers of dental fluorosis. Three groups of weanling male Wistar rats were treated with drinking water containing 0 (control), 5, or 50 ppm F(-) for 60 days (n = 15/group). During the experimental period, the animals were kept individually in metabolic cages to analyze water and food consumption, as well as fecal and urinary F(-) excretion. Urinary proteome profiles were examined using 2-DE and Colloidal Coomassie Brilliant Blue staining. A dose-response relationship between F(-) intake and excretion was detected. Quantitative intensity analysis revealed 8, 11, and 8 significantly altered proteins between the control vs. 5 ppm F(-), control vs. 50 ppm F(-), and 5 ppm F(-) vs. 50 ppm F(-) groups, respectively. Two proteins regulated by androgens (androgen-regulated 20-kDa protein and alpha-2u-globulin) and one related to detoxification (aflatoxin-B1 aldehyde reductase) were identified by MALDI-TOF-TOF MS/MS. Thus, proteomic analysis can help to better understand the mechanisms underlying F(-) toxicity, even at low doses. (c) 2010 Wiley Periodicals, Inc. J Biochem Mol Toxicol 25:8-14, 2011. View this article online at wileyonlinelibrary.com. DOI 10.1002/jbt.20353
Abstract:
Abstract: The Murray-Darling Basin comprises over 1 million km²; it lies within four states and one territory, and over 12,800 GL of irrigation water is used to produce over 40% of the nation's gross value of agricultural production. This production comes from a diverse collection of sometimes mutually exclusive commodities (e.g. pasture, stone fruit, grapes, cotton and field crops). The supply of water for irrigation is subject to climatic and policy uncertainty. Variable inflows mean that water property rights do not provide a guaranteed supply. With increasing public scrutiny and environmental issues facing irrigators, greater pressure is being placed on this finite resource. The uncertainty of the water supply and of water quality (salinity), combined with where water is utilised while attempting to maximise return on investment, makes for an interesting research field. A GAMS- and Excel-based modelling approach has been used to ask: where should we allocate water? Amongst what commodities? And how does this affect both the quantity and the quality of water along the Murray-Darling river system?
Abstract:
Time-dependent wavepacket evolution techniques demand the action of the propagator, exp(-iHt/ħ), on a suitable initial wavepacket. When a complex absorbing potential is added to the Hamiltonian to combat unwanted reflection effects, polynomial expansions of the propagator are selected for their ability to cope with non-Hermiticity. An efficient subspace implementation of the Newton polynomial expansion scheme that requires fewer dense matrix-vector multiplications than its grid-based counterpart has been devised. Performance improvements are illustrated with some benchmark one- and two-dimensional examples. (C) 2001 Elsevier Science B.V. All rights reserved.
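The key idea above, expanding the propagator as a polynomial applied through matrix-vector products only, can be sketched as follows. For brevity this uses a plain Taylor polynomial of exp(-iHt) rather than the paper's Newton interpolation polynomial (whose interpolation points are chosen to cover the complex spectrum introduced by the absorbing potential); atomic units with ħ = 1 are assumed.

```python
def apply_propagator_taylor(matvec, psi, t, order=30):
    """Approximate exp(-i H t) |psi> by a truncated polynomial built
    purely from matrix-vector products H|v>, the same primitive that
    grid-based and subspace expansion schemes rely on.
    Uses the recurrence term_k = (-i t / k) * H * term_{k-1}."""
    result = list(psi)   # accumulates the partial sum, starting at term_0 = psi
    term = list(psi)
    for k in range(1, order + 1):
        hv = matvec(term)
        term = [(-1j * t / k) * x for x in hv]
        result = [r + x for r, x in zip(result, term)]
    return result
```

For a two-level system with H equal to the Pauli-X matrix, exp(-iHt) = cos(t) I - i sin(t) X, which the truncated series reproduces to machine precision for moderate t.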
Abstract:
The outcome of dendritic cell (DC) presentation of Ag to T cells via the TCR/MHC synapse is determined by second signaling through CD80/86 and, importantly, by ligation of costimulatory ligands and receptors located at the DC and T cell surfaces. Downstream signaling triggered by costimulatory molecule ligation results in reciprocal DC and T cell activation and survival, which predisposes to enhanced T cell-mediated immune responses. In this study, we used adenoviral vectors to express a model tumor Ag (the E7 oncoprotein of human papillomavirus 16) with or without coexpression of receptor activator of NF-kappaB (RANK)/RANK ligand (RANKL) or CD40/CD40L costimulatory molecules, and used these transgenic DCs to immunize mice for the generation of E7-directed CD8(+) T cell responses. We show that coexpression of RANK/RANKL, but not CD40/CD40L, in E7-expressing DCs augmented E7-specific IFN-gamma-secreting effector and memory T cells and E7-specific CTLs. These responses were also augmented by coexpression of T cell costimulatory molecules (RANKL and CD40L) or DC costimulatory molecules (RANK and CD40) in the E7-expressing DC immunogens. Augmentation of CTL responses correlated with up-regulation of CD80 and CD86 expression in DCs transduced with costimulatory molecules, suggesting a mechanism for enhanced T cell activation/survival. These results have generic implications for improved tumor Ag-expressing DC vaccines, and specific implications for a DC-based vaccine approach for human papillomavirus 16-associated cervical carcinoma.
Abstract:
Competency-based management has been pointed out as an alternative to the management models traditionally used by organisations. It aims to direct efforts towards planning, attracting, developing and assessing, at the different levels of the organisation, the competencies needed to achieve its objectives. One of the main stages of this process is the so-called competency mapping. This article aims to present methods, techniques and instruments used for competency mapping in public and private organisations. To this end, the literature on the concept of competency, competency mapping and competency-based management is reviewed, discussing its assumptions and applications. Finally, the implications of this management model for the public sector are raised and practical recommendations are presented.
Abstract:
This project aims to provide a service platform for managing and accounting for remunerated time, through the recording of working hours, holidays and absences (with or without justification). It is intended to provide reports based on this information, along with automatic data analysis, for example detecting excessive absences and overlapping holidays among workers. The emphasis of the project is on providing an architecture that facilitates the inclusion of these features. The project is implemented on the Google App Engine (GAE) platform, in order to provide a solution under the Software as a Service paradigm, with guaranteed availability and data replication. The platform was chosen after an analysis of the main existing cloud platforms: Google App Engine, Windows Azure and Amazon Web Services. The characteristics of each platform were analysed, namely the programming models, the data models provided, the existing services and their respective costs. The choice was made based on the platforms' characteristics at the start date of this project. The solution is structured in layers, with the following components: platform interface, business logic and data access logic. The interface provided is designed following REST architectural principles, supporting data in JSON and XML formats. An authorisation component, supported by Spring Security, was added to this base architecture, with authentication delegated to the Google Accounts services. To allow decoupling between the various layers, the Dependency Injection pattern was used; this pattern reduces the dependency on the technologies used in each layer. A prototype was implemented to demonstrate the work carried out, allowing interaction with the implemented service features via AJAX requests.
This prototype took advantage of several JavaScript libraries and patterns that simplified its implementation, such as model-view-viewmodel through data binding. To support the development of the project, an agile development approach based on Scrum was adopted in order to implement the system requirements, expressed as user stories. To ensure the quality of the service implementation, unit tests were carried out, with the functionality analysed beforehand and documentation subsequently produced using UML diagrams.
Abstract:
With the growing generation, storage and dissemination of information in recent years, the former problem of lack of information has turned into a problem of extracting useful knowledge from the available information. Visual representations of abstract information have been used to aid the interpretation of data and to reveal otherwise hidden patterns. Information visualisation seeks to amplify human cognition by exploiting human visual capabilities, so as to make abstract information perceptible, providing the means for a human to absorb growing amounts of information through his or her perceptual capabilities. The goal of data clustering techniques is the division of a dataset into several groups, in which similar data are placed in the same group and dissimilar data in different groups. More specifically, constrained clustering aims to incorporate a priori knowledge into the clustering process, with the goal of increasing the quality of the clustering and, simultaneously, finding solutions suited to specific tasks and interests. This dissertation studies the Interactive Visual Clustering approach, which allows the user, by interacting with a visual representation of the information, to incorporate his or her prior knowledge about the data domain, so as to steer the resulting clustering towards his or her goals. This approach combines and extends techniques from interactive information visualisation, force-directed graph drawing and constrained clustering. In order to evaluate the performance of different user-interaction strategies, comparative studies are carried out using synthetic and real datasets.
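Constrained clustering of the kind described above typically encodes the user's prior knowledge as must-link and cannot-link pairs. A minimal sketch of the must-link half is below, using union-find to collapse constrained points into units that a clustering algorithm then treats atomically; this is a common pre-processing step, not necessarily the dissertation's own method.

```python
def must_link_components(n, must_links):
    """Union-find over must-link constraint pairs: points that are
    transitively must-linked end up in the same component, which a
    downstream clustering algorithm treats as a single unit."""
    parent = list(range(n))

    def find(x):
        # path-halving lookup of the component representative
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for a, b in must_links:
        parent[find(a)] = find(b)   # union the two components

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())
```

Cannot-link constraints are harder: they are usually checked at assignment time, rejecting any cluster that already contains a forbidden partner.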
Abstract:
Background Information: The incorporation of distance learning activities by institutions of higher education is considered an important contribution to creating new opportunities for teaching in both initial and continuing training. In Medicine and Nursing, papers illustrating the adaptation of technological components and teaching methods are prolific; however, in the Pharmaceutical Education area, examples are scarce. In that sense, this project demonstrates the implementation and assessment of a b-learning strategy for Therapeutics using a “case-based learning” approach. Setting: Academic Pharmacy. Methods: This is an exploratory study involving 2nd-year students of the Pharmacy Degree at the School of Allied Health Sciences of Oporto. The study population consists of 61 students, divided into groups of 3-4 elements. The b-learning model was implemented over a period of 8 weeks. Results: A b-learning environment and digital learning objects were successfully created and implemented. Collaboration and assessment techniques were carefully developed to ensure the active participation and fair assessment of all students. Moodle records show consistent activity by students during the assignments. E-portfolios were also developed using Wikispaces, which promoted reflective writing and clinical reasoning. Conclusions: Our exploratory study suggests that the “case-based learning” method can be successfully combined with technological components to create and maintain a feasible online learning environment for the teaching of therapeutics.
Abstract:
The influence of uncertainties in input parameters on the output response of composite structures is investigated in this paper. In particular, the effects of deviations in mechanical properties, ply angles, ply thickness and applied loads are studied. The uncertainty propagation and the importance measures of input parameters are analysed using three different approaches: a first-order local method, a Global Sensitivity Analysis (GSA) supported by a variance-based method, and an extension of local variance to estimate the global variance over the domain of inputs. Sample results are shown for a shell composite laminated structure built with different composite systems, including multi-materials. The importance measures of input parameters on the structural response, based on numerical results, are established and discussed as a function of the anisotropy of the composite materials. The need for global variance methods is discussed by comparing the results obtained from the different proposed methodologies. The objective of this paper is to contribute to the use of GSA techniques together with inexpensive local importance measures.
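The variance-based GSA mentioned above ranks inputs by first-order Sobol' indices, S_i = Var(E[Y|X_i]) / Var(Y). A brute-force Monte Carlo sketch for independent uniform inputs is below; the double-loop estimator and sample sizes are purely illustrative (production GSA uses far more efficient estimators), not the paper's procedure.

```python
import random
import statistics

def first_order_sobol(model, dims, n_outer=100, n_inner=100, seed=0):
    """Brute-force first-order Sobol' indices S_i = Var(E[Y|X_i]) / Var(Y)
    for a model with independent U(0,1) inputs. The inner loop estimates
    the conditional mean E[Y | X_i = xi]; the outer loop takes the
    variance of those means across sampled xi values."""
    rng = random.Random(seed)
    # crude Monte Carlo estimate of the total output variance Var(Y)
    ys = [model([rng.random() for _ in range(dims)])
          for _ in range(n_outer * n_inner)]
    var_y = statistics.pvariance(ys)
    indices = []
    for i in range(dims):
        cond_means = []
        for _ in range(n_outer):
            xi = rng.random()  # fix the i-th input at a sampled value
            inner = [model([xi if d == i else rng.random() for d in range(dims)])
                     for _ in range(n_inner)]
            cond_means.append(statistics.fmean(inner))
        indices.append(statistics.pvariance(cond_means) / var_y)
    return indices
```

For an additive model such as y = x0 + 2*x1, the estimator recovers the analytical shares (roughly 0.2 and 0.8), confirming that the second input dominates the output variance.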