25 results for SHORTCUTS


Relevance:

20.00%

Publisher:

Abstract:

We investigate particle production in a toroidally compactified space-time due to the expansion of a Friedmann cosmological model in ℝ³ × S¹ outside a U(1) local cosmic string. The case of a Friedmann space-time where torsion is incorporated in the connection is also investigated. We present a generalization to toroidal compactification of p extra dimensions, where the topology is given by ℝ³ × Tᵖ. Some implications are presented and discussed. Besides the dynamics of space-time, we investigate in detail the physical consequences of the topological transformations. © World Scientific Publishing Company.

Relevance:

20.00%

Publisher:

Abstract:

123 p.

Relevance:

10.00%

Publisher:

Abstract:

In this work we consider the evolution of a massive scalar field in cylindrically symmetric space-times. Quasinormal modes have been calculated for static and rotating cosmic cylinders, and we found unstable modes in some cases. Rotating as well as static cosmic strings, i.e., those without regular interior solutions, do not display quasinormal oscillation modes. We conclude that rotating cosmic cylinder space-times that present closed timelike curves are unstable against scalar perturbations.

Relevance:

10.00%

Publisher:

Abstract:

This report, developed within the scope of a six-month curricular internship at the company ASL & Associados, has as its objective the elaboration and development of acoustic studies of buildings and the preparation of energy certificates for existing buildings. First, the host company is described, presenting its mission, values and portfolio, which covers design, design review, energy certification, acoustic testing, and the management and supervision of construction projects. The activities carried out during the internship comprise the energy certification of existing buildings under the new thermal regulation (REH), presenting the calculation simplifications applied to existing buildings and two specific examples: an autonomous unit and a single-family building. In the field of acoustics, three buildings were studied: a residential building, a mixed-use building and a services building. For the acoustic conditioning of the case studies, the calculation methods used at the company are applied, together with a method for assessing and classifying acoustic quality developed by the Laboratório Nacional de Engenharia Civil (LNEC). Finally, reflections on the experience are presented, along with considerations on the results obtained in view of the objectives proposed for the internship.

Relevance:

10.00%

Publisher:

Abstract:

We describe methods for the fast production of highly coherent-spin-squeezed many-body states in bosonic Josephson junctions. We start from the known mapping of the two-site Bose-Hubbard (BH) Hamiltonian to that of a single effective particle evolving according to a Schrödinger-like equation in Fock space. Since, for repulsive interactions, the effective potential in Fock space is nearly parabolic, we extend recently derived protocols for shortcuts to adiabatic evolution in harmonic potentials to the many-body BH Hamiltonian. A comparison with current experiments shows that our methods allow for an important reduction in the preparation times of highly squeezed spin states.
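For readers unfamiliar with the harmonic-potential protocols the abstract builds on, the following is a minimal numerical sketch of invariant-based inverse engineering for a single harmonic trap, assuming the standard Ermakov-equation construction; the frequencies and duration are illustrative, and this is the single-particle recipe, not the paper's many-body Bose-Hubbard extension.

```python
import numpy as np

# Invariant-based shortcut to adiabaticity for a harmonic trap (sketch).
# The scaling factor b(t) obeys the Ermakov equation
#   b'' + w(t)^2 b = w0^2 / b^3,
# so choosing b(t) freely, with boundary conditions enforcing
# stationarity at t = 0 and t = tf, fixes the required frequency ramp
#   w(t)^2 = w0^2 / b^4 - b'' / b.

w0, wf = 2 * np.pi * 250.0, 2 * np.pi * 2.5  # illustrative trap frequencies (rad/s)
tf = 2e-3                                    # illustrative ramp duration (s)
gamma = np.sqrt(w0 / wf)                     # target scaling factor b(tf)

t = np.linspace(0.0, tf, 1001)
s = t / tf

# Minimal polynomial with b(0)=1, b(tf)=gamma and b'=b''=0 at both ends.
b = 1 + (gamma - 1) * (10 * s**3 - 15 * s**4 + 6 * s**5)
b_dd = (gamma - 1) * (60 * s - 180 * s**2 + 120 * s**3) / tf**2  # b''(t)

w_sq = w0**2 / b**4 - b_dd / b  # squared trap frequency along the ramp

# w_sq may transiently go negative: the trap briefly becomes an
# expelling potential, which these protocols allow.
print("min w(t)^2 =", w_sq.min(), " final w/wf =", np.sqrt(w_sq[-1]) / wf)
```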

Relevance:

10.00%

Publisher:

Abstract:

Whether from an urbanistic, a social, or a governance point of view, the evolution of cities is a major challenge for our contemporary societies. By making it possible to analyse existing spatial and social configurations, or by attempting to simulate future ones, geographic information systems have become indispensable in urban planning and management. In five years the population of the city of Lausanne has grown from 134,700 to 140,570 inhabitants, while public-school enrolment has increased from 12,200 to 13,500 students. This demographic growth, combined with a wide-ranging harmonisation of compulsory schooling in Switzerland, led the Service des écoles to set up and develop, in collaboration with the University of Lausanne, GIS solutions capable of addressing various spatial issues. Established in 1989, the school district boundaries (catchment areas) had to be redefined to fit the realities of a rapidly changing urban and political landscape. In a context of mobility and sustainability, a public-transport subsidy scheme based on the home-school distance and on the age of the pupils was designed. Carrying out these projects required the construction of geographic databases as well as the development of the new analysis methods presented in this work; the thesis thus proceeded through a constant dialogue between theoretical research and practical needs. The first part of this work focuses on the analysis of the city's pedestrian network. The morphology of the network is investigated through multi-scale approaches to the concept of centrality. The first conception, straightness centrality, stipulates that being central means being connected to others in a straight line. The second, no doubt more intuitive, is closeness centrality, and expresses the fact that being central means being close to others (fig. 1, II). The methods developed aim to evaluate the connectivity and walkability of the network while suggesting possible improvements (the creation of pedestrian shortcuts). The third and final theoretical section presents and develops a regularised optimal-transport algorithm. By minimising home-school distances while respecting school capacities, the algorithm produces student-allocation scenarios. The implementation of Lagrange multipliers offers a visualisation of the "spatial cost" of the school infrastructure and of the pupils' places of residence. The second part of the thesis retraces the main aspects of three projects carried out in the context of school management: the design of a public-transport subsidy scheme, the redrawing of the school map, and the simulation of the flows of pupils walking to school.
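The thesis's own regularised optimal-transport algorithm is not reproduced here, but a standard entropic-regularisation (Sinkhorn) sketch conveys the core idea of allocating students to schools while respecting capacities; the toy coordinates, the parameter values, and the function name sinkhorn_allocation are all hypothetical.

```python
import numpy as np

def sinkhorn_allocation(dist, capacity, eps=0.5, n_iter=500):
    """Entropy-regularised optimal transport of students to schools.

    dist     : (n_students, n_schools) home-school distance matrix
    capacity : (n_schools,) seats per school, summing to n_students
    Returns a soft assignment whose rows sum to 1 (one seat per student)
    and whose columns sum to the school capacities.
    """
    n, m = dist.shape
    a = np.ones(n)               # each student carries unit mass
    b = capacity.astype(float)   # column marginals = school capacities
    K = np.exp(-dist / eps)      # Gibbs kernel
    u, v = np.ones(n), np.ones(m)
    for _ in range(n_iter):      # Sinkhorn fixed-point iterations;
        u = a / (K @ v)          # eps*log(u), eps*log(v) play the role
        v = b / (K.T @ u)        # of the Lagrange multipliers above
    return u[:, None] * K * v[None, :]

# Toy example: 6 students, 2 schools with 3 seats each (made-up data).
rng = np.random.default_rng(0)
homes = rng.uniform(0, 10, size=(6, 2))
schools = np.array([[2.0, 2.0], [8.0, 8.0]])
dist = np.linalg.norm(homes[:, None, :] - schools[None, :, :], axis=2)
plan = sinkhorn_allocation(dist, capacity=np.array([3, 3]))
print(plan.round(2))    # soft allocation scenario
print(plan.sum(axis=0)) # ~[3, 3]: capacities respected
```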

Relevance:

10.00%

Publisher:

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages these methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have received the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a wide variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, or how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions for cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternatives. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
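To make the "regularized pairwise least-squares loss" concrete, here is a minimal linear-case sketch in the spirit of RankRLS, including the matrix-algebra shortcut that collapses the sum over pairs into a single quadratic form; the toy data and the helper name fit_pairwise_ls are ours, not the thesis code.

```python
import numpy as np

# Linear pairwise least-squares ranking (sketch). For each pair (i, j)
# we penalise the squared difference between the predicted and the true
# score gap:
#   L(w) = sum_{i<j} ((x_i - x_j) @ w - (y_i - y_j))**2 + lam * ||w||^2

def fit_pairwise_ls(X, y, lam=1.0):
    n, d = X.shape
    # Matrix-algebra shortcut: with the centering matrix L = n*I - 11^T,
    # the sum over pairs equals (Xw - y)^T L (Xw - y), so the minimiser
    # solves a single d x d linear system instead of O(n^2) pair terms.
    L = n * np.eye(n) - np.ones((n, n))
    A = X.T @ L @ X + lam * np.eye(d)
    b = X.T @ L @ y
    return np.linalg.solve(A, b)

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=200)

w = fit_pairwise_ls(X, y)
scores = X @ w
# Fraction of correctly ordered pairs (pairwise ranking accuracy).
dt = y[:, None] - y[None, :]
dp = scores[:, None] - scores[None, :]
acc = np.mean(np.sign(dp[dt != 0]) == np.sign(dt[dt != 0]))
print("pairwise accuracy:", round(acc, 3))
```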

Relevance:

10.00%

Publisher:

Abstract:

The goal of this study was to explore and understand the definition of technical debt. Technical debt refers to a situation in software development where shortcuts or workarounds are taken in technical decisions. However, the original definition has since been applied to other parts of software development, and it is currently difficult to define technical debt precisely. We used a mapping study process as the research methodology to collect literature related to the research topic. We collected 159 papers that referred to the original definition of technical debt, retrieved from scientific literature databases during the search process. From these papers we extracted 107 definitions, which were split into keywords. The keyword map is one of the main results of this work. In addition, the resulting synonyms and the different types of technical debt were analyzed and added to the map as branches. Overall, 33 keywords or phrases, 6 synonyms and 17 types of technical debt were distinguished.

Relevance:

10.00%

Publisher:

Abstract:

Personalized medicine will revolutionize our capabilities to combat disease. Working toward this goal, a fundamental task is the deciphering of genetic variants that are predictive of complex diseases. Modern studies, in the form of genome-wide association studies (GWAS), have afforded researchers the opportunity to reveal new genotype-phenotype relationships through the extensive scanning of genetic variants. These studies typically contain over half a million genetic features for thousands of individuals. Examining this data with methods other than univariate statistics is a challenging task requiring advanced algorithms that are scalable to the genome-wide level. In the future, next-generation sequencing (NGS) studies will contain an even larger number of common and rare variants. Machine learning-based feature selection algorithms have been shown to be able to effectively create predictive models for various genotype-phenotype relationships. This work explores the problem of selecting genetic variant subsets that are the most predictive of complex disease phenotypes through various feature selection methodologies, including filter, wrapper and embedded algorithms. The examined machine learning algorithms were demonstrated not only to be effective at predicting the disease phenotypes, but also to do so efficiently through the use of computational shortcuts. While much of the work could be run on high-end desktops, some of it was extended so that it could be implemented on parallel computers, helping to ensure that the methods will also scale to NGS data sets. Further, these studies analyzed the relationships between various feature selection methods and demonstrated the need for careful testing when selecting an algorithm. It was shown that there is no universally optimal algorithm for variant selection in GWAS; rather, methodologies need to be selected based on the desired outcome, such as the number of features to be included in the prediction model. It was also demonstrated that without proper model validation, for example using nested cross-validation, the models can yield overly optimistic prediction accuracies and decreased generalization ability. It is through the implementation and application of machine learning methods that one can extract predictive genotype-phenotype relationships and biological insights from genetic data sets.
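As a concrete illustration of the nested cross-validation point, the following sketch keeps feature selection inside the cross-validation loop; the synthetic stand-in data and the scikit-learn toolchain are our choices, not necessarily the thesis's (real GWAS matrices would have hundreds of thousands of variant columns).

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.pipeline import Pipeline

# Synthetic stand-in for a variants-by-individuals matrix.
X, y = make_classification(n_samples=300, n_features=1000,
                           n_informative=10, random_state=0)

pipe = Pipeline([
    ("select", SelectKBest(f_classif)),           # univariate filter step
    ("clf", LogisticRegression(max_iter=1000)),
])
param_grid = {"select__k": [10, 50, 200]}

# Inner loop picks k; outer loop estimates generalization performance.
# Selecting features on the full data *before* cross-validating would
# leak label information and inflate the accuracy estimate.
inner = GridSearchCV(pipe, param_grid, cv=3)
scores = cross_val_score(inner, X, y, cv=5)
print("nested CV accuracy: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```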

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Extreme maternal morbidity is a term used to define any severe obstetric condition that threatens the mother's life and requires urgent medical intervention in order to prevent a probable maternal death. The present study aimed to evaluate the risk factors for extreme maternal morbidity (EMM) among pregnant women at the Hospital Universitario Mayor. Methodology: A case-control study was carried out, comparing patients with and without EMM in a 1:1 ratio. Simple random sampling covering 95% of the population was performed, with cases and controls matched by admission diagnosis. Results: A total of 110 patients were included (55 in each group); the two populations were comparable. Low socioeconomic status (p = 0.000), having had 2 or fewer deliveries (p = 0.000), Rh-negative blood type (p = 0.000), attending 0-3 prenatal check-ups (p = 0.000), and a history of preeclampsia (p = 0.000), hypothyroidism (p = 0.000) or bipolar disorder (p = 0.000) were significant risk factors for EMM. Protective factors included having had more than three deliveries, OR 0.60 (95% CI: 0.17-0.82, p = 0.00), and attending 7 or more prenatal check-ups, OR 0.23 (95% CI: 0.09-0.55, p = 0.000), results consistent with the literature. Discussion: It is important to disseminate the results of the present study to promote primary, secondary and tertiary prevention campaigns, in order to avoid the serious complications that can occur in women of childbearing age in our population.

Relevance:

10.00%

Publisher:

Abstract:

For many networks in nature, science and technology, it is possible to order the nodes so that most links are short-range, connecting near-neighbours, and relatively few long-range links, or shortcuts, are present. Given a network as a set of observed links (interactions), the task of finding an ordering of the nodes that reveals such a range-dependent structure is closely related to some sparse matrix reordering problems arising in scientific computation. The spectral, or Fiedler vector, approach for sparse matrix reordering has successfully been applied to biological data sets, revealing useful structures and subpatterns. In this work we argue that a periodic analogue of the standard reordering task is also highly relevant. Here, rather than encouraging nonzeros only to lie close to the diagonal of a suitably ordered adjacency matrix, we also allow them to inhabit the off-diagonal corners. Indeed, for the classic small-world model of Watts & Strogatz (1998, Collective dynamics of ‘small-world’ networks. Nature, 393, 440–442) this type of periodic structure is inherent. We therefore devise and test a new spectral algorithm for periodic reordering. By generalizing the range-dependent random graph class of Grindrod (2002, Range-dependent random graphs and their application to modeling large small-world proteome datasets. Phys. Rev. E, 66, 066702-1–066702-7) to the periodic case, we can also construct a computable likelihood ratio that suggests whether a given network is inherently linear or periodic. Tests on synthetic data show that the new algorithm can detect periodic structure, even in the presence of noise. Further experiments on real biological data sets then show that some networks are better regarded as periodic than linear. Hence, we find both qualitative (reordered networks plots) and quantitative (likelihood ratios) evidence of periodicity in biological networks.
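For readers unfamiliar with the spectral approach, here is a minimal sketch of the standard (linear) Fiedler-vector reordering on a synthetic chain-plus-shortcuts graph; the graph, the parameters, and the helper name fiedler_order are ours, and the paper's periodic algorithm itself is not reproduced.

```python
import numpy as np

def fiedler_order(A):
    """Order nodes by the eigenvector of the graph Laplacian belonging
    to the second-smallest eigenvalue (the Fiedler vector)."""
    L = np.diag(A.sum(axis=1)) - A     # combinatorial Laplacian
    vals, vecs = np.linalg.eigh(L)     # eigh: L is symmetric
    return np.argsort(vecs[:, 1])

# A path graph (mostly short-range links) plus a few long-range
# shortcuts, with node labels scrambled.
n = 40
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1
for i, j in [(3, 30), (10, 25)]:       # the shortcuts
    A[i, j] = A[j, i] = 1
perm = np.random.default_rng(2).permutation(n)
A_scrambled = A[np.ix_(perm, perm)]

# Reordering by the Fiedler vector pulls most nonzeros back toward the
# diagonal; the paper's periodic variant additionally allows nonzeros
# in the off-diagonal corners.
order = fiedler_order(A_scrambled)
A_reordered = A_scrambled[np.ix_(order, order)]
ii, jj = np.nonzero(A_reordered)
print(f"{np.mean(np.abs(ii - jj) <= 2):.0%} of links within bandwidth 2")
```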

Relevance:

10.00%

Publisher:

Abstract:

Millions of unconscious calculations are made daily by pedestrians walking through the Colby College campus. I used ArcGIS to make a predictive spatial model that chose paths similar to those that are actually used by people on a regular basis. To make a viable model of how most travelers choose their way, I considered both the distance required and the type of traveling surface. I used an iterative process to develop a scheme for weighting travel costs which resulted in accurate least-cost paths to be predicted by ArcMap. The accuracy was confirmed when the calculated routes were compared to satellite photography and were found to overlap well-worn “shortcuts” taken between the paved paths throughout campus.
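The study's ArcGIS workflow is not reproduced here, but a toy Dijkstra search over a cost raster illustrates the mechanics: whether the model predicts a "shortcut" across the grass depends entirely on the surface-weight ratio, which is exactly what the iterative calibration tunes. All weights and the grid below are made up.

```python
import heapq
import math
import numpy as np

def least_cost_path(cost, start, goal):
    """Dijkstra over an 8-connected cost raster; a step costs the mean
    of the two cells crossed, times sqrt(2) for diagonal moves."""
    rows, cols = cost.shape
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1),
             (-1, -1), (-1, 1), (1, -1), (1, 1)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue
        for dr, dc in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = 0.5 * (cost[r, c] + cost[nr, nc])
                if dr and dc:
                    step *= math.sqrt(2)
                if d + step < dist.get((nr, nc), math.inf):
                    dist[(nr, nc)] = d + step
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (d + step, (nr, nc)))
    path, node = [goal], goal
    while node != start:                # walk predecessors back to start
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical 6x6 campus patch: an L-shaped paved route along the top
# and right edges, grass everywhere else. Changing the grass weight
# flips the prediction between pavement and shortcut.
for grass in (2.5, 1.3):
    cost = np.full((6, 6), grass)
    cost[0, :] = 1.0
    cost[:, -1] = 1.0
    path = least_cost_path(cost, (0, 0), (5, 5))
    on_pavement = all(r == 0 or c == 5 for r, c in path)
    print(f"grass weight {grass}: "
          + ("stays on pavement" if on_pavement else "cuts across the grass"))
```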

Relevance:

10.00%

Publisher:

Abstract:

This work aims to study the decision-making of individuals of different nationalities who work in organizational project management, in their lives outside the professional sphere. Given that existing project-management methodologies call for a rational, logical and objective decision process, this study explores the extent to which organizational subjects carry this same linear decision process, originating in the professional world, over into their everyday lives. Over the years, academic studies have discussed this notion of rational, linear and logical decision-making and have been able to refute it with new perspectives on human cognitive judgment. Therefore, besides presenting the field of project management and its concepts, this work also covers the various theoretical developments regarding decision-making over time. Considering the subjective character of the decision theories presented, and the cognitive limitations that often constrain them, the study then explores the different judgment heuristics (simplifying strategies, mental shortcuts) and their respective cognitive biases. The three main meta-heuristics, set out by Tversky and Kahneman in their 1974 academic work and also the focus of this study, are, respectively: representativeness, availability, and anchoring and adjustment. A quantitative survey was conducted with organizational subjects who work in project management, or who have had some experience with a project at the companies where they work. The study is not limited to Brazil, extending to other countries with the same target population. The results revealed that professionals working in project management are subject to cognitive biases outside the organizational sphere, with Brazilians being the least prone to these biases in comparison with the other nationalities studied. It was also found that length of professional experience does not contribute significantly to more rational and logical decision-making in personal everyday life.

Relevance:

10.00%

Publisher:

Abstract:

Decision makers often use 'rules of thumb', or heuristics, to help them handle decision situations (Kahneman and Tversky, 1979b). These cognitive shortcuts are taken by the brain to cope with the complexity and time limitations of decisions, by reducing the burden of information processing (Hodgkinson et al, 1999; Newell and Simon, 1972). Although crucial for decision-making, heuristics come at the cost of occasionally sending us off course, that is, making us fall into judgment traps (Tversky and Kahneman, 1974). Over fifty years of psychological research has shown that heuristics can lead to systematic errors, or biases, in decision-making. This study focuses on two particularly impactful biases in decision-making: the overconfidence and confirmation biases. A specific group, top management school students and recent graduates, was subjected to classic experiments measuring their level of susceptibility to these biases. This population is bound to take up decision-making positions at companies, and eventually make decisions that will impact not only their companies but society at large. The results show that this population is strongly affected by overconfidence, but less so by the confirmation bias. No significant relationship between the level of susceptibility to the overconfidence bias and to the confirmation bias was found.