931 results for Longest path
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
Precision medicine is an emerging approach to disease treatment and prevention that considers variability in patient genes, environment, and lifestyle. However, little has been written about how such research impacts emergency care. Recent advances in analytical techniques have made it possible to characterize patients in a more comprehensive and sophisticated fashion at the molecular level, promising highly individualized diagnosis and treatment. Among these techniques are various systematic molecular phenotyping analyses (e.g., genomics, transcriptomics, proteomics, and metabolomics). Although a number of emergency physicians use such techniques in their research, widespread discussion of these approaches has been lacking in the emergency care literature and many emergency physicians may be unfamiliar with them. In this article, we briefly review the underpinnings of such studies, note how they already impact acute care, discuss areas in which they might soon be applied, and identify challenges in translation to the emergency department (ED). While such techniques hold much promise, it is unclear whether the obstacles to translating their findings to the ED will be overcome in the near future. Such obstacles include validation, cost, turnaround time, user interface, decision support, standardization, and adoption by end-users.
Abstract:
The Central American Free Trade Agreement (CAFTA) has been a mixed blessing for economic development. While exports to the US economy have increased, dependency may hinder economic growth if countries do not diversify or upgrade before temporary provisions expire. This article evaluates the impact of the temporary Tariff Preference Levels (TPLs) granted to Nicaragua under CAFTA and the consequences of TPL expiration. Using trade statistics, country- and firm-level data from Nicaragua’s National Free Zones Commission (CNZF) and data from field research, we estimate Nicaragua’s apparel sector will contract as much as 30–40% after TPLs expire. Our analysis underscores how rules of origin and firm nationality affect where and how companies do business, and in so doing, often constrain sustainable export growth.
Abstract:
Free energy calculations are a computational method for determining thermodynamic quantities, such as free energies of binding, via simulation. Currently, due to computational and algorithmic limitations, free energy calculations are limited in scope. In this work, we propose two methods for improving the efficiency of free energy calculations. First, we expand the state space of alchemical intermediates, and show that this expansion enables us to calculate free energies along lower variance paths. We use Q-learning, a reinforcement learning technique, to discover and optimize paths at low computational cost. Second, we reduce the cost of sampling along a given path by using sequential Monte Carlo samplers. We develop a new free energy estimator, pCrooks (pairwise Crooks), a variant on the Crooks fluctuation theorem (CFT), which enables decomposition of the variance of the free energy estimate for discrete paths, while retaining beneficial characteristics of CFT. Combining these two advancements, we show that for some test models, optimal expanded-space paths have a nearly 80% reduction in variance relative to the standard path. Additionally, our free energy estimator converges at a more consistent rate and on average 1.8 times faster when we enable path searching, even when the cost of path discovery and refinement is considered.
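For orientation only, a schematic statement of the relation such estimators build on (the notation is ours, not the thesis's): the Crooks fluctuation theorem relates the forward and reverse work distributions for switching between two end states, and a pairwise scheme like pCrooks can be read as applying that relation to each adjacent pair of intermediates along a discrete alchemical path, so that the total free energy, and (for approximately independent segments) its variance, decompose segment by segment:

\[
  \frac{P_F(W)}{P_R(-W)} = e^{\beta\,(W - \Delta F)}
\]
\[
  \Delta F_{0N} = \sum_{i=0}^{N-1} \Delta F_{i,\,i+1},
  \qquad
  \operatorname{Var}\!\big(\widehat{\Delta F}_{0N}\big) \;\approx\; \sum_{i=0}^{N-1} \operatorname{Var}\!\big(\widehat{\Delta F}_{i,\,i+1}\big)
\]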
Abstract:
Thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
This paper is based on the novel use of a very high fidelity decimation filter chain for electrocardiogram (ECG) signal acquisition and data conversion. The multiplier-free, multi-stage structure of the proposed filters lowers power dissipation while minimizing circuit area, both crucial design constraints for wireless, noninvasive, wearable health monitoring products given the scarce operational resources available for their electronic implementation. The presented filter has a decimation ratio of 128 and works in tandem with a 1-bit 3rd-order Sigma Delta (ΣΔ) modulator, achieving 0.04 dB passband ripple and -74 dB stopband attenuation. The work reported here investigates the non-linear phase effects of the proposed decimation filters on the ECG signal through a comparative study after phase correction. It concludes that enhanced phase linearity is not crucial for ECG acquisition and data conversion, since the distortion of the acquired signal due to phase non-linearity is insignificant for both the original and the phase-compensated filters. To the best of the authors' knowledge, freedom from signal distortion is essential, as distortion might lead to misdiagnosis, as noted in the state of the art. This article demonstrates that, with their minimal power consumption and minimal signal distortion, the proposed decimation filters can be employed effectively in biosignal data processing units.
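As a rough illustration of the multiplier-free, multi-stage decimation idea only (a generic cascaded integrator-comb decimator, not the authors' specific filter chain; the function name and default parameters below are ours), a decimation ratio such as 128 can be realized with nothing but adders, delays and a downsampler:

import numpy as np

def cic_decimate(x, R=128, N=3, M=1):
    """Generic N-stage CIC decimator with ratio R and differential delay M.
    Multiplier-free in hardware (adders, delays, downsampler); the final
    scaling by the DC gain (R*M)**N is done here only for convenience."""
    y = np.asarray(x, dtype=np.int64)
    for _ in range(N):                      # cascaded integrators at the input rate
        y = np.cumsum(y)
    y = y[::R]                              # decimate by R
    for _ in range(N):                      # cascaded combs at the output rate
        y = y - np.concatenate((np.zeros(M, dtype=y.dtype), y[:-M]))
    return y / float((R * M) ** N)

Fed with a 1-bit ΣΔ bitstream, a structure of this kind is typically followed by compensation or half-band stages; the specific ripple and attenuation figures quoted above depend on the authors' actual multi-stage design.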
Abstract:
The response of the Gulf Stream (GS) system to atmospheric forcing is generally linked either to the basin-scale winds on the subtropical gyre or to the buoyancy forcing from the Labrador Sea. This study presents a multiscale synergistic perspective to describe the low-frequency response of the GS system. The authors identify dominant temporal variability in the North Atlantic Oscillation (NAO), in known indices of the GS path, and in the observed GS latitudes along its path derived from sea surface height (SSH) contours over the period 1993-2013. The analysis suggests that the signature of interannual variability changes along the stream's path from 75 degrees to 55 degrees W. From its separation at Cape Hatteras to the west of 65 degrees W, the variability of the GS is mainly in the near-decadal (7-10 years) band, which is missing to the east of 60 degrees W, where a new interannual (4-5 years) band peaks. The latter peak (4-5 years) was missing to the west of 65 degrees W. The region between 65 degrees and 60 degrees W seems to be a transition region. A 2-3-yr secondary peak was pervasive in all time series, including that for the NAO. This multiscale response of the GS system is supported by results from a basin-scale North Atlantic model. The near-decadal response can be attributed to similar forcing periods in the NAO signal; however, the interannual variability of 4-5 years in the eastern segment of the GS path is as yet unexplained. More numerical and observational studies are warranted to understand such causality.
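As an illustrative sketch only, and not the authors' analysis (which works from SSH-derived GS latitudes along the path and may use more careful multiscale methods; the function name below is hypothetical), dominant interannual bands in a monthly index could be screened with a simple periodogram:

import numpy as np
from scipy.signal import periodogram

def dominant_periods(index, fs=12.0, top=3):
    """Return the 'top' strongest spectral periods, in years, of a monthly
    time series (fs = 12 samples per year), after removing the mean."""
    x = np.asarray(index, dtype=float)
    f, pxx = periodogram(x - x.mean(), fs=fs)
    f, pxx = f[1:], pxx[1:]        # drop the zero-frequency bin
    strongest = np.argsort(pxx)[::-1][:top]
    return 1.0 / f[strongest]      # periods in years

Applied to the NAO index and to GS latitude series west and east of 65 degrees W, such a screening would be a first step toward the 7-10-yr, 4-5-yr and 2-3-yr bands discussed above.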
Abstract:
This research paper presents a five-step algorithm to generate tool paths for machining Free form / Irregular Contoured Surface(s) (FICS) by adopting the STEP-NC (AP-238) format. In the first step, a parametrized CAD model with FICS is created or imported in the UG-NX6.0 CAD package. The second step recognizes the features and calculates a Closeness Index (CI) by comparing them with B-Spline / Bezier surfaces. The third step utilizes the CI and extracts the necessary data to formulate the blending functions for the identified features. In the fourth step, Z-level 5-axis tool paths are generated using flat and ball end mill cutters. Finally, in the fifth step, the tool paths are integrated with the STEP-NC format and validated. All these steps are discussed and explained through a validated industrial component.
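For reference, and as standard definitions rather than the paper's own formulation, the blending functions of a Bezier surface are the Bernstein polynomials that weight its control points P_ij:

\[
  B_{i,n}(u) = \binom{n}{i}\, u^{i} (1-u)^{\,n-i},
  \qquad
  S(u,v) = \sum_{i=0}^{n}\sum_{j=0}^{m} B_{i,n}(u)\, B_{j,m}(v)\, \mathbf{P}_{ij},
  \qquad u,v \in [0,1]
\]

A closeness index of the kind described above can then be read as a measure of how well a recognized FICS patch is approximated by such a parametric surface.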
Abstract:
This research paper presents work on feature recognition, tool path data generation and integration with STEP-NC (AP-238 format) for features having Free form / Irregular Contoured Surface(s) (FICS). Initially, the FICS features are modelled / imported in the UG CAD package and a closeness index is generated by comparing the FICS features with basic B-Spline / Bezier curves / surfaces. Blending functions are then calculated by adopting the convolution theorem. Based on the blending functions, contour offset tool paths are generated and simulated for a 5-axis milling environment. Finally, the tool path (CL) data is integrated with the STEP-NC (AP-238) format. The tool path algorithm and the STEP-NC data are tested on various industrial parts through an automated UFUNC plugin.
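Again as a standard textbook relation rather than the paper's own derivation, a contour offset tool path for a ball end mill of radius r places the cutter location (CL) point on the surface offset along the unit normal:

\[
  \mathbf{CL}(u,v) = \mathbf{S}(u,v) + r\,\mathbf{n}(u,v),
  \qquad
  \mathbf{n}(u,v) = \frac{\mathbf{S}_u \times \mathbf{S}_v}{\lVert \mathbf{S}_u \times \mathbf{S}_v \rVert}
\]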
Abstract:
Otto-von-Guericke-Universität Magdeburg, Faculty of Mechanical Engineering, Dissertation, 2016
Abstract:
The power of computer game technology is currently being harnessed to produce “serious games”. These “games” are targeted at the education and training marketplace, and employ key game-engine components such as the graphics and physics engines to produce realistic “digital-world” simulations of the real “physical world”. Many approaches are driven by the technology and often lack consideration of a firm pedagogical underpinning. The authors believe that analysis and deployment of the technological and pedagogical dimensions should occur together, with the pedagogical dimension providing the lead. This chapter explores the relationship between these two dimensions: how “pedagogy may inform the use of technology” and how various learning theories may be mapped onto the affordances of computer game engines. Autonomous and collaborative learning approaches are discussed. The design of a serious game is broken down into spatial and temporal elements. The spatial dimension is related to theories of knowledge structures, especially “concept maps”. The temporal dimension is related to “experiential learning”, especially the approach of Kolb. The multi-player aspect of serious games is related to theories of “collaborative learning”, discussed in terms of “discourse” versus “dialogue”. Several general guiding principles are explored, such as the use of “metaphor” (including metaphors of space, embodiment, systems thinking, the internet and emergence). The topological design of a serious game is also highlighted. The discussion of pedagogy is related to various serious games we have recently produced and researched, and is presented in the hope of informing the “serious game community”.
Abstract:
The work of cataloging and digitizing the Historical Archive of the Prelature of Humahuaca presents us with a documentary mass that has remained almost unused for historical research. For organizational reasons, consultation of this documentary heritage by researchers had been limited. The development of the “Documenta” project will allow us to know the contents of that archive and bring these documents closer for consultation and scientific production.
Abstract:
This study seeks to understand how the physiological constraints of diving may change on a daily and seasonal basis. Dive data were obtained from southern elephant seals (Mirounga leonina) from South Georgia using satellite relay data loggers. We analysed the longest (95th percentile) dive durations as proxies for physiological dive limits. A strong, significant relationship existed between the duration of these dives and the time of day and week of year in which they were performed. The depth of the deepest dives also showed a significant, but far less consistent, relationship with local time of day and season. Changes in the duration of the longest dives occurred irrespective of their depth. Dives were longest in the morning (04:00-12:00 h) and shortest in the evening (16:00-00:00 h). The size of the fluctuation varied among animals from 4.0 to 20.0 min. The daily pattern in dive depth was phase-shifted in relation to the diurnal rhythm in dive duration. Dives were deeper at midday and shallower around midnight. Greater daily changes in duration occurred in seals feeding in the open ocean than in those foraging on the continental shelf. The seasonal peak in the duration of the longest dives coincided with austral midwinter. The size of the increase in dive duration from autumn/spring to winter ranged from 11.5 to 30.0 min. Changes in depth of the longest dives were not consistently associated with particular times of year. The substantial diurnal and seasonal fluctuations in maximum dive duration may be a result of changes in the physiological capacity to remain submerged, in addition to temporal changes in the ecological constraints on dive behaviour. We speculate about the role of melatonin as a hormonal mediator of diving capability.
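A minimal sketch of the central summary statistic, under assumed column names (the authors' treatment of time-of-day and week-of-year effects is statistically more involved): the 95th percentile of dive duration per local hour of day as a proxy for the physiological limit:

import pandas as pd

def longest_dives_by_hour(dives: pd.DataFrame) -> pd.Series:
    """95th-percentile dive duration (minutes) for each local hour of day.
    Expects columns 'start_local' (datetime) and 'duration_min' (float)."""
    hour = dives["start_local"].dt.hour
    return dives.groupby(hour)["duration_min"].quantile(0.95)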