882 results for Branch and bound algorithms
Abstract:
* The research is supported in part by the INTAS 04-77-7173 project, http://www.intas.be
Abstract:
This paper continues the author's team's research on the development, implementation, and experimentation of a task-oriented environment for teaching and learning algorithms. This environment is part of a large-scale environment for course teaching in different domains. The paper deals only with the UML project for the teaching team's side of the environment. The implementation of the project ideas is demonstrated on a Windows-based prototype of the environment.
Abstract:
We present quasi-Monte Carlo analogs of Monte Carlo methods for some linear algebra problems: solving systems of linear equations, computing extreme eigenvalues, and matrix inversion. Reformulating the problems as solving integral equations with special kernels and domains permits us to analyze the quasi-Monte Carlo methods with bounds from numerical integration. Standard Monte Carlo methods for integration provide a convergence rate of O(N^(−1/2)) using N samples. Quasi-Monte Carlo methods use quasirandom sequences, with a resulting convergence rate for numerical integration as good as O((log N)^k N^(−1)). We have shown theoretically and through numerical tests that the use of quasirandom sequences improves both the magnitude of the error and the convergence rate of the considered Monte Carlo methods. We also analyze the complexity of the considered quasi-Monte Carlo algorithms and compare it to the complexity of the analogous Monte Carlo and deterministic algorithms.
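As a hedged illustration of the convergence claim (not the paper's own linear-algebra algorithms), the sketch below compares plain Monte Carlo against a base-2 van der Corput quasirandom sequence on the toy integral of x^2 over [0, 1], whose exact value is 1/3; the sequence choice, sample size, and seed are assumptions.

```python
import random

def van_der_corput(n, base=2):
    # Radical inverse of n: the digits of n in `base`, mirrored across the radix point.
    q, denom = 0.0, 1.0
    while n:
        denom *= base
        n, r = divmod(n, base)
        q += r / denom
    return q

def sample_mean(f, points):
    return sum(f(x) for x in points) / len(points)

f = lambda x: x * x                       # exact integral over [0, 1] is 1/3
N = 1024
qmc_est = sample_mean(f, [van_der_corput(i) for i in range(1, N + 1)])
rng = random.Random(0)                    # fixed seed so the comparison is repeatable
mc_est = sample_mean(f, [rng.random() for _ in range(N)])

qmc_err = abs(qmc_est - 1 / 3)
mc_err = abs(mc_est - 1 / 3)
```

With N = 1024 the quasirandom error comes out substantially below the O(N^(−1/2))-scale Monte Carlo error, consistent with the O((log N)^k N^(−1)) rate quoted above.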
Abstract:
This research evaluates pattern recognition techniques on a subclass of big data where the dimensionality of the input space (p) is much larger than the number of observations (n). Specifically, we evaluate massive gene expression microarray cancer data where the ratio κ is less than one. We explore the statistical and computational challenges inherent in these high dimensional low sample size (HDLSS) problems and present the statistical machine learning methods used to tackle and circumvent these difficulties. Regularization and kernel algorithms were explored in this research using seven datasets where κ < 1. These techniques require special attention to tuning, necessitating the investigation of several extensions of cross-validation to support better predictive performance. While no single algorithm was universally the best predictor, the regularization technique produced lower test errors in five of the seven datasets studied.
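As a generic illustration of the kind of tuning loop such methods rely on (not the dissertation's actual pipeline), here is a minimal k-fold cross-validation splitter of the sort used to select a regularization strength when n is small; the fold count and shuffling seed are assumptions.

```python
import random

def k_fold_indices(n, k, seed=0):
    # Shuffle the n sample indices, then deal them into k nearly equal folds.
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    # Yield (train, validation) index pairs, one per fold.
    for i, val in enumerate(folds):
        train = [j for f, fold in enumerate(folds) if f != i for j in fold]
        yield train, val
```

Each candidate regularization value would be fit on `train` and scored on `val`, averaging across folds; with HDLSS data the small validation folds are exactly why the extensions of cross-validation mentioned above become necessary.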
Abstract:
In 1972 the ionized cluster beam (ICB) deposition technique was introduced as a new method for thin film deposition. At that time the use of clusters was postulated to enhance film nucleation and adatom surface mobility, resulting in high quality films. Although a few researchers reported singly ionized clusters containing 10^2-10^3 atoms, others were unable to repeat their work. The consensus now is that film effects in the early investigations were due to self-ion bombardment rather than clusters. Subsequently, in recent work (early 1992), synthesis of large clusters of zinc without the use of a carrier gas was demonstrated by Gspann and repeated in our laboratory. Clusters resulted from very significant changes in two source parameters: crucible pressure was increased from the earlier 2 Torr to several thousand Torr, and a converging-diverging nozzle 18 mm long and 0.4 mm in diameter at the throat was used in place of the 1 mm x 1 mm nozzle used in the early work. While this is practical for zinc and other high vapor pressure materials, it remains impractical for many materials of industrial interest such as gold, silver, and aluminum. The work presented here describes results using gold and silver at pressures of around 1 and 50 Torr in order to study the effect of the pressure and nozzle shape. Significant numbers of large clusters were not detected. Deposited films were studied by atomic force microscopy (AFM) for roughness analysis, and by X-ray diffraction. Nanometer-size islands of zinc deposited on flat silicon substrates by ICB were also studied by atomic force microscopy, and the number of atoms/cm^2 was calculated and compared to data from Rutherford backscattering spectrometry (RBS). To improve the agreement between data from AFM and RBS, convolution and deconvolution algorithms were implemented to study and simulate the interaction between tip and sample in atomic force microscopy.
The deconvolution algorithm takes into account the physical volume occupied by the tip, resulting in an image that is a more accurate representation of the surface. One method increasingly used to study the deposited films, both during and after growth, is ellipsometry. Ellipsometry is a surface analytical technique used to determine the optical properties and thickness of thin films; in situ measurements can be made through the windows of a deposition chamber. A method was developed for determining the optical properties of a film that is sensitive only to the growing film and accommodates underlying interfacial layers, multiple unknown underlayers, and other unknown substrates. This method is carried out by making an initial ellipsometry measurement well past the real interface and by defining a virtual interface in the vicinity of this measurement.
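The tip-sample interaction described above is commonly modeled with gray-scale morphology: the recorded image is (approximately) the dilation of the surface by the tip, and eroding the image by the same tip yields a surface estimate that accounts for the tip's physical volume. The 1-D sketch below illustrates that standard model; it is not the dissertation's actual implementation, and the tip profile is an assumption.

```python
def dilate(surface, tip):
    # Simulated AFM image: gray-scale dilation of the surface by the tip.
    # tip[j] holds the (non-positive) tip height at offset j from the apex.
    half, n = len(tip) // 2, len(surface)
    image = []
    for i in range(n):
        best = float("-inf")
        for j, t in enumerate(tip):
            k = i + j - half
            if 0 <= k < n:
                best = max(best, surface[k] + t)
        image.append(best)
    return image

def erode(image, tip):
    # Tip-corrected surface estimate: gray-scale erosion of the image by the tip.
    half, n = len(tip) // 2, len(image)
    estimate = []
    for i in range(n):
        best = float("inf")
        for j, t in enumerate(tip):
            k = i + j - half
            if 0 <= k < n:
                best = min(best, image[k] - t)
        estimate.append(best)
    return estimate
```

The eroded estimate is an upper bound on the true surface and equals it wherever the tip apex actually touched, which is what allows feature volumes from AFM to be reconciled with RBS atom counts.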
Abstract:
With the recent explosion in the complexity and amount of digital multimedia data, there has been a huge impact on the operations of various organizations in distinct areas such as government services, education, medical care, business, and entertainment. To satisfy the growing demand for multimedia data management systems, an integrated framework called DIMUSE is proposed and deployed for distributed multimedia applications, offering a full scope of multimedia-related tools and providing appealing experiences for the users. This research mainly focuses on video database modeling and retrieval by addressing a set of core challenges. First, a comprehensive multimedia database modeling mechanism called Hierarchical Markov Model Mediator (HMMM) is proposed to model high dimensional media data including video objects, low-level visual/audio features, and historical access patterns and frequencies. The associated retrieval and ranking algorithms are designed to support not only general queries but also complicated temporal event pattern queries. Second, system training and learning methodologies are incorporated such that user interests are mined efficiently to improve retrieval performance. Third, video clustering techniques are proposed to continuously increase the searching speed and accuracy by architecting a more efficient multimedia database structure. A distributed video management and retrieval system is designed and implemented to demonstrate the overall performance. The proposed approach is further customized for a mobile-based video retrieval system to address the perception subjectivity issue by considering individual users' profiles. Moreover, to deal with security and privacy issues and concerns in distributed multimedia applications, DIMUSE also incorporates a practical framework called SMARXO, which supports multilevel multimedia security control.
SMARXO efficiently combines role-based access control (RBAC), XML, and object-relational database management systems (ORDBMS) to achieve proficient security control. A distributed multimedia management system named DMMManager (Distributed MultiMedia Manager) is developed within the proposed framework DIMUSE to support multimedia capturing, analysis, retrieval, authoring, and presentation in a single framework.
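As a generic sketch of the role-based access control idea SMARXO builds on (not SMARXO's actual implementation; the roles, users, and permissions here are invented for illustration):

```python
# Minimal RBAC: users are assigned roles, roles are granted permissions, and a
# request is allowed only if some role of the user grants the (action, resource
# type) pair -- users never hold permissions directly.
ROLE_PERMS = {
    "viewer": {("view", "video")},
    "editor": {("view", "video"), ("edit", "video")},
    "admin":  {("view", "video"), ("edit", "video"), ("delete", "video")},
}
USER_ROLES = {"alice": {"editor"}, "bob": {"viewer"}}

def allowed(user, action, resource_type):
    # Grant access if any of the user's roles carries the requested permission.
    return any((action, resource_type) in ROLE_PERMS.get(role, set())
               for role in USER_ROLES.get(user, set()))
```

In a multilevel scheme such as the one described, the role and permission tables would live in the ORDBMS and the policies would be expressed in XML; the check itself stays this simple.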
Abstract:
The applications of micro-end-milling operations have increased recently. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural network based data mapping and forecasting processes, and genetic algorithm based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from the experimental cutting force data, and genetic algorithms with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles. The performance of MOGART has been tested on data from over 800 experimental cases, and very good agreement has been observed between the theoretical and experimental results. The MOGART package has been applied to a micro-end-milling operation study for the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.
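As a hedged, generic illustration of the genetic-algorithm component (not MOGART's actual routines, and with an invented one-dimensional objective standing in for the cutting-force model fit):

```python
import random

def ga_minimize(f, lo, hi, pop_size=30, gens=60, seed=1):
    # Toy real-coded GA: tournament selection, blend crossover, Gaussian
    # mutation, and elitism (the current best individual always survives).
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)
        nxt = pop[:1]                                 # elitism
        while len(nxt) < pop_size:
            a = min(rng.sample(pop, 3), key=f)        # tournament parent 1
            b = min(rng.sample(pop, 3), key=f)        # tournament parent 2
            w = rng.random()
            child = w * a + (1 - w) * b               # blend crossover
            child += rng.gauss(0, 0.1 * (hi - lo))    # Gaussian mutation
            nxt.append(min(max(child, lo), hi))       # clamp to the search box
        pop = nxt
    return min(pop, key=f)

# Invented stand-in objective: pretend the model-fit error is minimized at x = 3.
best = ga_minimize(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
```

In a MOGART-like setting the objective would instead measure the mismatch between the analytical cutting force model and the measured force profile, with wear, run-out, and cutting-condition parameters as the genome.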
Abstract:
This dissertation poses a set of six questions about one of the Israel Lobby's particular components, a potential Christian-Jewish coalition (PCJc) within American politics that advocates for Israeli sovereignty over "Judea and Samaria" ("the West Bank"). The study addresses: the profiles of the individuals in the PCJc; its policy positions, the issues that have divided it, and what has prevented, and continues to prevent, the coalition from being absorbed into one or more of the more formally organized components of the Israel Lobby; the resources and methods this coalition has used to attempt to influence U.S. policy on (a) the Middle East and (b) the Arab-Israeli conflict in particular; the successes or failures of this coalition's advocacy and why it has not organized; and what this case reveals about interest group politics and social movements in the United States. This dissertation follows the descriptive-analytic case-study tradition, comprising a detailed analysis of a specific interest group and one policy issue, which conforms to my interest in the potential Christian-Jewish coalition that supports a Jewish Judea and Samaria. I have employed participant observation, interviewing, content analysis, and documentary research. The findings suggest the following: The PCJc consists of Christian Zionists and mostly Jews of the center religious denominations. Orthodox Jewish traditions of separation from Christians inhibit like-minded Christians and Jews from organizing. The PCJc opposes an Arab state in Judea and Samaria, and it has not been absorbed into the more formally organized interest groups that support that policy. The PCJc's resources consist of support and funding from conservatives; its methods include education, debates, and the media. Members of the PCJc are successful in that they persist in their support for a Jewish Judea and Samaria and meet through other organizations around Judeo-Christian values.
The PCJc is deterred from advocacy and organization by a mobilization of bias from a subgovernment in Washington, D.C. comprising Congress, the Executive branch, and lobby organizations. The study's results raise questions about interest group politics in America and the degree to which the U.S. political system is pluralistic, suggesting that executive power constrains the agenda to "safe" positions it favors.
Abstract:
Intriguing lattice dynamics has been predicted for aperiodic crystals that contain incommensurate substructures. Here we report inelastic neutron scattering measurements of phonon and magnon dispersions in Sr14Cu24O41, which contains incommensurate one-dimensional (1D) chain and two-dimensional (2D) ladder substructures. Two distinct acoustic phonon-like modes, corresponding to the sliding motion of one sublattice against the other, are observed for atomic motions polarized along the incommensurate axis. In the long wavelength limit, it is found that the sliding mode shows a remarkably small energy gap of 1.7-1.9 meV, indicating very weak interactions between the two incommensurate sublattices. The measurements also reveal a gapped and steep linear magnon dispersion of the ladder sublattice. The high group velocity of this magnon branch and weak coupling with acoustic phonons can explain the large magnon thermal conductivity in Sr14Cu24O41 crystals. In addition, the magnon specific heat is determined from the measured total specific heat and phonon density of states, and exhibits a Schottky anomaly due to gapped magnon modes of the spin chains. These findings offer new insights into the phonon and magnon dynamics and thermal transport properties of incommensurate magnetic crystals that contain low-dimensional substructures.
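For reference, the Schottky anomaly mentioned above has, for a generic two-level system with gap Δ, the textbook form C(T) = k_B (Δ/k_BT)^2 e^{Δ/k_BT} / (1 + e^{Δ/k_BT})^2, which peaks near k_BT ≈ Δ/2.4. The sketch below evaluates this generic shape; it is not a fit to the Sr14Cu24O41 data, and the unit gap is an assumption.

```python
import math

def schottky_c(t, gap=1.0):
    # Two-level Schottky specific heat per particle, in units of k_B;
    # t and gap are in the same (arbitrary) energy units.
    x = gap / t
    e = math.exp(x)
    return x * x * e / (1.0 + e) ** 2

temps = [0.05 * i for i in range(1, 101)]   # T from 0.05 to 5.0 (units of gap)
vals = [schottky_c(t) for t in temps]
t_peak = temps[vals.index(max(vals))]       # broad maximum near T ~ gap / 2.4
```

The characteristic broad bump vanishing at both low and high temperature is the signature used to identify the gapped chain-magnon contribution in the measured specific heat.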
Abstract:
Free and "bound" long-chain alkenones (C37:2 and C37:3) in oxidized and unoxidized sections of four organic-matter-rich Pliocene and Miocene Madeira Abyssal Plain turbidites (one from Ocean Drilling Program site 951B and three from site 952A) were analyzed to determine the effect of severe post-depositional oxidation on the value of Uk'37. The profiles of both alkenones across the redox boundary show preferential degradation of the C37:3 compound compared to the C37:2 compound. Because of the high initial Uk'37 values and the way Uk'37 is calculated, this degradation hardly influences the Uk'37 profiles. However, for lower Uk'37 values, the measured selective degradation would increase Uk'37 by up to 0.17 units, equivalent to 5°C. For most of the Uk'37 band-width, much smaller degradation already increases Uk'37 beyond the analytical error (0.017 units). Consequently, when interpreting the Uk'37 record in terms of past sea surface temperatures, selective degradation needs serious consideration.
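The unsaturation index in question is Uk'37 = [C37:2] / ([C37:2] + [C37:3]). The toy numbers below (assumed for illustration, not the paper's data) show how preferential loss of C37:3 inflates the index:

```python
def uk37(c372, c373):
    # Alkenone unsaturation index: fraction of the di-unsaturated compound.
    return c372 / (c372 + c373)

before = uk37(1.0, 1.0)              # equal abundances -> index of 0.5
# Assumed selective degradation across the redox front: C37:2 retains 50%
# of its initial abundance, while the more labile C37:3 retains only 30%.
after = uk37(1.0 * 0.50, 1.0 * 0.30)
shift = after - before               # apparent warming with no real SST change
```

Even these modest illustrative losses shift the index by far more than the 0.017-unit analytical error quoted above, which is the paper's central caution.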
Abstract:
During the transport of wood from the forest to the mills, many unforeseen events can occur that disrupt the planned trips (for example, weather conditions, forest fires, the arrival of new loads, etc.). When such events become known only during a trip, the truck making that trip must be diverted to an alternative route. Without information about such a route, the driver is likely to choose an alternative route that is unnecessarily long or, worse, one that is itself "closed" because of an unforeseen event. It is therefore essential to provide drivers with real-time information, in particular suggestions of alternative routes when a planned road turns out to be impassable. The recourse options in case of unforeseen events depend on the characteristics of the supply chain under study, such as the presence of self-loading trucks and the transport management policy. We present three articles dealing with different application contexts, together with models and solution methods adapted to each context. In the first article, the truck drivers have the entire weekly plan for the current week. In this context, every effort must be made to minimize the changes made to the initial plan. Although the truck fleet is homogeneous, there is a priority order among the drivers: those with higher priority receive the larger workloads, and minimizing the changes to their plans is also a priority. Since the consequences of unforeseen events on the transportation plan are essentially cancellations and/or delays of certain trips, the proposed approach first handles the cancellation and delay of a single trip and is then generalized to handle more complex events.
In this approach, we try to reschedule the affected trips within the same week so that a loader is free when the truck arrives both at the forest site and at the mill. In this way, the trips of the other trucks are not modified. This approach provides dispatchers with alternative plans within a few seconds. Better solutions could be obtained if the dispatcher were allowed to make more changes to the initial plan. In the second article, we consider a context where only one trip at a time is communicated to the drivers: the dispatcher waits until a driver finishes his trip before revealing the next one. This context is more flexible and offers more recourse options in case of unforeseen events. Moreover, the weekly problem can be divided into daily problems, since demand is daily and the mills are open for limited periods during the day. We use a mathematical programming model based on a space-time network to react to disruptions. Although disruptions can have different effects on the initial transportation plan, a key feature of the proposed model is that it remains valid for handling any unforeseen event, whatever its nature. Indeed, the impact of these events is captured in the space-time network and in the input parameters rather than in the model itself. The model is re-solved for the current day each time an unforeseen event is revealed. In the last article, the truck fleet is heterogeneous and includes trucks with on-board loaders. The route structure for these trucks differs from that of regular trucks, since they do not need to be synchronized with the loaders. We use a mathematical model whose columns can be easily and naturally interpreted as truck routes.
We solve this model using column generation. First, we relax the integrality of the decision variables and consider only a subset of the feasible routes. Routes with the potential to improve the current solution are added to the model iteratively. A space-time network is used both to represent the impact of unforeseen events and to generate these routes. The solution obtained is generally fractional, and a branch-and-price algorithm is used to find integer solutions. Several disruption scenarios were developed to test the proposed approach on case studies from the Canadian forestry industry, and numerical results are presented for the three contexts.
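A minimal sketch of the space-time network idea used here: nodes are (location, period) pairs, a road closure is expressed purely in the arc data, and the routing logic itself is untouched, mirroring the claim that the model stays valid for any disruption. The locations, travel times, and horizon below are invented for illustration.

```python
import heapq

def build_arcs(roads, horizon):
    # roads: {(a, b): travel time in periods}. Returns time-expanded arcs.
    arcs = {}
    locations = {loc for pair in roads for loc in pair}
    for (a, b), d in roads.items():
        for t in range(horizon - d + 1):
            arcs.setdefault((a, t), []).append(((b, t + d), d))
    for loc in locations:                       # waiting in place is allowed
        for t in range(horizon - 1):
            arcs.setdefault((loc, t), []).append(((loc, t + 1), 1))
    return arcs

def earliest_arrival(arcs, start, goal):
    # Dijkstra over (location, period) nodes; returns first period at `goal`.
    dist, pq = {start: 0}, [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node[0] == goal:
            return node[1]
        if d > dist.get(node, float("inf")):
            continue                            # stale queue entry
        for nxt, w in arcs.get(node, []):
            if d + w < dist.get(nxt, float("inf")):
                dist[nxt] = d + w
                heapq.heappush(pq, (d + w, nxt))
    return None

roads = {("forest", "junction"): 1, ("junction", "mill"): 1,
         ("forest", "detour"): 2, ("detour", "mill"): 2}
planned = earliest_arrival(build_arcs(roads, 8), ("forest", 0), "mill")
del roads[("junction", "mill")]     # unforeseen closure: only the data changes
rerouted = earliest_arrival(build_arcs(roads, 8), ("forest", 0), "mill")
```

The same network would also feed the pricing step of the column generation: each route (column) is a path in this graph, so closed arcs simply stop generating the affected routes.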
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The objective of this dissertation is to explore a more accurate and versatile approach to investigating the neutralization of spores subjected to ultrafast heating and biocide-based stresses, and further to explore and understand novel methods for supplying ultrafast heating and biocides through nanostructured energetic materials. A surface heating method was developed to apply accurate (±25 °C), high-heating-rate thermal energy (200-800 °C, ~10^3-10^5 °C/s). Uniform attachment of bacterial spores onto fine wires in liquids was achieved electrophoretically; the spores could be quantitatively detached into suspension for enumeration. Spore inactivation increased with temperature and heating rate and fit a sigmoid response. The neutralization mechanisms of peak temperature and heating rate were correlated with DNA damage at ~10^4 °C/s, and with coat rupture by ultrafast vapor pressurization inside the spores at ~10^5 °C/s. Humidity was found to have a synergistic effect with rapid heating and chlorine gas on neutralization efficiency. The primary neutralization mechanism of Cl2 and rapid heat is proposed to be chlorine reacting with the spore surface. The stress-kill correlation above provides guidance for exploring new biocidal thermites and for probing mechanisms. Results show that nano-Al/K2S2O8 released more gas at a lower temperature and generated a higher maximum pressure than the other nano-Al/oxysalt thermites. Given that this thermite formulation generates a similar amount of SO2 as O2, it can be considered a potential candidate for energetic biocidal applications. The reaction mechanisms of persulfate- and other oxysalt-containing thermites can be divided into two groups: the more reactive thermites (e.g., Al/K2S2O8), which generate ~10× higher pressure and ~10× shorter burn times and ignite via a solid-gas Al/O2 reaction, and the less reactive thermites (e.g., Al/K2SO4), which follow a condensed-phase Al/O reaction mechanism.
These different ignition mechanisms were further re-evaluated by investigating the roles of free and bound oxygen. A constant critical reaction rate for ignition was found that is independent of ignition temperature, heating rate, and free vs. bound oxygen.
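A constant critical rate is easy to picture through the standard Arrhenius form k(T) = A·exp(−Ea/RT): a fixed k_crit pins down a single ignition temperature T_ign = Ea / (R·ln(A/k_crit)), however that temperature is reached. The parameter values below are invented for illustration, not taken from the dissertation.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def arrhenius(T, A, Ea):
    # Reaction rate constant at absolute temperature T (K).
    return A * math.exp(-Ea / (R * T))

def ignition_temp(A, Ea, k_crit):
    # Temperature at which the Arrhenius rate first reaches k_crit.
    return Ea / (R * math.log(A / k_crit))

# Assumed pre-exponential factor (1/s), activation energy (J/mol), and
# critical rate (1/s) -- hypothetical numbers for the sketch only.
A, Ea, k_crit = 1e9, 120e3, 1.0
T_ign = ignition_temp(A, Ea, k_crit)
```

Because T_ign depends only on A, Ea, and k_crit, a rate-based criterion is naturally independent of heating rate, consistent with the finding quoted above.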
Abstract:
A simple but efficient voice activity detector based on the Hilbert transform and a dynamic threshold is presented for use in the pre-processing of audio signals. The algorithm that defines the dynamic threshold is a modification of a convex combination found in the literature. This scheme allows the detection of prosodic and silence segments in speech in the presence of non-ideal conditions such as spectrally overlapped noise. The present work shows preliminary results on a database built from political speeches. The tests were performed by adding artificial noise as well as natural noises to the audio signals, and several algorithms are compared. The results will be extrapolated to the field of adaptive filtering of monophonic signals and the analysis of speech pathologies in future work.
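A hedged sketch of the general approach (not the authors' exact algorithm): compute the Hilbert envelope via the analytic signal, then flag samples whose envelope exceeds a threshold formed as a convex combination of two envelope statistics. The minimum/maximum statistics and the weight used here are simplified assumptions standing in for the paper's adaptive rule.

```python
import cmath
import math

def analytic_signal(x):
    # Analytic signal by DFT: zero negative frequencies, double positive ones.
    # Deliberately O(n^2) -- a sketch, not a production FFT.
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    h = [0.0] * n
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    for k in range(1, (n + 1) // 2):
        h[k] = 2.0
    return [sum(X[k] * h[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

def vad_flags(x, lam=0.8):
    # Hilbert envelope, then threshold = lam * min + (1 - lam) * max:
    # a convex combination of a noise-floor and a peak statistic.
    env = [abs(z) for z in analytic_signal(x)]
    thr = lam * min(env) + (1 - lam) * max(env)
    return [e > thr for e in env]
```

The envelope of a pure tone is constant, which makes the envelope a robust activity measure even when the noise spectrum overlaps the speech band; the threshold weight lam trades missed speech against false alarms.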