781 results for Process-dissociation Framework
Abstract:
We have studied the excitation and dissociation processes of the molecule W(CO)6 in collisions with low kinetic energy (3 keV) protons and singly charged fluorine and chlorine ions using double charge transfer spectroscopy. By analyzing the kinetic energy loss of the projectile anions, we measured the excitation energy distribution of the resulting transient dications W(CO)6(2+). By coincidence measurements between the anions and the stable W(CO)6(2+) ions or their fragments, we determined the energy distribution for each dissociation channel. Based on the experimental data, the emission of the first CO was tentatively attributed to a nonstatistical direct dissociation process, and the emission of the second and subsequent CO ligands to statistical dissociation processes. The dissociation energies for the successive breaking of W-CO bonds were estimated using a cascade model. The ratio between the charge-separation and evaporation channels (loss of CO+ and CO, respectively) was estimated to be 6% in the case of Cl+ impact. (C) 2011 American Institute of Physics. [doi: 10.1063/1.3523347]
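A minimal sketch of the kind of cascade (sequential-evaporation) model described above: a dication with excitation energy E sheds as many CO ligands as its energy allows, and the channel branching ratios follow from the excitation-energy distribution. The dissociation energies and the distribution below are illustrative placeholders, not the measured values:

    import numpy as np

    # Hypothetical successive W-CO dissociation energies of the dication (eV);
    # illustrative placeholders, not the values derived in the paper.
    D = np.array([1.5, 1.2, 1.0, 0.9, 0.8, 0.7])
    thresholds = np.cumsum(D)      # total energy needed to evaporate n ligands

    # Illustrative excitation-energy distribution of the transient dications
    rng = np.random.default_rng(0)
    E = rng.gamma(shape=3.0, scale=1.5, size=100_000)

    # Each dication evaporates as many CO ligands as its excitation energy allows
    n_lost = np.searchsorted(thresholds, E, side="right")
    fractions = np.bincount(n_lost, minlength=len(D) + 1) / len(E)
    for n, frac in enumerate(fractions):
        print(f"loss of {n} CO: {frac:.3f}")

In the actual analysis the logic runs in reverse: the D values are adjusted until the predicted branching ratios reproduce the measured channel populations.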
Abstract:
Ultrasonic absorption coefficients were measured for butylamine in heavy water (D2O) over the frequency range 0.8 to 220 MHz and at concentrations from 0.0278 to 2.5170 mol dm⁻³ at 25 °C; two kinds of relaxation processes were observed. One was found in relatively dilute solutions (up to 0.5 mol dm⁻³) and was attributed to the hydrolysis of butylamine. For comparison, absorption measurements were also carried out in light water (H2O). The rate and thermodynamic parameters were determined from the concentration dependence of the relaxation frequency and of the maximum absorption per wavelength. The isotope effects on the diffusion-controlled reaction were estimated, and the stability of the hydrolysis intermediate was considered by comparison with the results for propylamine in H2O and D2O. Another relaxation process was observed at concentrations greater than 1 mol dm⁻³ in D2O. To examine the solution characteristics, proton NMR measurements for butylamine were also carried out in D2O. The chemical shifts of the gamma- and delta-protons in the butylamine molecule indicate the existence of an aggregate. From the concentration dependence of the relaxation frequency and of the maximum absorption per wavelength, the source of the relaxation was attributed to an association-dissociation reaction, perhaps associated with a hydrophobic interaction. The aggregation number, the forward and reverse rate constants, and the standard volume change of the reaction were determined. It was concluded from a comparison with the results in H2O that the hydrophobic interaction of butylamine in D2O is stronger than that in H2O. The isotope effect on this reaction was also interpreted in terms of the solvent structure.
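For reference, the single-relaxation (Debye-type) absorption equation normally fitted to such data can be written in LaTeX as (A is the relaxation amplitude, B the background absorption, f_r the relaxation frequency and c the sound velocity):

    \frac{\alpha}{f^{2}} = \frac{A}{1+(f/f_{r})^{2}} + B,
    \qquad
    \mu_{\mathrm{max}} = \tfrac{1}{2}\, A \, c \, f_{r}

Here alpha is the absorption coefficient at frequency f, and mu_max is the maximum excess absorption per wavelength. The rate constants then follow from how f_r varies with concentration under the assumed reaction scheme.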
Abstract:
The multi-photon ionization of hydrogen-bonded pyridine-methanol clusters has been investigated using conventional and reflectron time-of-flight mass spectrometers (RTOF-MS) at laser wavelengths of 355 and 266 nm, respectively. The sequences of protonated cluster ions (CH3OH)nH+ and (C5H5N)n(CH3OH)mH+ (n = 1, 2) were observed at both laser wavelengths, while the sequence of cluster ions (CH3OH)n(H2O)H+ was observed only at 355 nm. The difference between the relative signal intensities of the protonated methanol cluster ions at the two wavelengths is attributed to different photoionization mechanisms. Some nascent cluster ions in metastable states dissociated during free flight to the detector; the dissociation kinetics is also discussed. (C) 2000 Elsevier Science B.V.
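A minimal sketch of the linear time-of-flight relation underlying such mass assignments, assuming idealised single-stage acceleration; the drift length and accelerating voltage are illustrative, not the instrument's actual parameters:

    import math

    AMU = 1.66053906660e-27     # kg
    E_CHARGE = 1.602176634e-19  # C

    def flight_time(m_amu, charge=1, drift_m=1.0, accel_V=4000.0):
        """Idealised TOF: t = L * sqrt(m / (2 q V)) after full acceleration."""
        m = m_amu * AMU
        return drift_m * math.sqrt(m / (2 * charge * E_CHARGE * accel_V))

    # Protonated methanol clusters (CH3OH)nH+, m = 32n + 1 amu
    for n in (1, 2, 3):
        t = flight_time(32 * n + 1)
        print(f"(CH3OH){n}H+  ~{t * 1e6:.2f} us")

Metastable ions that dissociate in the drift region keep (approximately) the parent velocity, which is why they appear at characteristic shifted positions in a reflectron spectrum.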
Abstract:
Natural and human-made disasters cause on average 120,000 deaths and over US$140 billion in damage to property and infrastructure every year, with national, regional and international actors consistently responding to the humanitarian imperative to alleviate suffering wherever it may be found. Despite various attempts to codify international disaster laws since the 1920s, a right to humanitarian assistance remains contested, reflecting concerns regarding the relative importance of state sovereignty vis-à-vis individual rights under international law. However, the evolving acquis humanitaire of binding and non-binding normative standards for responses to humanitarian crises highlights the increasing focus on rights and responsibilities applicable in disasters; although the International Law Commission has also noted the difficulty of identifying lex lata and lex ferenda regarding the protection of persons in the event of disasters due to the “amorphous state of the law relating to international disaster response.” Therefore, using the conceptual framework of transnational legal process, this thesis analyses the evolving normative frameworks and standards for rights-holders and duty-bearers in disasters. Determining the process whereby rights are created and evolve, and their potential internalisation into domestic law and policy, provides a powerful analytical framework for examining the progress and challenges of developing accountable responses to major disasters.
Abstract:
Existing work in Computer Science and Electronic Engineering demonstrates that Digital Signal Processing techniques can effectively identify the presence of stress in the speech signal. These techniques use datasets containing actual stress samples, i.e. real-life stress such as recordings of 911 calls. Studies that use simulated or laboratory-induced stress have been less successful and less consistent. Pervasive, ubiquitous computing is increasingly moving towards voice-activated and voice-controlled systems and devices, so speech recognition and speaker identification algorithms will have to improve and take emotional speech into account. Modelling the influence of stress on speech and voice is of interest to researchers from many disciplines, including security, telecommunications, psychology, speech science, forensics and Human-Computer Interaction (HCI). The aim of this work is to assess the impact of moderate stress on the speech signal, which requires a dataset of laboratory-induced stress. While attempting to build this dataset it became apparent that reliably inducing measurable stress in a controlled environment, when speech is a requirement, is a challenging task. This work therefore focuses on the use of a variety of stressors to elicit a stress response during tasks that involve speech content. Biosignal analysis (commercial brain-computer interfaces, eye tracking and skin resistance) is used to verify and quantify the stress response, if any. This thesis explains the basis of the author's hypotheses on the elicitation of affectively-toned speech and presents the results of several studies carried out throughout the PhD research period. These results show that the elicitation of stress, particularly the induction of affectively-toned speech, is not a simple matter and that many modulating factors influence the stress response process. A model is proposed to reflect the author's hypothesis on the emotional response pathways relating to the elicitation of stress with a required speech content. Finally, the author provides guidelines and recommendations for future research on speech under stress, identifies further research paths, and defines a roadmap for future work in this area.
Abstract:
The main objective of this thesis is a critical analysis of the evolution of criminal justice systems over the past decade, with special attention to the fight against transnational terrorism. It is evident to any observer that the threat and the associated risk that terrorism entails have changed significantly throughout the past decade. This perception has generated responses by States, often radical ones, as they have committed themselves to guaranteeing the safety of their populations and to easing a growing sense of social panic. This thesis seeks to analyse the characteristics of this new threat and the responses that States have developed in the fight against terrorism since 9/11, responses which have called into question some of the essential principles and values of their own legal systems. In this sense, freedom and security are placed in perspective through the analysis of the specific antiterrorist legal reforms of five States: Israel, Portugal, Spain, the United Kingdom and the United States of America. In light of those antiterrorist reforms, it is then asked whether one can speak of the emergence of a new system of criminal justice (and of a process of convergence between common law and civil law systems), built upon a framework of control and preventive security and significantly different from traditional models. Finally, this research project aims to contribute to a better understanding of the economic, social and civilizational costs of those legal reforms for human rights, the rule of law and democracy in modern States.
Abstract:
Creativity is often defined as developing something novel that fits its context and has value. To achieve this, the creative process itself has gained increasing attention as organizational leaders seek competitive advantage through developing new products, services, processes, or business models. In this paper, we explore the notion that the creative process includes a series of "filters", or ways of processing information, as a critical component. We use the metaphor of coffee making and filters because many of our examples come from Vietnam, which is one of the world's top coffee exporters and which has created a coffee culture rivaling many other countries. We begin with a brief review of the creative process and its connection to information processing, propose a tentative framework for integrating the two ideas, and provide examples of how it might work. We close with implications for further practical and theoretical directions for this idea.
Abstract:
BACKGROUND: A hierarchical taxonomy of organisms is a prerequisite for semantic integration of biodiversity data. Ideally, there would be a single, expansive, authoritative taxonomy that includes extinct and extant taxa, information on synonyms and common names, and monophyletic supraspecific taxa that reflect our current understanding of phylogenetic relationships. DESCRIPTION: As a step towards development of such a resource, and to enable large-scale integration of phenotypic data across vertebrates, we created the Vertebrate Taxonomy Ontology (VTO), a semantically defined taxonomic resource derived from the integration of existing taxonomic compilations, and freely distributed under a Creative Commons Zero (CC0) public domain waiver. The VTO includes both extant and extinct vertebrates and currently contains 106,947 taxonomic terms, 22 taxonomic ranks, 104,736 synonyms, and 162,400 cross-references to other taxonomic resources. Key challenges in constructing the VTO included (1) extracting and merging names, synonyms, and identifiers from heterogeneous sources; (2) structuring hierarchies of terms based on evolutionary relationships and the principle of monophyly; and (3) automating this process as much as possible to accommodate updates in source taxonomies. CONCLUSIONS: The VTO is the primary source of taxonomic information used by the Phenoscape Knowledgebase (http://phenoscape.org/), which integrates genetic and evolutionary phenotype data across both model and non-model vertebrates. The VTO is useful for inferring phenotypic changes on the vertebrate tree of life, which enables queries for candidate genes for various episodes in vertebrate evolution.
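A minimal sketch of the name/synonym merging step described in challenge (1), assuming a simple in-memory representation; the records and source identifiers are invented for illustration and do not reflect the VTO's actual data model:

    from collections import defaultdict

    # Invented example records from two source taxonomies:
    # (preferred_name, synonyms, cross-reference to the source)
    sources = [
        ("Danio rerio", {"Brachydanio rerio"}, "SRC-A:4321"),
        ("Danio rerio", {"zebrafish"}, "SRC-B:77"),
    ]

    merged = defaultdict(lambda: {"synonyms": set(), "xrefs": set()})
    for name, synonyms, xref in sources:
        term = merged[name.strip().lower()]   # normalised merge key
        term["name"] = name
        term["synonyms"] |= synonyms          # union synonyms across sources
        term["xrefs"].add(xref)               # keep cross-references to sources

    for term in merged.values():
        print(term["name"], sorted(term["synonyms"]), sorted(term["xrefs"]))

The harder parts the abstract alludes to, such as homonym resolution and placing merged terms under monophyletic parents, sit on top of this basic key-normalise-and-union step.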
Abstract:
Mozambique, with approximately 0.4 physicians and 4.1 nurses per 10,000 people, has one of the lowest ratios of health care providers to population in the world. To rapidly scale up health care coverage, the Mozambique Ministry of Health has pushed for greater investment in training nonphysician clinicians, Técnicos de Medicina (TM). Based on identified gaps in TM clinical performance, the Ministry of Health requested technical assistance from the International Training and Education Center for Health (I-TECH) to revise the two-and-a-half-year preservice curriculum. A six-step process was used to revise the curriculum: (i) conducting a task analysis, (ii) defining a new curriculum approach and selecting an integrated model of subject and competency-based education, (iii) revising and restructuring the 30-month course schedule to emphasize clinical skills, (iv) developing a detailed syllabus for each course, (v) developing content for each lesson, and (vi) evaluating implementation and integrating feedback for ongoing improvement. In May 2010, the Mozambique Minister of Health approved the revised curriculum, which is currently being implemented in 10 training institutions around the country. Key lessons learned: (i) Detailed assessment of training institutions' strengths and weaknesses should inform curriculum revision. (ii) Establishing a Technical Working Group with respected and motivated clinicians is key to promoting local buy-in and ownership. (iii) Providing ready-to-use didactic material helps to address some challenges commonly found in resource-limited settings. (iv) Comprehensive curriculum revision is an important first step toward improving the quality of training provided to health care providers in developing countries. Other aspects of implementation at training institutions and health care facilities must also be addressed to ensure that providers are adequately trained and equipped to provide quality health care services. This approach to curriculum revision and implementation teaches several key lessons, which may be applicable to preservice training programs in other less developed countries.
Abstract:
New product design challenges, related to customer needs, product usage and environments, face companies when they expand their product offerings to new markets; chief among them are the lack of quantifiable information, product experience and field data. Designing reliable products under such challenges requires flexible reliability assessment processes that can capture the variables and parameters affecting overall product reliability and allow different design scenarios to be assessed. These challenges also suggest that a mechanistic (Physics of Failure, PoF) reliability approach would be a suitable framework for reliability assessment. Mechanistic reliability recognizes the primary factors affecting design reliability. This research views the designed entity as a "system of components required to deliver specific operations". It addresses the above challenges by, first, developing a design synthesis that allows descriptive operations/system-component relationships to be realized; second, developing mathematical damage models that evaluate components' Time to Failure (TTF) distributions given (1) the descriptive design model, (2) customer usage knowledge and (3) design material properties; and finally, developing a procedure that integrates the components' damage models to assess the mechanical system's reliability over time. Analytical and numerical simulation models were developed to capture the relationships between operations and components, the mathematical damage models and the assessment of system reliability. The process was able to influence the design form during the conceptual design phase by providing stress goals that meet components' reliability targets, and to numerically assess the reliability of a system from the components' mechanistic TTF distributions, thereby also informing component design during the embodiment phase. The process was used to assess the reliability of an internal combustion engine manifold during the design phase; the results were compared with reliability field data and found to be conservative. The research focused on mechanical systems affected by independent mechanical failure mechanisms that are influenced by the design process; the influence of assembly and manufacturing stresses and defects is not a focus of this research.
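A minimal sketch of the final integration step, under a series-system assumption consistent with the independent failure mechanisms mentioned above and with Weibull TTF distributions standing in for the components' mechanistic damage models; the parameters are illustrative, not the engine-manifold values:

    import numpy as np

    # Illustrative Weibull TTF parameters (shape beta, scale eta in hours)
    components = {
        "gasket":   (1.8, 9000.0),
        "flange":   (2.5, 15000.0),
        "fastener": (1.2, 20000.0),
    }

    def weibull_reliability(t, beta, eta):
        """Component survival probability R(t) = exp(-(t/eta)^beta)."""
        return np.exp(-(t / eta) ** beta)

    def system_reliability(t):
        """Series system of independent failure mechanisms: product of R_i(t)."""
        r = 1.0
        for beta, eta in components.values():
            r *= weibull_reliability(t, beta, eta)
        return r

    for t in (1000.0, 5000.0, 10000.0):
        print(f"R_sys({t:.0f} h) = {system_reliability(t):.3f}")

The product rule is exactly where the independence assumption enters; correlated failure mechanisms would require a joint model instead.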
Abstract:
A modeling strategy is presented to solve the governing equations of fluid flow, temperature (with solidification), and stress in an integrated manner. These equations are discretized using finite volume methods on unstructured grids, which provide the capability to represent complex domains. Both the cell-centered and vertex-based forms of the finite volume discretization procedure are explained, and the overall integrated solution procedure using these techniques with suitable solvers is detailed. Two industrial processes, based on the casting of metals, are used to demonstrate the capabilities of the resultant modeling framework. These manufacturing processes require a high degree of coupling between the governing physical equations to accurately predict potential defects. Comparisons between model predictions and experimental observations are given.
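A minimal sketch of a cell-centred finite volume discretisation in the spirit described, reduced to 1-D transient heat conduction on a uniform grid with explicit time stepping; this is only an illustration of the face-flux bookkeeping, not the paper's coupled unstructured-grid solver:

    import numpy as np

    nx, L = 50, 1.0
    dx = L / nx
    alpha = 1e-4                    # thermal diffusivity (m^2/s), illustrative
    dt = 0.4 * dx**2 / alpha        # inside the explicit stability limit

    T = np.full(nx, 300.0)          # cell-centred temperatures (K)
    T_left, T_right = 600.0, 300.0  # fixed boundary temperatures

    for _ in range(2000):
        # Diffusive fluxes across faces: F = -alpha * dT/dx
        flux = np.empty(nx + 1)
        flux[1:-1] = -alpha * (T[1:] - T[:-1]) / dx
        # Boundary faces: cell centre sits half a cell from the wall
        flux[0] = -alpha * (T[0] - T_left) / (dx / 2)
        flux[-1] = -alpha * (T_right - T[-1]) / (dx / 2)
        # Conservative cell update: net face flux over cell volume
        T -= dt * (flux[1:] - flux[:-1]) / dx

    print(T[:5])  # temperatures near the hot boundary

The same pattern, fluxes assembled per face and summed per cell, carries over to unstructured meshes, where the face loop replaces the array slicing.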
Abstract:
In this paper, a Computational Fluid Dynamics framework is presented for the modelling of key processes involving granular material (i.e. segregation, degradation, caking). Appropriate physical models and sophisticated algorithms have been developed for the correct representation of the different material components in a granular mixture. The parameters governing the various processes, which arise from the micromechanical properties of the different mixture species, can be obtained in a DEM/experimental framework, thus enabling the continuum theory to correctly account for the micromechanical properties of a granular system. The present study establishes the link between the micromechanics and continuum theory and demonstrates the model's capabilities in simulations of processes which are of great importance to the process engineering industry and involve granular materials in complex geometries.
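A minimal sketch of how a DEM-derived coefficient might enter such a continuum model, here a 1-D explicit transport step for the fines fraction in a closed column, with a percolation (segregation) velocity and a remixing diffusivity taken as given; both values are placeholders one would fit from DEM or experiment, and the flux form is a common choice rather than the paper's specific closure:

    import numpy as np

    nx, dz = 100, 0.01            # vertical column of cells
    phi = np.full(nx, 0.5)        # fines volume fraction, initially mixed
    v_seg = 2e-3                  # percolation velocity (m/s), placeholder
    D_mix = 1e-6                  # remixing diffusivity (m^2/s), placeholder
    dt = 0.05

    for _ in range(5000):
        # Upwind segregation flux: fines percolate downward (towards index 0);
        # the phi*(1-phi) factor shuts the flux off in pure regions
        adv = np.zeros(nx + 1)
        adv[1:-1] = -v_seg * phi[1:] * (1.0 - phi[1:])
        dif = np.zeros(nx + 1)
        dif[1:-1] = -D_mix * (phi[1:] - phi[:-1]) / dz
        flux = adv + dif          # boundary faces stay zero: closed column
        phi -= dt * (flux[1:] - flux[:-1]) / dz
        np.clip(phi, 0.0, 1.0, out=phi)

    print(phi[:3], phi[-3:])      # fines enriched at the bottom, depleted at top

The point of the DEM/experimental step in the paper is precisely to supply coefficients like v_seg and D_mix so the continuum solver inherits the micromechanics.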
Abstract:
This paper suggests a possible framework for the encapsulation of the decision-making process for the Waterime project. The final outcome may be a computerised model, but the process advocated is not prescriptive and involves the production of a "paper model" as a mediating representation between the knowledge acquired and any computerised system. This paper model may suffice in terms of the project's goals.
Abstract:
This paper presents a generic framework that can be used to describe study plans using meta-data. The context of this research and the associated technologies and standards are presented. The approach adopted here has been developed within the mENU project, which aims to provide a model for a European Networked University. The methodology for the design of the generic framework is discussed and the main design requirements are presented. The approach is based on a set of templates containing the meta-data required for the description of programs of study, consisting of generic building elements annotated appropriately. The process followed to develop the templates is presented, together with a set of evaluation criteria to test the suitability of the approach. The template structure is presented and example templates are shown. A first evaluation has shown that the proposed framework can provide a flexible and effective means for the generic description of study plans for the purposes of a networked university.
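A minimal sketch of what one annotated building element of such a template might look like, rendered here as a Python structure; the field names are invented for illustration and the mENU templates define their own meta-data schema:

    # Hypothetical study-plan building element with descriptive meta-data
    module_template = {
        "element_type": "module",            # generic building element
        "title": "Introduction to Networked Learning",
        "ects_credits": 5,
        "language": "en",
        "prerequisites": ["MOD-101"],        # references to other elements
        "learning_outcomes": [
            "Describe the architecture of a networked university",
        ],
        "assessment": {"type": "coursework", "weight": 1.0},
    }

    def validate(element, required=("element_type", "title", "ects_credits")):
        """Check an element carries the meta-data the template requires."""
        missing = [f for f in required if f not in element]
        if missing:
            raise ValueError(f"missing meta-data fields: {missing}")
        return True

    validate(module_template)

Machine-checkable templates of this kind are what make study plans comparable across the institutions of a networked university.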
Abstract:
This paper presents a framework to integrate requirements management and design knowledge reuse. The research approach begins with a literature review in design reuse and requirements management to identify appropriate methods within each domain. A framework is proposed based on the identified requirements. The framework is then demonstrated using a case study example: vacuum pump design. Requirements are presented as a component of the integrated design knowledge framework. The proposed framework enables the application of requirements management as a dynamic process, including capture, analysis and recording of requirements. It takes account of the evolving requirements and the dynamic nature of the interaction between requirements and product structure through the various stages of product development.
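A minimal sketch of the kind of dynamic requirements record such a framework implies, covering capture, revision and traceability to an evolving product structure; the fields, statuses and the pump example values are invented for illustration:

    from dataclasses import dataclass, field

    @dataclass
    class Requirement:
        """A captured requirement traced to evolving product structure."""
        req_id: str
        text: str
        status: str = "captured"          # captured -> analysed -> satisfied
        linked_components: list = field(default_factory=list)
        history: list = field(default_factory=list)

        def revise(self, new_text):
            """Record the requirement's evolution rather than overwrite it."""
            self.history.append(self.text)
            self.text = new_text

    r = Requirement("REQ-017", "Ultimate pressure below 1e-3 mbar")
    r.linked_components.append("rotor-assembly")   # trace to product structure
    r.revise("Ultimate pressure below 5e-4 mbar")  # requirement evolves
    print(r.status, r.linked_components, len(r.history))

Keeping the revision history alongside the component links is what turns requirements management into the dynamic process the paper describes, rather than a one-off capture exercise.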