136 results for Critical coupling parameter


Relevance: 20.00%

Abstract:

This thesis explores the possibility of using inductive links for an automotive application in which wiring between the electronic control unit (ECU) and the sensors or detectors is difficult or impossible. Two methods are proposed: 1) monitoring of switched sensors (two possible states) via inductive coupling, and 2) transmission, by the same physical principle, of the power needed to supply remote autonomous sensors. Occupancy and seat-belt detection for removable seats can be implemented with passive wireless systems based on LC resonant circuits, in which the state of the sensors determines the value of the capacitor and, therefore, the resonance frequency. Changes in frequency are detected by a coil located in the floor of the vehicle. The system was successfully tested over a range between 0.5 cm and 3 cm. The experiments were carried out using an impedance analyzer connected to a primary coil, with commercial sensors connected to a remote circuit. The second proposal consists of remotely transmitting power from a coil located in the floor of the vehicle to an autonomous device located in the seat. This device monitors the state of the detectors (occupancy and seat belt) and transmits the data through a commercial radio-frequency transceiver or through the inductive link itself. The coils required for a working frequency below 150 kHz were evaluated, and the most appropriate voltage regulator for achieving maximum overall efficiency was studied. Four types of voltage regulators were analyzed and compared from the point of view of power efficiency. Linear shunt voltage regulators provide better power efficiency than the alternatives: linear series regulators and buck or boost switching regulators. The efficiencies achieved were around 40%, 25%, and 10% for coil distances of 1 cm, 1.5 cm, and 2 cm. Experimental tests showed that the autonomous sensors were correctly powered up to distances of 2.5 cm.
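As a rough illustration of the LC detection principle described above, the sketch below computes the two resonance frequencies of a hypothetical switched-sensor circuit. The coil and capacitor values are invented for illustration, chosen only so that both frequencies fall below the 150 kHz working frequency mentioned in the abstract.

```python
import math

def resonant_frequency(inductance, capacitance):
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance * capacitance))

# Hypothetical values: a 1 mH sensing coil, with the switched sensor
# selecting one of two capacitors depending on its state.
L_COIL = 1e-3                        # 1 mH
C_OPEN, C_CLOSED = 2.2e-9, 4.7e-9    # 2.2 nF / 4.7 nF

f_open = resonant_frequency(L_COIL, C_OPEN)      # ~107 kHz
f_closed = resonant_frequency(L_COIL, C_CLOSED)  # ~73 kHz
```

The reader coil in the vehicle floor only needs to distinguish the two frequencies, so the absolute accuracy of these illustrative values is not important.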

Relevance: 20.00%

Abstract:

PURPOSE: To study the effect of LASIK surgery on straylight (scattered light) and contrast sensitivity. METHODS: Twenty-eight patients were treated with LASIK. Visual quality was assessed before the operation and two months afterwards. RESULTS: Mean straylight and contrast sensitivity did not change between the preoperative assessment and two months after surgery. Only one eye showed a marked increase in straylight. Nine eyes presented a slight decrease in contrast sensitivity. Two complications were found. CONCLUSION: After LASIK, most patients (80%) had no complications and maintained their visual quality. A few patients (16%) had somewhat reduced visual quality. Very few (4%) had clinical complications with decreased visual quality.

Relevance: 20.00%

Abstract:

We study the damage-enhanced creep rupture of disordered materials by means of a fiber bundle model. Broken fibers undergo a slow stress relaxation modeled by a Maxwell element whose stress exponent m can vary over a broad range. Under global load sharing we show that, due to the strength disorder of the fibers, the lifetime t_f of the bundle has sample-to-sample fluctuations characterized by a log-normal distribution, independent of the type of disorder. We determine the Monkman-Grant relation of the model and establish a relation between the rupture life t_f and the characteristic time t_m of the intermediate creep regime of the bundle, where the minimum strain rate is reached, making reliable estimates of t_f possible from short-term measurements. Approaching macroscopic failure, the deformation rate has a finite-time power-law singularity whose exponent is a decreasing function of m. On the microlevel, the distribution of waiting times is found to have power-law behavior with m-dependent exponents that differ below and above the critical load of the bundle. Approaching the critical load from above, the cutoff value of the distributions has a power-law divergence whose exponent coincides with the stress exponent of the Maxwell elements.
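For readers unfamiliar with fiber bundle models, the sketch below shows the global-load-sharing breaking cascade in its simplest, time-independent form (no Maxwell-element relaxation, so none of the creep dynamics of the paper); the uniform thresholds and the survival criterion are illustrative assumptions, not the model studied above.

```python
import random

def bundle_survives(n_fibers, load_per_fiber, rng):
    """Global load sharing: the external load is divided equally among intact
    fibers; each fiber has an iid uniform(0,1) strength threshold, and any
    fiber loaded beyond its threshold breaks, which may trigger an avalanche."""
    thresholds = [rng.random() for _ in range(n_fibers)]
    total_load = load_per_fiber * n_fibers
    while thresholds:
        stress = total_load / len(thresholds)
        survivors = [t for t in thresholds if t >= stress]
        if len(survivors) == len(thresholds):
            return True        # cascade stopped: the bundle holds the load
        thresholds = survivors
    return False               # all fibers broke: macroscopic failure
```

For uniform(0,1) thresholds the critical load per fiber is 1/4, so a large bundle survives loads below 0.25 and fails above it; the sample-to-sample fluctuations mentioned in the abstract appear when this experiment is repeated with different random seeds near the critical load.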

Relevance: 20.00%

Abstract:

Stone groundwood (SGW) is a fibrous material commonly prepared in a high-yield process and mainly used for papermaking applications. In this work, the use of SGW fibers is explored as a reinforcing element of polypropylene (PP) composites. Due to their chemical and surface features, coupling agents are needed to obtain good adhesion and stress transfer across the fiber-matrix interface. The intrinsic strength of the reinforcement is a key parameter for predicting the mechanical properties of the composite and for performing an interface analysis. The main objective of the present work was the determination of the intrinsic tensile strength of stone groundwood fibers. Coupled and non-coupled PP composites reinforced with stone groundwood fibers were prepared. The influence of the surface morphology and of the quality of the interface on the final properties of the composite was analyzed and compared with that of fiberglass-reinforced PP composites. The intrinsic tensile properties of stone groundwood fibers, as well as the fiber orientation factor and the interfacial shear strength of the composites, were determined.
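One common way to back an intrinsic fiber strength out of composite tests is a modified rule of mixtures of the Kelly-Tyson type; the sketch below simply inverts such an expression. This is a generic illustration with invented numbers, not the specific procedure or data of the work above.

```python
def intrinsic_fiber_strength(sigma_composite, sigma_matrix_at_break,
                             fiber_volume_fraction, coupling_factor):
    """Invert a modified rule of mixtures,
        sigma_c = fc * sigma_f * Vf + (1 - Vf) * sigma_m*,
    where fc lumps fiber orientation and length effects, to estimate the
    intrinsic tensile strength sigma_f of the reinforcing fibers."""
    v_f = fiber_volume_fraction
    return (sigma_composite
            - (1.0 - v_f) * sigma_matrix_at_break) / (coupling_factor * v_f)

# Illustrative (hypothetical) values, all strengths in MPa:
sigma_f = intrinsic_fiber_strength(sigma_composite=50.0,
                                   sigma_matrix_at_break=20.0,
                                   fiber_volume_fraction=0.3,
                                   coupling_factor=0.2)
```

With these invented inputs the estimate is (50 - 0.7*20) / (0.2*0.3) = 600 MPa; the point is only the algebraic role of the coupling/orientation factor, not the numbers.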

Relevance: 20.00%

Abstract:

The behavior of stone groundwood / polypropylene injection-molded composites was evaluated with and without coupling agent. Stone groundwood (SGW) is a fibrous material commonly prepared in a high-yield process and mainly used for papermaking applications. In this work, the use of SGW fibers was explored as a reinforcing element of polypropylene (PP) composites. The surface charge density of the composite components was evaluated, as well as the length and diameter of the fibers inside the composite material. Two mixing extrusion processes were evaluated, and the use of a kinetic mixer, instead of an internal mixer, resulted in longer mean lengths of the reinforcing fibers. On the other hand, the accessibility of the surface hydroxyl groups of stone groundwood fibers was improved by treating the fibers with 5% sodium hydroxide, resulting in a noticeable increase of the tensile strength of the composites for a similar percentage of coupling agent. A new parameter, called the fiber tensile strength factor, is defined and used as a baseline for comparing the properties of the different composite materials. Finally, the competitiveness of the stone groundwood / polypropylene / polypropylene-co-maleic anhydride system, which compared favorably with sized glass-fiber / polypropylene (GF/PP) and glass-fiber / polypropylene / polypropylene-co-maleic anhydride composite formulations, was quantified by means of the fiber tensile strength factor.

Relevance: 20.00%

Abstract:

This paper deals with fault detection and isolation problems for nonlinear dynamic systems. Both problems are stated as constraint satisfaction problems (CSPs) and solved using consistency techniques. The main contribution is an isolation method based on consistency techniques and on refining the uncertainty space of the interval parameters. The major advantage of this method is that isolation is fast even when uncertainty in parameters, measurements, and model errors is taken into account. Interval calculations bring independence from the monotonicity assumptions required by several observer-based approaches to fault isolation. An application to a well-known alcoholic fermentation process model is presented.
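The consistency idea can be illustrated with a toy static model: predict an output interval from an interval-valued parameter and flag a fault when the measurement falls outside it. The model y = a*u and all numbers below are invented for illustration; the paper applies consistency techniques to nonlinear dynamic models, not to this trivial case.

```python
def predicted_interval(u, a_low, a_high):
    """Output envelope of the toy model y = a * u when the parameter a is
    only known to lie in the interval [a_low, a_high]."""
    bounds = (a_low * u, a_high * u)
    return min(bounds), max(bounds)

def is_consistent(y_measured, interval, tolerance=0.0):
    """A measurement is consistent if it lies inside the predicted interval
    (inflated by a measurement-error tolerance); otherwise a fault is flagged."""
    low, high = interval
    return low - tolerance <= y_measured <= high + tolerance
```

With u = 2 and a in [0.9, 1.1], a measurement of 2.05 is consistent with the model, while 2.5 is not and would raise a fault hypothesis.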

Relevance: 20.00%

Abstract:

Different procedures to obtain atom-condensed Fukui functions are described. It is shown how the resulting values may differ depending on the exact approach to atom-condensed Fukui functions. The condensed Fukui function can be computed using either the fragment-of-molecular-response approach or the response-of-molecular-fragment approach. The two approaches are nonequivalent; only the latter corresponds, in general, to a population-difference expression. The Mulliken approach does not depend on the approach taken but has some computational drawbacks. The different resulting expressions are tested for a wide set of molecules. In practice one must make seemingly arbitrary choices about how to compute condensed Fukui functions, which suggests questioning the role of these indicators in conceptual density-functional theory.
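As a minimal illustration of the population-difference form mentioned above, the condensed Fukui function for nucleophilic attack can be written f_A(+) = q_A(N) - q_A(N+1); the sketch below applies it to invented atomic charges for a hypothetical two-atom fragment.

```python
def condensed_fukui_plus(charges_n, charges_n_plus_1):
    """Condensed Fukui function f_A(+) = q_A(N) - q_A(N+1): the change in
    each atomic charge when one electron is added to the molecule."""
    return [q_n - q_n1 for q_n, q_n1 in zip(charges_n, charges_n_plus_1)]

# Hypothetical condensed charges for a two-atom fragment in the N-electron
# and (N+1)-electron states; the added electron lowers the total charge by 1.
fukui = condensed_fukui_plus([0.2, -0.2], [-0.3, -0.7])
```

Whatever population scheme supplies the charges, the condensed values must sum to 1 (the total charge changes by exactly one electron), which is a quick sanity check on any implementation.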

Relevance: 20.00%

Abstract:

The computational approach to the Hirshfeld [Theor. Chim. Acta 44, 129 (1977)] atom in a molecule is critically investigated, and several difficulties are highlighted. It is shown that these difficulties are mitigated by an alternative, iterative version of the Hirshfeld partitioning procedure. The iterative scheme ensures that the Hirshfeld definition represents a mathematically proper information entropy, allows the Hirshfeld approach to be used for charged molecules, eliminates arbitrariness in the choice of the promolecule, and increases the magnitudes of the charges. The resulting "Hirshfeld-I charges" correlate well with atomic charges derived from the electrostatic potential.
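The flavor of the iterative scheme can be conveyed with a deliberately simplified one-dimensional sketch: atomic weight functions are fixed, normalized Gaussian shape functions scaled by the current populations, and the populations are iterated to self-consistency. The real Hirshfeld-I method interpolates between atomic densities of different integer charges rather than merely rescaling a fixed shape, so everything below (grid, Gaussians, populations) is an illustrative assumption.

```python
import math

def normalized_gaussian(center, alpha):
    """Unit-integral Gaussian shape function centered at `center`."""
    norm = math.sqrt(alpha / math.pi)
    return lambda x: norm * math.exp(-alpha * (x - center) ** 2)

def iterative_populations(grid, dx, rho_mol, shapes, initial_pops, iterations=50):
    """Iterate stockholder weights w_A = N_A*s_A / sum_B N_B*s_B and
    populations N_A = integral of w_A * rho_mol until self-consistent."""
    pops = list(initial_pops)
    for _ in range(iterations):
        new_pops = [0.0] * len(pops)
        for x, rho in zip(grid, rho_mol):
            promolecule = sum(p * s(x) for p, s in zip(pops, shapes))
            if promolecule > 0.0:
                for a, s in enumerate(shapes):
                    new_pops[a] += pops[a] * s(x) / promolecule * rho * dx
        pops = new_pops
    return pops

# Two overlapping "atoms" at x = -1 and x = +1 carrying 1 and 2 electrons.
dx = 0.01
grid = [-8.0 + i * dx for i in range(1601)]
s1, s2 = normalized_gaussian(-1.0, 1.0), normalized_gaussian(1.0, 1.0)
rho_mol = [1.0 * s1(x) + 2.0 * s2(x) for x in grid]
pops = iterative_populations(grid, dx, rho_mol, (s1, s2), initial_pops=(1.5, 1.5))
```

Starting from the deliberately wrong guess (1.5, 1.5), the iteration recovers populations close to (1, 2), illustrating how self-consistency removes the arbitrariness of the initial promolecule in the spirit the abstract describes.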

Relevance: 20.00%

Abstract:

We have implemented our new procedure for computing Franck-Condon factors using vibrational configuration interaction based on a vibrational self-consistent-field reference. Both Duschinsky rotations and anharmonic three-mode coupling are taken into account. Simulations of the first ionization band of ClO2 and C4H4O (furan), using up to quadruple excitations in the treatment of anharmonicity, are reported and analyzed. A developer version of the MIDASCPP code was employed to obtain the required anharmonic vibrational integrals and transition frequencies.

Relevance: 20.00%

Abstract:

Comparison of donor-acceptor electronic couplings calculated within two-state and three-state models suggests that the two-state treatment can provide unreliable estimates of Vda because it neglects multistate effects. We show that in most cases accurate values of the electronic coupling in a π stack, where donor and acceptor are separated by a bridging unit, can be obtained as Ṽda = (E2 - E1)μ12/Rda + (2E3 - E1 - E2)2μ13μ23/Rda^2, where E1, E2, and E3 are the adiabatic energies of the ground, charge-transfer, and bridge states, respectively, μij is the transition dipole moment between states i and j, and Rda is the distance between the planes of the donor and acceptor. In this expression, based on the generalized Mulliken-Hush approach, the first term corresponds to the coupling derived within a two-state model, whereas the second term is the superexchange correction accounting for the bridge effect. The formula is extended to bridges consisting of several subunits. The influence of the donor-acceptor energy mismatch on the excess charge distribution, adiabatic dipole and transition moments, and electronic couplings is examined. A diagnostic is developed to determine whether the two-state approach can be applied. Based on numerical results, we show that the superexchange correction considerably improves estimates of the donor-acceptor coupling derived within a two-state approach. In most cases where the two-state scheme fails, the formula gives reliable results that are in good agreement (within 5%) with the data of the three-state generalized Mulliken-Hush model.
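The coupling expression above appears to have lost its division signs in extraction; reading it as a two-state GMH term plus a superexchange correction, a direct transcription might look like the following, with all energies, transition dipoles, and the donor-acceptor distance in consistent (e.g. atomic) units and the numbers invented purely to exercise the formula.

```python
def gmh_coupling(e1, e2, e3, mu12, mu13, mu23, r_da):
    """Superexchange-corrected generalized Mulliken-Hush estimate,
        V_da = (E2 - E1)*mu12/R_da + (2*E3 - E1 - E2)*2*mu13*mu23/R_da**2,
    i.e. the two-state GMH coupling plus the bridge (superexchange) correction."""
    two_state = (e2 - e1) * mu12 / r_da
    bridge = (2.0 * e3 - e1 - e2) * 2.0 * mu13 * mu23 / r_da ** 2
    return two_state + bridge

# Invented values, just to show the two contributions:
v = gmh_coupling(e1=0.00, e2=0.10, e3=0.30,
                 mu12=1.0, mu13=0.5, mu23=0.5, r_da=7.0)
```

The diagnostic mentioned in the abstract amounts, in spirit, to checking when the second (bridge) term is non-negligible relative to the first, which is when a bare two-state treatment becomes unreliable.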

Relevance: 20.00%

Abstract:

Electronic coupling Vda is one of the key parameters that determine the rate of charge transfer through DNA. While there have been several computational studies of Vda for hole transfer, estimates of electronic couplings for excess electron transfer (ET) in DNA remain unavailable. In this paper, an efficient strategy is established for calculating the ET matrix elements between base pairs in a π stack. Two approaches are considered. First, we employ the diabatic-state (DS) method, in which donor and acceptor are represented by radical anions of the canonical base pairs adenine-thymine (AT) and guanine-cytosine (GC). In this approach, similar values of Vda are obtained with the standard 6-31G* and extended 6-31++G* basis sets. Second, the electronic couplings are derived from the lowest unoccupied molecular orbitals (LUMOs) of neutral systems by using the generalized Mulliken-Hush or fragment charge methods. Because the radical-anion states of AT and GC are well reproduced by LUMOs of the neutral base pairs calculated without diffuse functions, the estimated values of Vda are in good agreement with the couplings obtained for radical-anion states using the DS method. However, when the calculation of a neutral stack is carried out with diffuse functions, the LUMOs of the system exhibit dipole-bound character and cannot be used for estimating electronic couplings. Our calculations suggest that the ET matrix elements Vda for models containing intrastrand thymine and cytosine bases are substantially larger than the couplings in complexes with interstrand pyrimidine bases. The matrix elements for excess electron transfer are found to be considerably smaller than the corresponding values for hole transfer and to be very responsive to structural changes in a DNA stack.

Relevance: 20.00%

Abstract:

The problem of jointly estimating the number, the identities, and the data of active users in a time-varying multiuser environment was examined in a companion paper (IEEE Trans. Information Theory, vol. 53, no. 9, September 2007), at whose core was the use of the theory of finite random sets on countable spaces. Here we extend that theory to encompass the more general problem of estimating unknown continuous parameters of the active-user signals. This problem is solved by applying the theory of random finite sets constructed on hybrid spaces. We do so by deriving Bayesian recursions that describe the evolution with time of a posteriori densities of the unknown parameters and data. Unlike in the above-cited paper, wherein one could evaluate the exact multiuser set posterior density, here the continuous-parameter Bayesian recursions do not admit closed-form expressions. To circumvent this difficulty, we develop numerical approximations for the receivers based on Sequential Monte Carlo (SMC) methods ("particle filtering"). Simulation results, referring to a code-division multiple-access (CDMA) system, are presented to illustrate the theory.
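Although the receivers in the paper are far more involved, the core SMC recursion (propagate, weight by the likelihood, resample) fits in a few lines. The sketch below is a generic bootstrap particle filter for a scalar random-walk state with Gaussian observations; the model and all parameters are illustrative assumptions, not the multiuser model of the paper.

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles=500,
                              process_std=0.1, obs_std=0.5, seed=0):
    """Bootstrap SMC for x_t = x_{t-1} + N(0, process_std^2) with
    observations y_t = x_t + N(0, obs_std^2); returns posterior-mean estimates."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # 1) propagate each particle through the transition model
        particles = [x + rng.gauss(0.0, process_std) for x in particles]
        # 2) weight by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in particles]
        total = sum(weights)
        estimates.append(sum(w * x for w, x in zip(weights, particles)) / total)
        # 3) multinomial resampling to combat weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# A constant true state of 2.0 observed 30 times; the particle cloud,
# initialized around 0, migrates toward the observations.
estimates = bootstrap_particle_filter([2.0] * 30)
```

The estimate starts biased toward the prior and converges toward 2.0 as evidence accumulates, which is the qualitative behavior exploited by the SMC receivers described above.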

Relevance: 20.00%

Abstract:

“Magic for a Pixeloscope” is a one-hour show conceived to be performed in a theater setting; it merges mixed and augmented reality (MR/AR) and full-body interaction with classical magic to create new tricks. The show was conceived by an interdisciplinary team composed of a magician, two interaction designers, a theater director, and a stage designer. The magician uses custom hardware and software to create new illusions, which serve as a starting point for exploring a new language of magical expression. In this paper we introduce a conceptual framework used to inform the design of the different tricks; we explore the design and production of some of the tricks included in the show, and we describe the feedback received at the world premiere and some of the conclusions obtained.

Relevance: 20.00%

Abstract:

Although both are fundamental terms in the humanities and social sciences, discourse and knowledge have seldom been explicitly related, and even less so in critical discourse studies. After a brief summary of what we know about these relationships in linguistics, psychology, epistemology, and the social sciences, with special emphasis on the role of knowledge in the formation of mental models as a basis for discourse, I examine in more detail how a critical study of discourse and knowledge may be articulated in critical discourse studies. Several areas of critical epistemic discourse analysis are identified and then applied in a study of Tony Blair's Iraq speech of March 18, 2003, in which he sought to legitimize his decision to go to war in Iraq alongside George Bush. The analysis shows the various ways in which knowledge is managed and manipulated at all levels of the discourse of this speech.

Relevance: 20.00%

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance-criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization of the initial solution to generate different alternative starting solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic by simply incorporating our biased randomization process with a high-quality pseudo-random number generator.
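To make the ILS skeleton concrete, here is a bare-bones version for the PFSP: an insertion-based local search, a random job swap as perturbation, and an accept-only-if-better criterion. This is a generic illustration, not the ILS-ESP algorithm itself (which, as described above, uses its own perturbation operator, acceptance rules, and biased-randomized starting solutions).

```python
import random

def makespan(sequence, proc_times):
    """Makespan of a job permutation; proc_times[j][k] is the processing
    time of job j on machine k."""
    n_machines = len(proc_times[0])
    completion = [0.0] * n_machines
    for job in sequence:
        completion[0] += proc_times[job][0]
        for k in range(1, n_machines):
            completion[k] = max(completion[k], completion[k - 1]) + proc_times[job][k]
    return completion[-1]

def local_search(sequence, proc_times):
    """Reinsert each job at its best position until no move improves."""
    seq = list(sequence)
    best_val = makespan(seq, proc_times)
    improved = True
    while improved:
        improved = False
        for a in range(len(seq)):
            job = seq.pop(a)
            val, pos = min((makespan(seq[:p] + [job] + seq[p:], proc_times), p)
                           for p in range(len(seq) + 1))
            seq.insert(pos, job)
            if val < best_val - 1e-9:
                best_val, improved = val, True
    return seq, best_val

def ils_pfsp(proc_times, iterations=50, seed=0):
    """Iterated Local Search: perturb (swap two jobs), re-optimize, accept if better."""
    rng = random.Random(seed)
    current, current_val = local_search(range(len(proc_times)), proc_times)
    best, best_val = current, current_val
    for _ in range(iterations):
        cand = list(current)
        i, j = rng.sample(range(len(cand)), 2)
        cand[i], cand[j] = cand[j], cand[i]     # perturbation: swap two jobs
        cand, val = local_search(cand, proc_times)
        if val < current_val:                    # acceptance: only if better
            current, current_val = cand, val
        if val < best_val:
            best, best_val = cand, val
    return best, best_val
```

On a tiny 3-job, 2-machine instance such as proc = [[3, 2], [1, 4], [2, 1]], the skeleton finds the optimal makespan of 8 (the identity permutation gives 10); real PFSP benchmarks are of course much larger.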