381 results for STRINGS
Abstract:
We define Picard cycles on each smooth three-sheeted Galois cover C of the Riemann sphere. The moduli space of all these algebraic curves is a nice Shimura surface, namely a symmetric quotient of the projective plane uniformized by the complex two-dimensional unit ball. We show that all Picard cycles on C form a simple orbit of the Picard modular group of Eisenstein numbers. The proof uses a special surface classification in connection with the uniformization of a classical Picard-Fuchs system. It yields an explicit symplectic representation of the braid groups (coloured or not) of four strings.
Abstract:
Battery energy storage systems have traditionally been manufactured from new batteries with good reliability. The high cost of such systems has prompted investigation of second-life transportation batteries as an alternative energy storage capability. However, the reliability and performance of these batteries are unclear, and multi-modular power electronics with redundancy have been suggested as a means of addressing this issue. This paper reviews work already undertaken on battery failure rates to suggest suitable figures for use in reliability calculations. The paper then uses reliability analysis and a numerical example to investigate six different multi-modular topologies and suggests how the number of series battery strings and the degree of power electronic module redundancy should be determined for the lowest hardware cost. The results reveal that the cascaded dc-side modular topology with a single inverter is the lowest-cost solution for a range of battery failure rates.
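The reliability reasoning behind such topology comparisons can be illustrated with a minimal sketch, assuming independent, identically reliable cells and a k-of-n binomial redundancy model (the paper's actual failure-rate figures and topologies are not reproduced here):

```python
import math

def string_reliability(cell_rel: float, cells_in_series: int) -> float:
    """A series string works only if every cell in it works."""
    return cell_rel ** cells_in_series

def k_of_n_reliability(module_rel: float, n: int, k: int) -> float:
    """A system of n identical, independent modules survives if at
    least k remain healthy (binomial redundancy model)."""
    return sum(
        math.comb(n, i) * module_rel**i * (1 - module_rel)**(n - i)
        for i in range(k, n + 1)
    )

# Illustrative numbers: 10 series cells per string, 10 modules
# installed where 8 are needed (2 redundant)
r_module = string_reliability(cell_rel=0.999, cells_in_series=10)
r_system = k_of_n_reliability(r_module, n=10, k=8)
```

Comparing `r_system` across candidate string counts and redundancy levels, weighted by hardware cost, is the kind of trade-off the paper's numerical example works through.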
Abstract:
* Work supported by the Lithuanian State Science and Studies Foundation.
Abstract:
* Supported by projects CCG08-UAM TIC-4425-2009 and TEC2007-68065-C03-02
Abstract:
We investigate the pattern-dependent decoding failures in full-field electronic dispersion compensation (EDC) by offline processing of experimental signals, and find that the performance of such an EDC receiver may be degraded by an isolated "1" bit surrounded by long strings of consecutive "0s". By reducing the probability of occurrence of this kind of isolated "1" and using a novel adaptive threshold decoding method, we greatly improve the compensation performance to achieve 10-Gb/s on-off keyed signal transmission over 496-km field-installed single-mode fiber without optical dispersion compensation.
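As a loose illustration of the adaptive-threshold idea (not the paper's actual algorithm), a toy decoder might lower its decision threshold after a run of zeros so that an isolated "1" is less likely to be missed:

```python
def decode(samples, base_threshold=0.5, window=5, alpha=0.3):
    """Toy adaptive-threshold decoder. `samples` are received signal
    levels in [0, 1]; the threshold is lowered in proportion to how
    zero-heavy the last `window` decisions were. Illustrative only."""
    bits = []
    for s in samples:
        recent = bits[-window:]
        zero_frac = recent.count(0) / len(recent) if recent else 0.0
        # a long string of "0"s pulls the threshold down by up to alpha
        threshold = base_threshold * (1 - alpha * zero_frac)
        bits.append(1 if s >= threshold else 0)
    return bits

# An isolated weak "1" (0.4) after five zeros is still recovered,
# whereas a fixed threshold of 0.5 would miss it
print(decode([0.1, 0.1, 0.1, 0.1, 0.1, 0.4]))
```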
Abstract:
Georgi Dimkov - Watching the performance of a violinist, we perceive that the four strings of the instrument produce tones of different pitches. It is clear that the artist presses the strings at special places and that this changes the pitch. These places are determined practically by the musicians. Is it possible to determine these places theoretically, from some abstract point of view? According to legend, the first successful investigations in this field were done by Pythagoras. The development of the ideas for improving and extending the results of Pythagoras is the main topic of the present paper.
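The Pythagorean answer rests on the observation that pitch varies inversely with sounding length, so simple length ratios locate the stopping places; a minimal sketch, with the interval ratio supplied by the caller:

```python
def stop_position(string_length: float, ratio_num: int, ratio_den: int) -> float:
    """Distance from the nut at which to stop a string so the sounding
    length is ratio_den/ratio_num of the open string: e.g. 2:1 for the
    octave, 3:2 for the perfect fifth (frequency ~ 1/length)."""
    sounding_length = string_length * ratio_den / ratio_num
    return string_length - sounding_length

# On a 330 mm string, the octave lies at half the length
print(stop_position(330.0, 2, 1))  # 165.0 mm from the nut
```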
Abstract:
Photovoltaic (PV) stations have been widely built in the world to utilize solar energy directly. In order to reduce the capital and operational costs, early fault diagnosis is playing an increasingly important role by enabling the long effective operation of PV arrays. This paper analyzes the terminal characteristics of faulty PV strings and arrays, and it develops a PV array fault diagnosis technique. The terminal current-voltage curve of a faulty PV array is divided into two sections, i.e., high-voltage and low-voltage fault diagnosis sections. The corresponding working points of healthy string modules and of healthy and faulty modules in an unhealthy string are then analyzed for each section. By probing into different working points, a faulty PV module can be located. The fault information is of critical importance for the maximum power point tracking and the array dynamical reconfiguration. Furthermore, the string current sensors can be eliminated, and the number of voltage sensors can be reduced by optimizing voltage sensor locations. Typical fault scenarios including monostring, multistring, and a partial shadow for a 1.6-kW 3 × 3 PV array are presented and experimentally tested to confirm the effectiveness of the proposed fault diagnosis method.
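A much-simplified stand-in for the paper's working-point analysis, flagging strings whose measured current deviates from the array median (the actual method works section by section on the I-V curve and does not need string current sensors):

```python
def find_suspect_strings(string_currents, tolerance=0.15):
    """Return indices of strings whose current deviates from the
    median by more than `tolerance` (as a fraction). A shaded or
    faulty string typically produces noticeably less current."""
    ordered = sorted(string_currents)
    median = ordered[len(ordered) // 2]
    return [
        idx for idx, i_str in enumerate(string_currents)
        if median > 0 and abs(i_str - median) / median > tolerance
    ]

# 3-string array: string 1 is partially shaded
print(find_suspect_strings([5.1, 3.2, 5.0]))  # -> [1]
```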
Abstract:
Since the 1950s, the theory of deterministic and nondeterministic finite automata (DFAs and NFAs, respectively) has been a cornerstone of theoretical computer science. In this dissertation, our main object of study is minimal NFAs. In contrast with minimal DFAs, minimal NFAs are computationally challenging: first, there can be more than one minimal NFA recognizing a given language; second, the problem of converting an NFA to a minimal equivalent NFA is NP-hard, even for NFAs over a unary alphabet. Our study is based on the development of two main theories, inductive bases and partials, which in combination form the foundation for an incremental algorithm, ibas, to find minimal NFAs. An inductive basis is a collection of languages with the property that it can generate (through union) each of the left quotients of its elements. We prove a fundamental characterization theorem which says that a language can be recognized by an n-state NFA if and only if it can be generated by an n-element inductive basis. A partial is an incompletely-specified language. We say that an NFA recognizes a partial if its language extends the partial, meaning that the NFA’s behavior is unconstrained on unspecified strings; it follows that a minimal NFA for a partial is also minimal for its language. We therefore direct our attention to minimal NFAs recognizing a given partial. Combining inductive bases and partials, we generalize our characterization theorem, showing that a partial can be recognized by an n-state NFA if and only if it can be generated by an n-element partial inductive basis. We apply our theory to develop and implement ibas, an incremental algorithm that finds minimal partial inductive bases generating a given partial. In the case of unary languages, ibas can often find minimal NFAs of up to 10 states in about an hour of computing time; with brute-force search this would require many trillions of years.
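The defining property of an inductive basis can be checked directly for finite languages; the sketch below represents languages as frozensets of strings and treats the empty union as generating the empty language (an assumption of this sketch; the dissertation's definitions are more general):

```python
def left_quotient(lang, symbol):
    """u^{-1}L = { w : u w in L }, here for a single symbol u."""
    return frozenset(w[1:] for w in lang if w.startswith(symbol))

def is_inductive_basis(basis, alphabet):
    """Check the defining property: every left quotient of every
    basis element is a union of basis elements (finite-language
    sketch only)."""
    def all_unions(sets):
        out = {frozenset()}  # the empty union generates the empty language
        for s in sets:
            out |= {u | s for u in out}
        return out
    closure = all_unions(basis)
    return all(
        left_quotient(b, a) in closure
        for b in basis for a in alphabet
    )

# Over the unary alphabet {'a'}: {'', 'a'} quotients to {''},
# which is itself a basis element, so this basis is inductive
print(is_inductive_basis([frozenset({''}), frozenset({'', 'a'})], ['a']))
```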
Abstract:
This study aims to investigate the identity constructions of Mossoró in the statements of cordel (string) literature. Taking as its thread the episode of the city's resistance against Lampião's band on 13 June 1927, our study was built on the statements present in nine cordel strings produced between 1927 and 2007, the year in which Mossoró celebrated the eightieth anniversary of the episode. Aware that the theme went beyond the limits of the media of the time and became part of everyday life in Mossoró, producing in the collective memory the image of a city of resistance (in street names, in company names, in radio stations such as "FM Resistance", in the discourses of politicians, and in the city hall, named the "Palace of the Resistance"), the central question guiding our investigation concerns the dialogical relations established in the statements about this theme. Accordingly, the research adopted as categories of analysis the concepts of social voices and chronology, considering that different identities are produced according to the positions taken by the subjects as well as by the context of production. Situated in the area of Applied Linguistics (AL), in its line of language and social practice, the research articulates theorizations from Cultural Studies (especially regarding identity) with the theoretical framework of the Bakhtinian Circle (regarding the social-historical conception of language and its dialogical character). The results indicate that, even though there is an axiological movement around the representations of Mossoró and of the 1927 episode, the statements of the cordel strings converge toward a hegemonic discourse, corroborating the identity profile of resistance carried across eight decades.
Abstract:
Recognized for his relevant writing for the cello, Silvio Ferraz wrote, in 2012, Segundo Responsório for solo cello and chamber group, which followed Responsório ao Vento, a version of the same piece for solo cello. The work is characterized by the idea of sound in continuous motion through different textures, timbres, dynamics and musical gestures. The composer uses extended techniques, such as large sections played sul tasto on three strings simultaneously, trills of natural harmonics, muffled trills with natural harmonics, col legno battuto, different types of glissando, and simultaneous harmonic and non-harmonic sounds, which contribute to a wealth of sounds and layers that create different textures. This article investigates the composer's relationship with the cello, relates Responsório ao Vento to his other works, and studies the influences of the composer, addressing technical and interpretive aspects of the piece drawn from performance experiences.
Abstract:
The task of expression undertaken by the performer falls largely on the guitarist's right hand. Aware of this fact, past and present masters have left their contributions to the development of right-hand technique. It is clear that, with rare exceptions, educational and interpretative proposals have so far approached the attack on the strings through the flexion of the fingers. This work, however, presents a technical resource called imalt, which includes the extension movement in the attack action. Some techniques used in specific circumstances, such as the dedillo, the alzapúa, the tremolo and the rasgueado, also use extension movements in the attack. They are put in perspective with the imalt, providing a panoramic view of their individual characteristics. The use of the imalt in the traditional guitar repertoire is exemplified in Villa-Lobos, Ponce and Brouwer. Three pieces were composed for this work: Shravana, Alegoria and Vandana. Compositional techniques such as melodic contour application and ostinato were reviewed and used in the preparation of these compositions. A detailed record of the compositional trajectory is presented, following the Model for the Compositional Process Accompaniment according to Silva (2007). Some events that have brought the imalt into evidence are reported, such as the launch and distribution of the Compact Disc (CD) Imalt, the publication of scores, and interviews. Finally, concluding comments are presented, pointing to possibilities opened up by this work.
Abstract:
Our key contribution is a flexible, automated marking system that adds desirable functionality to existing E-Assessment systems. In our approach, any given E-Assessment system is relegated to a data-collection mechanism, whereas marking and the generation and distribution of personalised per-student feedback is handled separately by our own system. This allows content-rich Microsoft Word feedback documents to be generated and distributed to every student simultaneously according to a per-assessment schedule.
The feedback is adaptive in that it corresponds to the answers given by the student and provides guidance on where they may have gone wrong. It is not limited to simple multiple-choice questions, which are the most prescriptive question type offered by most E-Assessment systems and as such the most straightforward to mark consistently and to provide with individual per-alternative feedback strings. Our system is also better equipped to handle the use of mathematical symbols and images within the feedback documents, making it more flexible than existing E-Assessment systems, which can only handle simple text strings.
As well as MCQs, the system reliably and robustly handles Multiple Response, Text Matching and Numeric style questions in a more flexible manner than Questionmark Perception and other E-Assessment systems. It can also reliably handle multi-part questions where the response to an earlier question influences the answer to a later one, and can adjust both scoring and feedback appropriately.
New question formats can be added at any time, provided a corresponding marking method conforming to certain templates can also be programmed. Indeed, any question type for which a programmatic method of marking can be devised may be supported by our system. Furthermore, since the student's response to each question is marked programmatically, our system can be set to allow for minor deviations from the correct answer and, if appropriate, award partial marks.
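A programmatic marking method of the kind described, with tolerance for minor deviations and partial marks, might look like the following hypothetical sketch (the thresholds and mark scheme are illustrative, not the system's actual rules):

```python
def mark_numeric(response: str, correct: float,
                 tolerance: float = 0.01, full_marks: int = 2) -> int:
    """Mark a numeric answer: full marks within `tolerance` (relative
    error), partial marks within 10x tolerance, zero otherwise.
    Hypothetical scheme for illustration."""
    try:
        value = float(response)
    except ValueError:
        return 0  # an unparseable response scores zero
    error = abs(value - correct) / max(abs(correct), 1e-12)
    if error <= tolerance:
        return full_marks
    if error <= 10 * tolerance:
        return full_marks // 2  # minor deviation: partial credit
    return 0

print(mark_numeric("3.1416", 3.14159))  # full marks: 2
```

A marking method like this, conforming to an agreed template, is what a new question format would plug into.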
Abstract:
The field of Endodontics is in constant progress. The materials used in endodontic instruments were originally based on piano strings. A phase followed in which instruments were made of carbon steel, but these suffered significant corrosion due to the chlorine present in sodium hypochlorite, as well as to steam sterilization processes. Further evolution was needed, and stainless steel instruments were introduced. These offered high strength and hardness, but some disadvantages due to their lack of flexibility. Currently, NiTi instruments provide better flexibility and a shape-memory effect. Instrument fracture in Endodontics can occur through two major mechanisms: torsion and cyclic fatigue flexure, or a combination of both. Anatomical factors, such as canal curvature and width, or other factors such as sterilization cycles, number of uses, etc., can contribute to earlier instrument fracture. The incidence of instrument fracture, although low, can be reduced to an absolute minimum if clinicians use appropriate torque and stress settings. Good knowledge of clinical procedures, anatomy and materials, and the use of tools such as the microscope, can help to prevent or to resolve instrument fracture. Even so, the best way to deal with fracture is to prevent it. Disinfection is the most important procedure for the success of an endodontic treatment, and for it to be possible, good canal shaping is necessary. The presence of a fractured instrument inside the canal can compromise disinfection, especially if the fracture occurred at an early stage of canal preparation.
When an instrument fractures, the procedures to follow must be carefully considered. Several approaches are possible, namely keeping the instrument in the canal and obturating with the fragment incorporated; removing the segment by various techniques (ultrasonics, microtube techniques, etc.); bypassing the fragment; or endodontic surgery. As a last resort, extraction of the tooth may be performed.
Abstract:
Scheduling problems are generally NP-hard combinatorial problems, and much research has been done to solve them heuristically. However, most of the previous approaches are problem-specific, and research into the development of a general scheduling algorithm is still in its infancy. Mimicking the natural evolutionary process of the survival of the fittest, Genetic Algorithms (GAs) have attracted much attention in solving difficult scheduling problems in recent years. Some obstacles exist when using GAs: there is no canonical mechanism to deal with constraints, which are commonly met in most real-world scheduling problems, and small changes to a solution are difficult. To overcome both difficulties, indirect approaches have been presented (in [1] and [2]) for nurse scheduling and driver scheduling, where GAs are used to map the solution space, and separate decoding routines then build solutions to the original problem. In our previous indirect GAs, learning is implicit and is restricted to the efficient adjustment of weights for a set of rules that are used to construct schedules. The major limitation of those approaches is that they learn in a non-human way: like most existing construction algorithms, once the best weight combination is found, the rules used in the construction process are fixed at each iteration. However, normally a long sequence of moves is needed to construct a schedule, and using fixed rules at each move is thus unreasonable and not coherent with human learning processes. When a human scheduler is working, he normally builds a schedule step by step following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not yet complete, thus having the ability to finish a schedule by using flexible, rather than fixed, rules.
In this research we intend to design more human-like scheduling algorithms, by using ideas derived from Bayesian Optimization Algorithms (BOA) and Learning Classifier Systems (LCS) to implement explicit learning from past solutions. BOA can be applied to learn to identify good partial solutions and to complete them by building a Bayesian network of the joint distribution of solutions [3]. A Bayesian network is a directed acyclic graph with each node corresponding to one variable, and each variable corresponding to an individual rule by which a schedule will be constructed step by step. The conditional probabilities are computed according to an initial set of promising solutions. Subsequently, each new instance for each node is generated by using the corresponding conditional probabilities, until values for all nodes have been generated. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the Bayesian network is updated again using the current set of good rule strings. The algorithm thereby tries to explicitly identify and mix promising building blocks. It should be noted that for most scheduling problems the structure of the network model is known and all the variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data. Thus learning can amount to 'counting' in the case of multinomial distributions. In the LCS approach, each rule has a strength showing its current usefulness in the system, and this strength is constantly assessed [4]. To implement sophisticated learning based on previous solutions, an improved LCS-based algorithm is designed, which consists of the following three steps. The initialization step is to assign each rule at each stage a constant initial strength. Then rules are selected by using the Roulette Wheel strategy.
The next step is to reinforce the strengths of the rules used in the previous solution, keeping the strength of unused rules unchanged. The selection step is to select fitter rules for the next generation. It is envisaged that the LCS part of the algorithm will be used as a hill climber for the BOA algorithm. This is exciting and ambitious research, which might provide the stepping-stone for a new class of scheduling algorithms. Data sets from nurse scheduling and mall problems will be used as test-beds. It is envisaged that once the concept has been proven successful, it will be implemented into general scheduling algorithms. It is also hoped that this research will give some preliminary answers about how to include human-like learning into scheduling algorithms, and may therefore be of interest to researchers and practitioners in the areas of scheduling and evolutionary computation. References 1. Aickelin, U. and Dowsland, K. (2003) 'Indirect Genetic Algorithm for a Nurse Scheduling Problem', Computers & Operations Research (in press). 2. Li, J. and Kwan, R.S.K. (2003) 'Fuzzy Genetic Algorithm for Driver Scheduling', European Journal of Operational Research 147(2): 334-344. 3. Pelikan, M., Goldberg, D. and Cantu-Paz, E. (1999) 'BOA: The Bayesian Optimization Algorithm', IlliGAL Report No 99003, University of Illinois. 4. Wilson, S. (1994) 'ZCS: A Zeroth-level Classifier System', Evolutionary Computation 2(1), pp 1-18.
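The LCS steps described above (constant initial strengths, roulette-wheel selection, reinforcement of rules used in the previous solution) can be sketched as follows, with the reward value chosen arbitrarily for illustration:

```python
import random

def roulette_select(strengths, rng=random):
    """Pick a rule index with probability proportional to its strength
    (the Roulette Wheel strategy)."""
    total = sum(strengths)
    pick = rng.uniform(0, total)
    cumulative = 0.0
    for idx, strength in enumerate(strengths):
        cumulative += strength
        if pick <= cumulative:
            return idx
    return len(strengths) - 1  # guard against floating-point rounding

def reinforce(strengths, used_rules, reward=0.1):
    """Strengthen the rules used in the previous solution; unused
    rules keep their strength unchanged, as the abstract describes."""
    return [
        s + reward if idx in used_rules else s
        for idx, s in enumerate(strengths)
    ]

# One learning cycle: all rules start with equal constant strength,
# and the rule used in the last schedule (index 1) is reinforced
strengths = reinforce([1.0, 1.0, 1.0], used_rules={1})
chosen = roulette_select(strengths)
```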