862 results for Low Autocorrelation Binary Sequence Problem
Abstract:
The clinical management of morbid obesity and bariatric surgery requires study and follow-up of the patient. The benefits and risks of weight loss through surgical treatment should serve as a point of alert for health professionals. The use of the questionnaire in the psychology service is guided by psychological listening. Objectives: 1) to describe the socio-demographic profile of candidates for bariatric surgery; 2) to analyze patients' perception of personality characteristics associated with obesity and eating disorders; 3) to describe the psychodynamic contents of the subject's narrative and to assess the unconscious tensional system of two patients by means of the Phillipson Object Relations Test (TRO). Method: the methodological design combined data analysis by the epidemiological method with a psychoanalytically oriented clinical case study. In the first stage, 300 questionnaires from the psychology service were consulted; in the second, two patients with weight regain after 24 months were studied. These were patients who sought treatment at a specialized clinic in a metropolis of southeastern Brazil, under informed consent. The questionnaires were completed by 227 women and 73 men, with a mean age of 36 years; 53% had secondary or higher education; most were married; BMI ranged from severe to super morbid (94.3%). The surgical techniques indicated were the Capella bypass and Fobi-Capella (67%). Results: among the psychological characteristics reported by the patients, anxiety appeared in 93.7% of the responses, followed by impulsivity, depression, low frustration tolerance, low self-esteem and being the solver of other people's problems (more than 50%). Family history included obesity in more than 70%, with depression, and alcohol use in 30%; 30% had undergone psychotherapy and 10% had taken medication for depression and anxiety. In the second stage, a psychodynamic diagnosis was carried out with two patients by means of the Phillipson Object Relations Test, whose analysis indicated the need for psychoanalytic psychotherapy, since both showed fixations in the paranoid-schizoid position and presented difficulty in dealing with losses, low motivation for change and little insight. Conclusions: with the application of the questionnaire and the recording of empirical observations, this semi-structured interview questionnaire fulfills the conditions for better accessing and assessing the contents revealed by the patients. The contradictions between the written answers and the patients' discourse in individual contact with the psychologist point to the need to invest in preparing the patient for surgery and, even more markedly, in psychological follow-up during the first postoperative year. There is a magical thinking, concerning beliefs about the surgery and weight loss, to be worked through during the application of the questionnaire, thereby calling on the patient to take the place of a subject implicated in his or her own pre- and postoperative process. The TRO contributed to the understanding of the psychodynamic diagnosis of patients with weight regain after surgery and reinforced the need for greater investment in the preoperative period. (AU)
Abstract:
We show experimentally and numerically that in high-speed, strongly dispersion-managed standard-fiber soliton systems nonlinear interactions limit the propagation distance. We present results showing that the effect of these interactions can be significantly reduced by appropriate location of the amplifier within the dispersion map. Using this technique, we have been able to extend the propagation distance of 10-Gbit/s 2³¹−1 pseudorandom binary sequence soliton data to 16,500 km over standard fiber by use of dispersion compensation. To our knowledge this distance is the farthest transmission over standard fiber without active control ever reported, and it was achieved with the amplifier placed after the dispersion-compensating fiber in a recirculating loop.
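The 10-Gbit/s test pattern here is a 2³¹−1 pseudorandom binary sequence (PRBS-31). As a brief sketch, a PRBS-31 generator in Python, assuming the common ITU-T O.150 polynomial x³¹ + x²⁸ + 1 (the seed value below is arbitrary):

def prbs31(seed=0x0ACE1ACE, nbits=32):
    """Return nbits of a PRBS-31 pattern from a 31-bit Fibonacci LFSR."""
    state = seed & 0x7FFFFFFF            # 31-bit register; must be nonzero
    out = []
    for _ in range(nbits):
        out.append((state >> 30) & 1)                # output the MSB
        fb = ((state >> 30) ^ (state >> 27)) & 1     # taps 31 and 28
        state = ((state << 1) | fb) & 0x7FFFFFFF
    return out

print(prbs31())  # first 32 bits of the pattern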
Abstract:
We announce the discovery of a new low-mass, pre-main-sequence eclipsing binary, MML 53. Previous observations of MML 53 found it to be a pre-main-sequence spectroscopic multiple associated with the 15-22 Myr Upper Centaurus-Lupus cluster. We identify the object as an eclipsing binary for the first time through the analysis of multiple seasons of time-series photometry from the SuperWASP transiting-planet survey. Re-analysis of a single archive spectrum shows MML 53 to be a spatially unresolved triple system of young stars, all of which exhibit significant lithium absorption. Two of the components comprise an eclipsing binary with period P = 2.097891 ± 0.000005 days and mass ratio q ≈ 0.8. Here, we present the analysis of the discovery data.
Abstract:
We analyze the average performance of a general class of learning algorithms for the NP-complete problem of rule extraction by a binary perceptron. The examples are generated by a rule implemented by a teacher network of similar architecture. A variational approach is used to identify the potential energy that leads to the largest generalization in the thermodynamic limit. We restrict our search to algorithms that always satisfy the binary constraints. A replica-symmetric ansatz leads to a learning algorithm which presents a phase transition in violation of an information-theoretic bound. Stability analysis shows that this is due to a failure of the replica-symmetric ansatz, and the first step of replica symmetry breaking (RSB) is studied. The variational method does not determine a unique potential, but it allows construction of a class with a unique minimum within each first-order valley. Members of this class improve on the performance of the Gibbs algorithm but fail to reach the Bayesian limit in the low-generalization phase. They even fail to reach the performance of the best binary weight vector, an optimal clipping of the barycenter of version space. We find a trade-off between good performance in the low-generalization phase and an early onset of perfect generalization. Although the RSB solution may be locally stable, we discuss the possibility that it fails to be the correct saddle point globally. ©2000 The American Physical Society.
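As a toy illustration of the clipping idea mentioned above (not the variational algorithm the paper constructs), the following Python sketch trains a Hebbian student on examples labeled by a random binary teacher and compares its generalization with that of its clipped, binary-valued version; the sizes and seed are arbitrary:

import numpy as np

rng = np.random.default_rng(0)
N, P = 201, 601                          # odd sizes avoid sign(0) ties
teacher = rng.choice([-1, 1], size=N)    # binary teacher rule

X = rng.choice([-1, 1], size=(P, N))     # random binary inputs
y = np.sign(X @ teacher)                 # labels produced by the teacher

hebb = (y[:, None] * X).sum(axis=0)      # Hebbian (real-valued) student
clipped = np.sign(hebb)                  # clipping enforces the binary constraint

def gen_error(student, trials=20000):
    """Generalization error: disagreement with the teacher on fresh inputs."""
    Xt = rng.choice([-1, 1], size=(trials, N))
    return np.mean(np.sign(Xt @ student) != np.sign(Xt @ teacher))

print("Hebb   :", gen_error(hebb))
print("clipped:", gen_error(clipped))    # clipping typically improves the binary student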
Abstract:
Stream ciphers are encryption algorithms used for ensuring the privacy of digital telecommunications. They have been widely used for encrypting military communications, satellite communications and pay TV, and for voice encryption on both fixed-line and wireless networks. The current multi-year European project eSTREAM, which aims to select stream ciphers suitable for widespread adoption, reflects the importance of this area of research. Stream ciphers consist of a keystream generator and an output function. Keystream generators produce a sequence that appears to be random, which is combined with the plaintext message using the output function. Most commonly, the output function is binary addition modulo two. Cryptanalysis of these ciphers focuses largely on analysis of the keystream generators and of relationships between the generator and the keystream it produces. Linear feedback shift registers (LFSRs) are widely used components in building keystream generators, as the sequences they produce are well understood. Many types of attack have been proposed for breaking various LFSR-based stream ciphers. A recent attack type is known as an algebraic attack. Algebraic attacks transform the problem of recovering the key into the problem of solving a system of multivariate equations, whose solution eventually recovers the internal state bits or the key bits. This type of attack has been shown to be effective on a number of regularly clocked LFSR-based stream ciphers. In this thesis, algebraic attacks are extended to a number of well-known stream ciphers where at least one LFSR in the system is irregularly clocked. Applying algebraic attacks to these ciphers has previously been discussed in the open literature only for LILI-128. In this thesis, algebraic attacks are first applied to keystream generators using stop-and-go clocking. Four ciphers belonging to this group are investigated: the Beth-Piper stop-and-go generator, the alternating step generator, the Gollmann cascade generator and the eSTREAM candidate Pomaranch. It is shown that algebraic attacks are very effective on the first three of these ciphers. Although no effective algebraic attack was found for Pomaranch, the algebraic analysis led to some interesting findings, including weaknesses that may be exploited in future attacks. Algebraic attacks are then applied to keystream generators using (p, q) clocking. Two well-known examples of such ciphers, the step1/step2 generator and the self-decimated generator, are investigated. Algebraic attacks are shown to be very powerful in recovering the internal state of these generators. A more complex clocking mechanism than either stop-and-go or (p, q) clocking is mutual clock control, in which the LFSRs control the clocking of each other. Four well-known stream ciphers belonging to this group are investigated with respect to algebraic attacks: the bilateral stop-and-go generator, the A5/1 stream cipher, the Alpha 1 stream cipher, and the more recent eSTREAM proposal, the MICKEY family of stream ciphers. Some theoretical results regarding the complexity of algebraic attacks on these ciphers are presented. The algebraic analysis showed that, in general, it is hard to generate the system of equations required for an algebraic attack on these ciphers.
As the algebraic attack could not be applied directly to these ciphers, a different approach was used, namely guessing some bits of the internal state in order to reduce the degree of the equations. Finally, an algebraic attack on Alpha 1 that requires only 128 bits of keystream to recover the 128 internal state bits is presented. An essential process associated with stream cipher proposals is key initialization. Many recently proposed stream ciphers use an algorithm to initialize the large internal state with a smaller key and possibly publicly known initialization vectors. The effect of key initialization on the performance of algebraic attacks is also investigated in this thesis; the relationship between the two has not previously been investigated in the open literature. The investigation is conducted on Trivium and Grain-128, two eSTREAM ciphers. It is shown that the key initialization process has an effect on the success of algebraic attacks, unlike other conventional attacks. In particular, the key initialization process allows an attacker first to generate a small number of equations of low degree and then to perform an algebraic attack using multiple keystreams. The effect of the number of iterations performed during key initialization is investigated; it is shown that both the number of iterations and the maximum number of initialization vectors to be used with one key should be carefully chosen. Some experimental results on Trivium and Grain-128 are then presented. Finally, the security with respect to algebraic attacks of the well-known LILI family of stream ciphers, including the unbroken LILI-II, is investigated. These are irregularly clock-controlled nonlinear filter generators. While the structure is defined for the LILI family, a particular parameter choice defines a specific instance; two well-known instances are LILI-128 and LILI-II. The security of these and other instances is investigated to identify which instances are vulnerable to algebraic attacks. The feasibility of recovering the key bits using algebraic attacks is then investigated for both LILI-128 and LILI-II. Algebraic attacks which recover the internal state with less effort than exhaustive key search are possible for LILI-128 but not for LILI-II. Given the internal state at some point in time, the feasibility of recovering the key bits is also investigated, showing that the parameters used in the key initialization process, if poorly chosen, can lead to a key recovery using algebraic attacks.
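For a plain, regularly clocked LFSR with a linear output, the algebraic attack described above degenerates to solving a linear system over GF(2): every keystream bit is a known linear form in the unknown initial state. The Python sketch below, with an arbitrary toy register length and taps, shows only that linear core; the ciphers studied in the thesis add nonlinear filtering and irregular clocking, which raise the degree of the equations:

N = 5                # toy register length (arbitrary)
TAPS = (0, 2)        # feedback taps (arbitrary; tap 0 keeps the update invertible)

def lfsr_stream(state, nbits):
    """Clock the LFSR nbits times, outputting the leftmost bit each step."""
    state = list(state)
    out = []
    for _ in range(nbits):
        out.append(state[0])
        fb = 0
        for t in TAPS:
            fb ^= state[t]
        state = state[1:] + [fb]
    return out

def solve_gf2(A, b):
    """Gauss-Jordan elimination over GF(2) for a square, invertible A."""
    rows = [row[:] + [bit] for row, bit in zip(A, b)]
    n = len(rows)
    for c in range(n):
        p = next(i for i in range(c, n) if rows[i][c])   # pivot search
        rows[c], rows[p] = rows[p], rows[c]
        for i in range(n):
            if i != c and rows[i][c]:
                rows[i] = [x ^ y for x, y in zip(rows[i], rows[c])]
    return [rows[i][-1] for i in range(n)]

secret = [1, 0, 1, 1, 0]                       # unknown initial state
observed = lfsr_stream(secret, 2 * N)[N:]      # attacker sees bits N..2N-1
# Recover the linear form of each observed bit by running the LFSR from
# every unit initial state, then solve the resulting linear system.
cols = [lfsr_stream([int(j == i) for j in range(N)], 2 * N)[N:] for i in range(N)]
A = [[cols[i][t] for i in range(N)] for t in range(N)]
print(solve_gf2(A, observed))                  # -> [1, 0, 1, 1, 0]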
Abstract:
In this paper, we propose a highly reliable fault diagnosis scheme for incipient low-speed rolling element bearing failures. The scheme consists of fault feature calculation, discriminative fault feature analysis, and fault classification. The proposed approach first computes wavelet-based fault features, including the relative wavelet packet node energies and entropies, by applying a wavelet packet transform to an incoming acoustic emission signal. The most discriminative fault features are then filtered from the originally produced feature vector by using discriminative fault feature analysis based on a binary bat algorithm (BBA). Finally, the proposed approach employs one-against-all multiclass support vector machines to identify multiple low-speed rolling element bearing defects. This study compares the proposed BBA-based dimensionality reduction scheme with four other dimensionality reduction methodologies in terms of classification performance. Experimental results show that the proposed methodology is superior to the other dimensionality reduction approaches, yielding average classification accuracies of 94.9%, 95.8%, and 98.4% at bearing rotational speeds of 20 revolutions per minute (RPM), 80 RPM, and 140 RPM, respectively.
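A minimal sketch of the wavelet-packet feature step described above, assuming the PyWavelets package; the wavelet ('db4'), decomposition depth and synthetic test signal are illustrative choices, not the paper's settings:

import numpy as np
import pywt

def wp_features(signal, wavelet="db4", level=3):
    """Relative energy and Shannon entropy for each terminal wavelet packet node."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energies = np.array([np.sum(np.square(n.data)) for n in nodes])
    rel_energy = energies / energies.sum()
    entropies = []
    for n in nodes:
        p = np.square(n.data)
        p = p / p.sum() if p.sum() > 0 else np.full_like(p, 1.0 / len(p))
        entropies.append(-np.sum(p * np.log2(p + 1e-12)))  # node coefficient entropy
    return np.concatenate([rel_energy, entropies])

# usage on a synthetic AE-like signal
t = np.linspace(0, 1, 4096)
sig = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)
print(wp_features(sig).shape)  # 2 * 2**level = 16 features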
Abstract:
The number of drug substances in formulation development in the pharmaceutical industry is increasing. Some of these are amorphous drugs with a glass transition below ambient temperature, and thus they are usually difficult to formulate and handle. One reason for this is the reduced viscosity, related to the stickiness of the drug, which makes them complicated to handle in unit operations. The aim of this thesis was therefore to develop a new processing method for a sticky amorphous model material. Furthermore, the model materials were characterised before and after formulation, using several characterisation methods, to understand more precisely the prerequisites for the physical stability of the amorphous state against crystallisation. The model materials used were monoclinic paracetamol and citric acid anhydrate. Amorphous materials were prepared by melt quenching or by ethanol evaporation. The melt blends were found to have slightly higher viscosity than the ethanol-evaporated materials; however, melt-produced materials crystallised more easily upon consecutive shearing than ethanol-evaporated materials. The only material that did not crystallise during shearing was a 50/50 (w/w, %) blend, regardless of the preparation method, and it was physically stable for at least two years in dry conditions. Shearing at varying temperatures was established as a way to measure the physical stability of amorphous materials under processing and storage conditions. The actual physical stability of the blends was better than that of the pure amorphous materials at ambient temperature. Molecular mobility was not related to the physical stability of the amorphous blends, as observed through crystallisation. The molecular mobility of the 50/50 blend derived from spectral linewidth as a function of temperature using solid-state NMR correlated better with the molecular mobility derived from a rheometer than with that from differential scanning calorimetry data. Based on the results obtained, the effects of molecular interactions, the thermodynamic driving force and the miscibility of the blends are discussed as the key factors in stabilising the blends. The stickiness was found to be affected by glass transition and viscosity. Ultrasound extrusion and cutting were successfully tested to increase the processability of the sticky material. Furthermore, it was found to be possible to process the physically stable 50/50 blend in a supercooled liquid state instead of a glassy state, and the method was not found to accelerate crystallisation. This may open up new possibilities to process amorphous materials that are otherwise impossible to manufacture into solid dosage forms.
Abstract:
The low-frequency (5–100 kHz) dielectric constant ε has been measured in the reduced-temperature range 7 × 10⁻⁵ < t < 8 × 10⁻², where t = (T − Tc)/Tc. Near Tc an exponent ≈0.11 characterizes the power-law behaviour of dε/dt, consistent with the theoretically predicted t^(−α) singularity. However, over the full range of t an exponent ≈0.35 is obtained.
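A brief sketch of how such an exponent can be extracted, fitting dε/dt ∝ t^(−α) by a log-log linear fit; the data below are synthetic, with a made-up amplitude and noise level, not the measured values:

import numpy as np

rng = np.random.default_rng(1)
t = np.logspace(-5, -1.1, 200)     # reduced temperatures spanning the quoted range
deps_dt = 2.0 * t ** (-0.11) * (1 + 0.02 * rng.standard_normal(t.size))  # fake data

# a power law is a straight line in log-log coordinates: log y = log A - alpha * log t
slope, intercept = np.polyfit(np.log10(t), np.log10(deps_dt), 1)
print(f"alpha ~ {-slope:.3f}")     # recovers ~0.11 from the synthetic data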
Abstract:
Expressed sequence tag (EST) databases provide a primary source of nuclear DNA sequences for genetic marker development in non-model organisms. To date, the process has been relatively inefficient for several reasons: (1) priming-site polymorphism in the template leads to inferior or erratic amplification; (2) introns in the target amplicon are too large and/or numerous to allow effective amplification under standard screening conditions; and (3) at least occasionally, a PCR primer straddles an exon–intron junction and is unable to bind to genomic DNA template. The first is only a minor issue for species or strains with low heterozygosity but becomes a significant problem for species with high genomic variation, such as marine organisms with extremely large effective population sizes. Problems arising from unanticipated introns are unavoidable but are most pronounced in intron-rich species, such as vertebrates and lophotrochozoans. We present an approach to marker development in the Pacific oyster Crassostrea gigas, a highly polymorphic and intron-rich species, which minimizes these problems and should be applicable to other non-model species for which EST databases are available. Placement of PCR primers in the 3′ end of the coding sequence and the 3′ UTR improved the PCR success rate from 51% to 97%. Almost all (37 of 39) markers developed for the Pacific oyster were polymorphic in a small test panel of wild and domesticated oysters.
Abstract:
Non-government actors such as think-tanks are playing an important role in Australian policy work. As governments increasingly outsource policy work previously done by education departments and academics to these new policy actors, more think-tanks have emerged, representing a wide range of political views and ideological positions. This paper looks at the emergence of the Grattan Institute as one significant player in Australian education policy, with a particular emphasis on Grattan's report 'Turning around low-performing schools'. Grattan exemplifies many of the facets of Barber's 'deliverology', as it produces reports designed to be easily digested and simply actioned, and to provide reassurance that there is an answer, often by focusing on 'what works' recipes. 'Turning around low-performing schools' is a perfect example of this deliverology. However, a close analysis of the report suggests that it contains four major problems which seriously limit its usefulness for schools and policymakers: it ignores data that may be more important in explaining the turnaround of schools; it is overly reliant on NAPLAN data; there are reasons to be suspicious about the evidence assembled; and it falls into a classic trap of logic, the post hoc fallacy.
Abstract:
Background: The number of available structures of large multi-protein assemblies is quite small. Such structures provide phenomenal insights into the organization, mechanism of formation and functional properties of the assembly, so detailed analysis of such structures is highly rewarding. However, the common problem in such analyses is the low resolution of these structures. In recent times a number of attempts that combine low-resolution cryo-EM data with higher-resolution structures determined by X-ray analysis or NMR, or generated using comparative modeling, have been reported. Even in such attempts, the best result one arrives at is a very coarse idea of the assembly structure, in terms of a trace of the Cα atoms modeled with modest accuracy. Methodology/Principal Findings: In this paper we first present an objective approach to identify potentially solvent-exposed and buried residues solely from the positions of Cα atoms and the amino acid sequence, using residue type-dependent thresholds for the accessible surface areas of Cα. We extend the method further to recognize potential protein-protein interface residues. Conclusion/Significance: Our approach to identify buried and exposed residues solely from the positions of Cα atoms resulted in an accuracy of 84%, sensitivity of 83-89% and specificity of 67-94%, while recognition of interfacial residues corresponded to an accuracy of 94%, sensitivity of 70-96% and specificity of 58-94%. Interestingly, detailed analysis of cases of mismatch between the recognition of interface residues from Cα positions and from all-atom models suggested that recognition of interfacial residues using Cα atoms only corresponds better with the intuitive notion of what an interfacial residue is. Our method should be useful in the objective analysis of structures of protein assemblies when only Cα positions are available, as, for example, in the integration of cryo-EM data with high-resolution structures of the components of the assembly.
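A minimal sketch of the classification idea: label residues buried or exposed from Cα positions alone, using residue type-dependent accessible-surface-area thresholds. The threshold values and the neighbor-count pseudo-ASA below are illustrative assumptions, not the calibrated values from the paper:

import numpy as np

# hypothetical per-residue-type thresholds (Å^2) separating buried from exposed
ASA_THRESHOLD = {"ALA": 15.0, "LEU": 20.0, "LYS": 35.0, "SER": 20.0}

def ca_asa_proxy(coords, radius=10.0, scale=24.0):
    """Crude exposure proxy: fewer Cα neighbors within `radius` Å means more exposed."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    neighbors = (d < radius).sum(axis=1) - 1          # exclude self
    # map neighbor count to a pseudo-ASA: crowded positions get low values
    return np.maximum(0.0, 60.0 * (1.0 - neighbors / scale))

def classify(residue_types, coords):
    asa = ca_asa_proxy(coords)
    return ["exposed" if a > ASA_THRESHOLD.get(r, 20.0) else "buried"
            for r, a in zip(residue_types, asa)]

# usage with a toy 4-residue chain (Cα-Cα spacing ~3.8 Å)
coords = np.array([[0.0, 0, 0], [3.8, 0, 0], [7.6, 0, 0], [11.4, 0, 0]])
print(classify(["ALA", "LEU", "LYS", "SER"], coords))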
Abstract:
Let G = (V, E) be a simple, finite, undirected graph. For $S \subseteq V$, let $\delta(S,G) = \{(u,v) \in E : u \in S \text{ and } v \in V \setminus S\}$ and $\phi(S,G) = \{v \in V \setminus S : \exists u \in S \text{ such that } (u,v) \in E\}$ be the edge and vertex boundary of S, respectively. Given an integer i, 1 ≤ i ≤ |V|, the edge and vertex isoperimetric values at i are defined as $b_e(i,G) = \min_{S \subseteq V,\, |S|=i} |\delta(S,G)|$ and $b_v(i,G) = \min_{S \subseteq V,\, |S|=i} |\phi(S,G)|$, respectively. The edge (vertex) isoperimetric problem is to determine the value of $b_e(i,G)$ ($b_v(i,G)$) for each i, 1 ≤ i ≤ |V|. If we add the further restriction that the set S must induce a connected subgraph of G, the corresponding variation is known as the connected isoperimetric problem, and the connected edge (vertex) isoperimetric values are defined analogously. It turns out that the connected edge and connected vertex isoperimetric values are equal at each i, 1 ≤ i ≤ |V|, when G is a tree; we therefore use the notation $b_c(i,T)$ to denote the connected edge (vertex) isoperimetric value of a tree T at i. Hofstadter introduced the interesting concept of meta-Fibonacci sequences in his famous book "Gödel, Escher, Bach: An Eternal Golden Braid". The sequence he introduced is known as the Hofstadter sequence, and most of the problems he raised regarding it are still open. Since then, mathematicians have studied many other closely related meta-Fibonacci sequences, such as the Tanny, Conway and Conolly sequences. Let $T_2$ be the infinite complete binary tree. In this paper we relate the connected isoperimetric problem on $T_2$ to the Tanny sequence, which is defined by the recurrence relation a(i) = a(i − 1 − a(i − 1)) + a(i − 2 − a(i − 2)), with a(0) = a(1) = a(2) = 1. In particular, we show that $b_c(i, T_2) = i + 2 − 2a(i)$ for each i ≥ 1. We also propose efficient polynomial-time algorithms to find the vertex isoperimetric values at each i for graphs of bounded pathwidth and bounded treewidth.
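A short Python sketch computing the Tanny sequence from the stated recurrence and, from it, the claimed connected isoperimetric values $b_c(i, T_2) = i + 2 - 2a(i)$:

from functools import lru_cache

@lru_cache(maxsize=None)
def a(i: int) -> int:
    """Tanny sequence: a(i) = a(i-1-a(i-1)) + a(i-2-a(i-2)), a(0)=a(1)=a(2)=1."""
    if i <= 2:
        return 1
    return a(i - 1 - a(i - 1)) + a(i - 2 - a(i - 2))

def b_c(i: int) -> int:
    """Claimed connected isoperimetric value of the infinite complete binary tree at i."""
    return i + 2 - 2 * a(i)

print([a(i) for i in range(13)])      # [1, 1, 1, 2, 2, 2, 3, 4, 4, 4, 4, 5, 6]
print([b_c(i) for i in range(1, 13)])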