939 results for Binomial theorem.


Relevance:

10.00%

Publisher:

Abstract:

After introducing the no-cloning theorem and the most common forms of approximate quantum cloning, universal quantum cloning is considered in detail. Its connections with the universal NOT gate, quantum cryptography and state estimation are presented and briefly discussed. The state estimation connection is used to show that the amount of extractable classical information and the total Bloch vector length are conserved in universal quantum cloning. The 1 → 2 qubit cloner is also shown to obey a complementarity relation between local and nonlocal information. These results are interpreted as a consequence of the conservation of total information in cloning. Finally, the performance of the 1 → M cloning network discovered by Bužek, Hillery and Knight is studied in the presence of decoherence using the approach of Barenco et al., in which random phase fluctuations are attached to 2-qubit gates. The expression for the average fidelity is calculated for three cases and is found to depend on the optimal fidelity and the average of the phase fluctuations in a specific way; this is conjectured to be the form of the average fidelity in the general case. While the cloning network is found to be rather robust, it is nevertheless argued that the scalability of the quantum network implementation is poor: studying the effect of decoherence during the preparation of the initial state of the cloning machine in the 1 → 2 case shows that the loss in average fidelity can be large. This affirms the result of Maruyama and Knight, who reached the same conclusion in a slightly different manner.
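
As numerical background (the formula is standard and not quoted in the abstract itself), the optimal single-copy fidelity of a universal N → M qubit cloner is F = (MN + M + N) / (M(N + 2)); the sketch below evaluates it for the 1 → 2 and 1 → M cases discussed above.

```python
from fractions import Fraction

def universal_cloning_fidelity(n: int, m: int) -> Fraction:
    """Optimal single-copy fidelity of a universal N -> M qubit cloner
    (Gisin-Massar formula), given here only as background context."""
    return Fraction(m * n + m + n, m * (n + 2))

# 1 -> 2 cloner: the familiar optimal fidelity 5/6
print(universal_cloning_fidelity(1, 2))   # 5/6
# 1 -> M cloners approach the measure-and-prepare limit 2/3 as M grows
for m in (3, 5, 10, 100):
    print(m, float(universal_cloning_fidelity(1, m)))
```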

Relevance:

10.00%

Publisher:

Abstract:

The superconducting gap is a basic characteristic of a superconductor. While the cuprates and conventional phonon-mediated superconductors are characterized by distinct d- and s-wave pairing symmetries with nodal and nodeless gap distributions, respectively, the superconducting gap distributions in iron-based superconductors are rather diverse. While nodeless gap distributions have been directly observed in Ba1–xKxFe2As2, BaFe2–xCoxAs2, LiFeAs, KxFe2–ySe2, and FeTe1–xSex, signatures of a nodal superconducting gap have been reported in LaOFeP, LiFeP, FeSe, KFe2As2, BaFe2–xRuxAs2, and BaFe2(As1–xPx)2. Owing to the multiband Fermi surface of these compounds, s± and d pairing states can be either nodeless or nodal. A nontrivial orbital structure of the order parameter, in particular the presence of gap nodes, leads to disorder effects that are much richer in dx2–y2-wave superconductors than in conventional materials. In contrast to the s-wave case, the Anderson theorem does not apply, and nonmagnetic impurities exert a strong pair-breaking influence. In addition, a finite concentration of disorder produces a nonzero density of quasiparticle states at zero energy, which results in a considerable modification of the thermodynamic and transport properties at low temperatures. The influence of order parameter symmetry on the vortex core structure in iron-based pnictide and chalcogenide superconductors has been investigated in the framework of the quasiclassical Eilenberger equations. The main results of the thesis are as follows. The vortex core characteristics, such as the cutoff parameter ξh and the core size ξ2 (defined as the distance at which the density of the vortex supercurrent reaches its maximum), are calculated over wide ranges of temperature, impurity scattering rate, and magnetic field. The cutoff parameter ξh(B; T; Г) determines the form factor of the flux-line lattice, which can be obtained in μSR, NMR, and SANS experiments. A comparison among the applied pairing symmetries is made. In contrast to s-wave systems, in dx2–y2-wave superconductors ξh/ξc2 always increases with the scattering rate Г. The field dependence of the cutoff parameter strongly affects the second moment of the magnetic field distribution, resulting in a significant difference from nonlocal London theory. It is found that the normalized ξ2/ξc2(B/Bc2) dependence increases with pair-breaking impurity scattering (interband scattering for s±-wave and intraband impurity scattering for d-wave superconductors). Here, ξc2 is the Ginzburg-Landau coherence length determined from the upper critical field Bc2 = Φ0/(2πξc2²), where Φ0 is the flux quantum. Two types of ξ2/ξc2 magnetic field dependence are obtained for s± superconductors: the first has a minimum and occurs at low temperatures and weak impurity scattering, transforming into a monotonically decreasing function at strong scattering and high temperatures. The second kind of dependence is also found for d-wave superconductors at intermediate and high temperatures. In contrast, impurity scattering reduces the ξ2/ξc2(B/Bc2) dependence in s++ superconductors. Reasonable agreement is found between the calculated ξh/ξc2 values and those obtained experimentally in nonstoichiometric BaFe2–xCoxAs2 (μSR) and stoichiometric LiFeAs (SANS). The values of ξh/ξc2 are much smaller than unity for the first compound and much larger than unity for the second. This is explained by the differing influence of two factors: the impurity scattering rate and the pairing symmetry.
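
For quick reference, the relation Bc2 = Φ0/(2πξc2²) quoted above can be inverted to obtain the Ginzburg-Landau coherence length from a measured upper critical field; a minimal sketch with an illustrative (not thesis-specific) field value:

```python
import math

PHI_0 = 2.067833848e-15  # magnetic flux quantum, Wb

def coherence_length(b_c2_tesla: float) -> float:
    """Ginzburg-Landau coherence length xi_c2 (in metres) obtained by
    inverting B_c2 = Phi_0 / (2 * pi * xi_c2**2)."""
    return math.sqrt(PHI_0 / (2 * math.pi * b_c2_tesla))

# Illustrative value only; not a measured B_c2 from the thesis.
print(coherence_length(20.0) * 1e9, "nm")  # roughly a few nanometres
```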

Relevance:

10.00%

Publisher:

Abstract:

In this thesis we examine four well-known and traditional concepts of combinatorics on words. However, the contexts in which these topics are treated are not the traditional ones. More precisely, the question of avoidability is asked, for example, in terms of k-abelian squares. Two words are said to be k-abelian equivalent if they have the same number of occurrences of each factor up to length k. Consequently, k-abelian equivalence can be seen as a sharpening of abelian equivalence. This fairly new concept is discussed more broadly than the other topics of this thesis. The second main subject concerns the defect property. The defect theorem is a well-known result for words. We analyze the property, for example, among sets of 2-dimensional words, i.e., polyominoes composed of labelled unit squares. From the defect effect we move to equations. We use a special way to define a product operation for words and then solve a few basic equations over the constructed partial semigroup. We also consider the satisfiability question and the compactness property with respect to this kind of equation. The final topic of the thesis deals with palindromes. Some finite words, including all binary words, are uniquely determined up to word isomorphism by the positions and lengths of some of their palindromic factors. The famous Thue-Morse word has the property that for each positive integer n, there exists a factor which cannot be generated by fewer than n palindromes. We prove that, in general, every word that is not ultimately periodic contains a factor which cannot be generated by fewer than 3 palindromes, and we obtain a classification of those binary words each of whose factors is generated by at most 3 palindromes. Surprisingly, these words are related to another much-studied set of words, the Sturmian words.
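
To make the k-abelian notion concrete, here is a small sketch using the definition stated above (equal numbers of occurrences of every factor of length at most k); the example words are chosen for illustration only.

```python
from collections import Counter

def factor_counts(word: str, k: int) -> Counter:
    """Count occurrences of every factor of length 1..k in `word`."""
    counts = Counter()
    for length in range(1, k + 1):
        for i in range(len(word) - length + 1):
            counts[word[i:i + length]] += 1
    return counts

def k_abelian_equivalent(u: str, v: str, k: int) -> bool:
    """Two words are k-abelian equivalent if they contain each factor of
    length at most k equally many times (the definition used above)."""
    return factor_counts(u, k) == factor_counts(v, k)

# 1-abelian equivalence is ordinary abelian equivalence (same letter counts)
print(k_abelian_equivalent("abba", "baab", 1))    # True
print(k_abelian_equivalent("aabab", "abaab", 2))  # True: same factors up to length 2
print(k_abelian_equivalent("aabab", "abaab", 3))  # False: 'bab' vs 'baa'
```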

Relevance:

10.00%

Publisher:

Abstract:

The limiting dilution assay (LDA) is extensively used by immunologists for assessing the frequency of responders in a cell population. A series of studies addressing the statistical method of choice in an LDA have been published. However, none of these studies has addressed the question of how many wells should be employed in a given assay. The objective of this study was to demonstrate how a researcher can predict the number of wells that should be employed in order to obtain results with a given accuracy, and therefore to help in choosing an experimental design that better fulfills one's expectations. We present the rationale underlying the computation of the expected relative error based on simple binomial distributions. A series of simulated in machina experiments were performed to test the validity of the a priori computation of expected errors, confirming the predictions. The step-by-step procedure of the relative error estimation is given. We also discuss the constraints under which an LDA must be performed.
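
As a hedged illustration of the binomial reasoning described above (not the paper's exact derivation), the expected relative error of the estimated fraction of positive wells can be approximated analytically and checked by simulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_relative_error(p_positive: float, n_wells: int) -> float:
    """Analytic binomial approximation: SE(p_hat) / p for the estimated
    fraction of positive wells."""
    return np.sqrt((1.0 - p_positive) / (n_wells * p_positive))

def simulated_relative_error(p_positive: float, n_wells: int,
                             n_experiments: int = 20000) -> float:
    """'In machina' check: simulate repeated assays and measure the spread
    of the estimated fraction of positive wells."""
    positives = rng.binomial(n_wells, p_positive, size=n_experiments)
    p_hat = positives / n_wells
    return p_hat.std() / p_positive

for n in (24, 96, 384):  # common plate sizes, used here only as examples
    print(n, expected_relative_error(0.3, n), simulated_relative_error(0.3, n))
```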

Relevance:

10.00%

Publisher:

Abstract:

The shift towards a knowledge-based economy has inevitably prompted the evolution of patent exploitation. Nowadays, a patent is more than just a preventive tool for a company to block its competitors from developing rival technologies; it lies at the very heart of the company's strategy for value creation and is therefore strategically exploited for economic profit and competitive advantage. Along with the evolution of patent exploitation, the demand for reliable and systematic patent valuation has also reached an unprecedented level. However, most of the quantitative approaches in use to assess patents arguably fall into four categories, all of them based solely on conventional discounted cash flow analysis, whose usability and reliability in the context of patent valuation are greatly limited by five practical issues: market illiquidity, poor data availability, discriminatory cash-flow estimation, and the inability to account for changing risk and for managerial flexibility. This dissertation attempts to overcome these barriers by rationalizing the use of two techniques, namely fuzzy set theory (aimed at the first three issues) and real option analysis (aimed at the last two). It commences with an investigation into the nature of the uncertainties inherent in patent cash flow estimation and claims that two levels of uncertainty must be properly accounted for. Further investigation reveals that both levels of uncertainty fall under the heading of subjective uncertainty, which differs from objective uncertainty originating from inherent randomness: uncertainties labelled as subjective are closely related to the behavioural aspects of decision making and arise whenever human judgement, evaluation or reasoning is crucial to the system under consideration and complete knowledge of its variables is lacking. Once their nature is clarified, the application of fuzzy set theory to modelling patent-related uncertain quantities is readily justified. The application of real option analysis to patent valuation is prompted by the fact that both the patent application process and the subsequent patent exploitation (or commercialization) are subject to a wide range of decisions at multiple successive stages. In other words, both patent applicants and patentees face a large variety of courses of action as to how their patent applications and granted patents can be managed. Since they have the right to manage their projects actively, this flexibility has value and must be properly accounted for. Accordingly, this dissertation provides an explicit identification of the types of managerial flexibility inherent in patent-related decision-making problems and in patent valuation, together with a discussion of how they can be interpreted in terms of real options. Additionally, the use of the proposed techniques in practical applications is demonstrated by three models based on fuzzy real option analysis. In particular, the pay-off method and the extended fuzzy Black-Scholes model are employed, respectively, to investigate the profitability of a patent application project for a new process for the preparation of a gypsum-fibre composite and to justify the subsequent patent commercialization decision; a fuzzy binomial model is designed to reveal the economic potential of a patent licensing opportunity.
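
For orientation, a crisp (non-fuzzy) Cox-Ross-Rubinstein binomial tree of the kind that underlies such a fuzzy binomial model is sketched below; the dissertation's fuzzy extension, which replaces the crisp inputs with fuzzy numbers, is not reproduced here, and the parameter values are purely illustrative.

```python
import math

def crr_binomial_call(s0, strike, rate, sigma, maturity, steps=200,
                      american=False):
    """Plain Cox-Ross-Rubinstein binomial valuation of a call option.
    This is only the crisp building block, not the fuzzy model itself."""
    dt = maturity / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    disc = math.exp(-rate * dt)
    q = (math.exp(rate * dt) - d) / (u - d)  # risk-neutral up probability

    # terminal payoffs
    values = [max(s0 * u**j * d**(steps - j) - strike, 0.0)
              for j in range(steps + 1)]
    # backward induction through the tree
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (q * values[j + 1] + (1 - q) * values[j])
            if american:
                exercise = s0 * u**j * d**(i - j) - strike
                cont = max(cont, exercise)
            values[j] = cont
    return values[0]

# Illustrative inputs only (hypothetical proxy for licensing cash flows)
print(crr_binomial_call(s0=100, strike=110, rate=0.03, sigma=0.4, maturity=2.0))
```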

Relevance:

10.00%

Publisher:

Abstract:

The distribution of psychiatric disorders and of chronic medical illnesses was studied in a population-based sample to determine whether these conditions co-occur in the same individual. A representative sample (N = 1464) of adults living in households was assessed with the Composite International Diagnostic Interview, version 1.1, as part of the São Paulo Epidemiological Catchment Area Study. The association of sociodemographic variables and psychological symptoms with medical illness multimorbidity (8 lifetime somatic conditions) and psychiatric multimorbidity (15 lifetime psychiatric disorders) was determined by negative binomial regression. A total of 1785 chronic medical conditions and 1163 psychiatric conditions were detected in the population, concentrated in 34.1% and 20% of respondents, respectively. Subjects reporting more psychiatric disorders had more medical illnesses. Characteristics such as age range (35-59 years, risk ratio (RR) = 1.3, and more than 60 years, RR = 1.7), being separated (RR = 1.2), being a student (protective effect, RR = 0.7), low educational level (RR = 1.2) and being psychologically distressed (RR = 1.1) were determinants of medical conditions. Age (35-59 years, RR = 1.2, and more than 60 years, RR = 0.5), being retired (RR = 2.5), and being psychologically distressed (females, RR = 1.5, and males, RR = 1.4) were determinants of psychiatric disorders. In conclusion, psychological distress and some sociodemographic features such as age, marital status, occupational status, educational level, and gender are associated with psychiatric and medical multimorbidity. The distribution of both types of morbidity suggests the need to integrate mental health into general clinical settings.
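
For readers unfamiliar with the modelling step, a minimal negative binomial regression of a morbidity count on a few covariates might be set up as below; the variable names and data are hypothetical placeholders, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data standing in for the survey variables; the real study
# used the São Paulo ECA sample, which is not reproduced here.
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age_group": rng.integers(0, 3, n),      # 0: 18-34, 1: 35-59, 2: 60+
    "low_education": rng.integers(0, 2, n),
    "distressed": rng.integers(0, 2, n),
})
linpred = (0.2 + 0.25 * df["age_group"]
           + 0.2 * df["low_education"] + 0.1 * df["distressed"])
df["n_conditions"] = rng.poisson(np.exp(linpred))  # synthetic count outcome

X = sm.add_constant(df[["age_group", "low_education", "distressed"]])
model = sm.GLM(df["n_conditions"], X,
               family=sm.families.NegativeBinomial(alpha=1.0))
result = model.fit()
print(np.exp(result.params))  # exponentiated coefficients read as risk ratios
```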

Relevance:

10.00%

Publisher:

Abstract:

Hepatitis E virus (HEV) is classified within the family Hepeviridae, genus Hepevirus. HEV genotype 3 (Gt3) infections are endemic in pigs in Western Europe and in North and South America and cause zoonotic infections in humans. Several serological assays to detect HEV antibodies in pigs have been developed, at first mainly based on HEV genotype 1 (Gt1) antigens. To develop a sensitive HEV Gt3 ELISA, a recombinant baculovirus expression product of HEV Gt3 open reading frame-2 was produced and coated onto polystyrene ELISA plates. After incubation of porcine sera, bound HEV antibodies were detected with anti-porcine anti-IgG and anti-IgM conjugates. For primary estimation of sensitivity and specificity of the assay, sets of sera were used from pigs experimentally infected with HEV Gt3. For further validation of the assay and to set the cutoff value, a batch of 1100 pig sera was used. All pig sera were tested using the developed HEV Gt3 assay and two other serologic assays based on HEV Gt1 antigens. Since there is no gold standard available for HEV antibody testing, further validation and a definite setting of the cutoff of the developed HEV Gt3 assay were performed using a statistical approach based on Bayes' theorem. The developed and validated HEV antibody assay showed effective detection of HEV-specific antibodies. This assay can contribute to an improved detection of HEV antibodies and enable more reliable estimates of the prevalence of HEV Gt3 in swine in different regions.
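
The paper's Bayesian validation is more elaborate, but the role of Bayes' theorem in interpreting a test without a gold standard can be illustrated by the familiar predictive-value calculation; the numbers below are illustrative, not the assay's validated figures.

```python
def positive_predictive_value(sensitivity: float, specificity: float,
                              prevalence: float) -> float:
    """Bayes' theorem applied to a diagnostic test:
    P(infected | test positive)."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Illustrative values only (not the HEV Gt3 assay's performance figures)
print(positive_predictive_value(sensitivity=0.95, specificity=0.90,
                                prevalence=0.30))
```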

Relevance:

10.00%

Publisher:

Abstract:

Aiming at improving the quality of Perna perna mussels cultivated and commercialized in Ubatuba, SP, Brazil, the growth and elimination of Staphylococcus aureus and Bacillus cereus artificially inoculated into mussels were studied. Inoculation was carried out for 30 min in raw ("in natura") and pre-cooked mussels, after which the mussels were kept for 10 hours at room temperature (25 ± 1 °C) or under refrigeration (7 ± 1 °C). Six thermal treatments were evaluated, three using steam (5, 10 and 15 minutes) and three using boiling water (5, 10 and 15 minutes), in order to find the time/temperature combination that best controls these pathogens. Yield and physical-chemical and sensory characteristics were evaluated. All thermal treatments were able to reduce the microorganisms by 2 logarithmic cycles. However, the boiling water treatments presented better results than the steam treatments. The physical-chemical and sensory analyses did not show statistical differences among the thermal treatments studied. The best performances were reached with the shortest heat exposure times. Overall, the treatments in boiling water presented better results than the steam treatments.

Relevance:

10.00%

Publisher:

Abstract:

Resilience is the property of a system to remain trustworthy despite changes. Changes of various natures, whether due to failures of system components or to varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification, as well as the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating techniques such as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect the overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare and the cloud domain.
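
As a flavour of the quantitative side, a toy SimPy discrete-event model of a component that occasionally fails and is then reconfigured might look like the sketch below; the rates and structure are hypothetical and not drawn from any of the case studies.

```python
import random
import simpy

FAIL_MEAN = 50.0      # hypothetical mean time between failures
REPAIR_TIME = 5.0     # hypothetical reconfiguration delay

def component(env, stats):
    """A component that alternates between operation and reconfiguration."""
    while True:
        uptime = random.expovariate(1.0 / FAIL_MEAN)
        yield env.timeout(uptime)          # operate until a failure occurs
        stats["failures"] += 1
        yield env.timeout(REPAIR_TIME)     # reconfigure / recover
        stats["downtime"] += REPAIR_TIME

random.seed(42)
stats = {"failures": 0, "downtime": 0.0}
env = simpy.Environment()
env.process(component(env, stats))
env.run(until=10_000)
print("failures:", stats["failures"],
      "availability:", 1 - stats["downtime"] / 10_000)
```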

Relevance:

10.00%

Publisher:

Abstract:

This study aimed to evaluate the effect of the distillation time and the sample mass on the total SO2 content in whole passion fruit juice (Passiflora sp.). For the SO2 analysis, a modified version of the Monier-Williams method was used. In this experiment, the distillation time and the sample mass were reduced to half of the values proposed in the original method. The analyses were performed in triplicate for each distillation time × sample mass combination, making a total of 12 tests, which were performed on the same day. The significance of the effects of the different distillation times and sample masses was evaluated by applying one-factor analysis of variance (ANOVA). At a 95% confidence level, it was found that the proposed amendments to the distillation time and the sample mass, as well as the interaction between distillation time and sample mass, were not significant (p > 0.05) in determining the SO2 content in passion fruit juice. In view of the results obtained, it was concluded that, for whole passion fruit juice, it is possible to reduce the distillation time and the sample mass when determining the SO2 content by the Monier-Williams method without affecting the result.
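
A significance test of this kind can be reproduced generically with a factorial ANOVA; the sketch below uses statsmodels with placeholder SO2 values (not the study's measurements) for the 2 × 2 design with triplicates.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical SO2 results (mg/kg) for the 2 x 2 design, 3 replicates each;
# these numbers are placeholders, not the study's measurements.
data = pd.DataFrame({
    "time": ["full"] * 6 + ["half"] * 6,
    "mass": (["full"] * 3 + ["half"] * 3) * 2,
    "so2":  [10.2, 10.5, 10.1, 10.4, 10.3, 10.6,
             10.0, 10.4, 10.2, 10.5, 10.1, 10.3],
})

model = ols("so2 ~ C(time) * C(mass)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))  # p > 0.05 would indicate no effect
```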

Relevance:

10.00%

Publisher:

Abstract:

The objective and originality of this paper lie in identifying Stiglitz's main theoretical contributions to Financial Economics, in briefly portraying the contemporary economic thought out of which these contributions emerged, and in suggesting their connections with subsequent economic thought. Grounded on a detailed analysis of Stiglitz's works on finance, his most important theoretical findings are singled out and gathered into four issues: (1) the conditions under which the Modigliani-Miller theorem is valid; (2) the inconsistency inherent in the efficient market hypothesis; (3) the microeconomic effects of asymmetric information in financial markets; and (4) its real macroeconomic effects. In all of these topics, the focal point of Stiglitz's theoretical research is the unrealistic underpinnings on which the Arrow-Debreu competitive equilibrium model relies. It is also emphasised that the same perspective he coherently followed to construct a fully fledged theoretical framework is preserved in his empirical investigations, notably on developing countries, on which he has concentrated his efforts since the beginning of the 1990s.

Relevance:

10.00%

Publisher:

Abstract:

In this paper we extend Kaldor's Neo-Pasinetti theorem to the scope of budgetary interventions based on political orientations. First, we take into account a system of taxes and expenditures. Second, we introduce different reaction functions for public spending, showing the political role of the State in the Cambridge theory of distribution. It turns out that the validity of the Kaldorian results depends on the political orientation adopted by the government, which narrows the range of application of the Neo-Pasinetti theorem.

Relevance:

10.00%

Publisher:

Abstract:

Confocal and two-photon microscopy have become essential tools in biological research, and today many investigations are not possible without their help. The valuable advantage that these two techniques offer is the ability of optical sectioning. Optical sectioning makes it possible to obtain 3D visualization of structures and, hence, valuable information on the structural relationships and the geometrical and morphological aspects of the specimen. The lateral and axial resolutions achievable by confocal and two-photon microscopy, as in other optical imaging systems, are both defined by the diffraction theorem. Any aberration or imperfection present during imaging results in broadening of the calculated theoretical resolution, blurring, and geometrical distortions in the acquired images that interfere with the analysis of the structures, and lowers the fluorescence collected from the specimen. The aberrations may have different causes and can be classified by their sources: specimen-induced aberrations, optics-induced aberrations, illumination aberrations, and misalignment aberrations. This thesis presents an investigation and study of image enhancement. The goal of this thesis was approached in two different directions. Initially, we investigated the sources of the imperfections. We propose methods to eliminate or minimize aberrations introduced during image acquisition by optimizing the acquisition conditions. The impact on resolution of using a coverslip whose thickness is mismatched with the one the objective lens is designed for was shown, and a novel technique was introduced to set the proper value on the correction collar of the lens. The amount of spherical aberration with regard to the numerical aperture of the objective lens was investigated, and it was shown that, depending on the purpose of our imaging tasks, different numerical apertures must be used. The deformed beam cross section of the single-photon excitation source was corrected, and the resulting enhancement of resolution and image quality was shown. Furthermore, the dependency of the scattered light on the excitation wavelength was shown empirically. In the second part, we continued the study of the image enhancement process with deconvolution techniques. Although deconvolution algorithms are widely used to improve the quality of images, how well a deconvolution algorithm performs depends strongly on the point spread function (PSF) of the imaging system applied to the algorithm and on the level of its accuracy. We investigated approaches that can be taken to obtain a more precise PSF. Novel methods to improve the pattern of the PSF and reduce the noise are proposed. Furthermore, multiple sources for extracting the PSFs of the imaging system are introduced, and the empirical deconvolution results obtained by using each of these PSFs are compared. The results confirm that a greater improvement is attained by applying the in situ PSF during the deconvolution process.
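
The abstract does not name a specific algorithm, but the PSF-dependence it emphasises shows up in any standard iterative deconvolution; below is a minimal Richardson-Lucy sketch (an illustrative stand-in, not the thesis's procedure) in which the quality of the supplied psf directly drives the restoration.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=30):
    """Minimal Richardson-Lucy deconvolution; the outcome depends strongly
    on how well `psf` matches the true (ideally in situ) PSF."""
    eps = 1e-12
    image = np.maximum(image, 0.0)
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same") + eps
        ratio = image / blurred
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: blur a synthetic point source with a Gaussian PSF, then restore
x, y = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / 4.0)
psf /= psf.sum()
truth = np.zeros((64, 64))
truth[32, 32] = 1.0
blurred = fftconvolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf)
print(float(restored.max()), float(blurred.max()))  # restored peak is sharper
```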

Relevance:

10.00%

Publisher:

Abstract:

The frequency dependence of the electron-spin fluctuation spectrum, P(Ω), is calculated in the finite bandwidth model. We find that for Pd, which has a nearly full d-band, the magnitude, the range, and the peak frequency of P(Ω) are greatly reduced from those in the standard spin fluctuation theory. The electron self-energy due to spin fluctuations is calculated within the finite bandwidth model. Vertex corrections are examined, and we find that Migdal's theorem is valid for spin fluctuations in the nearly full band. The conductance of a normal metal-insulator-normal metal tunnel junction is examined when spin fluctuations are present in one electrode. We find that for the nearly full band, the momentum-independent self-energy due to spin fluctuations enters the expression for the tunneling conductance with approximately the same weight as the self-energy due to phonons. The effect of spin fluctuations on the tunneling conductance is slight within the finite bandwidth model for Pd. The effect of spin fluctuations on the tunneling conductance of a metal with a less full d-band than Pd may be more pronounced. However, in this case the tunneling conductance is not simply proportional to the self-energy.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this thesis is to price options on equity index futures, with an application to standard options on S&P 500 futures traded on the Chicago Mercantile Exchange. Our methodology is based on stochastic dynamic programming, which can accommodate European as well as American options. The model handles dividends from the underlying asset and captures both the optimal exercise strategy and the fair value of the option. This approach is an alternative to available numerical pricing methods such as binomial trees, finite differences, and ad-hoc numerical approximation techniques. Our numerical and empirical investigations demonstrate convergence, robustness, and efficiency. We use this methodology to value exchange-listed options. The European option premiums thus obtained are compared to Black's closed-form formula and agree to four digits. The American option premiums show a similar level of accuracy compared to premiums obtained using finite differences and binomial trees with a large number of time steps. The proposed model accounts for a deterministic, seasonally varying dividend yield. In pricing futures options, we find that what matters is the sum of the dividend yields over the life of the futures contract and not their distribution.
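
For reference, the closed-form benchmark mentioned above is Black's (1976) formula for European options on futures; a compact sketch with purely illustrative inputs:

```python
import math
from statistics import NormalDist

def black76_call(futures, strike, rate, sigma, maturity):
    """Black (1976) price of a European call on a futures contract."""
    n = NormalDist().cdf
    d1 = (math.log(futures / strike) + 0.5 * sigma**2 * maturity) / (
        sigma * math.sqrt(maturity))
    d2 = d1 - sigma * math.sqrt(maturity)
    return math.exp(-rate * maturity) * (futures * n(d1) - strike * n(d2))

# Illustrative S&P 500 futures option inputs, not empirical CME data
print(black76_call(futures=4500.0, strike=4600.0, rate=0.03,
                   sigma=0.18, maturity=0.5))
```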