77 results for Set-Valued Functions
Abstract:
Field quantization in unstable optical systems is treated by expanding the vector potential in terms of non-Hermitean (Fox-Li) modes. We define non-Hermitean modes and their adjoints in both the cavity and external regions and make use of the important bi-orthogonality relationships that exist within each mode set. We employ a standard canonical quantization procedure involving the introduction of generalized coordinates and momenta for the electromagnetic (EM) field. Three-dimensional systems are treated, making use of the paraxial and monochromaticity approximations for the cavity non-Hermitean modes. We show that the quantum EM field is equivalent to a set of quantum harmonic oscillators (QHOs), associated with either the cavity or the external region non-Hermitean modes, thus confirming the validity of the photon model in unstable optical systems. Unlike in the conventional (Hermitean mode) case, the annihilation and creation operators we define for each QHO are not Hermitean adjoints. It is shown that the quantum Hamiltonian for the EM field is the sum of non-commuting cavity and external region contributions, each of which can be expressed as a sum of independent QHO Hamiltonians for each non-Hermitean mode, except that the external field Hamiltonian also includes a coupling term responsible for external non-Hermitean mode photon exchange processes. The non-commutativity of certain cavity and external region annihilation and creation operators is associated with cavity energy gain and loss processes, and may be described in terms of surface integrals involving cavity and external region non-Hermitean mode functions on the cavity-external region boundary. Using the essential states approach and the rotating wave approximation, our results are applied to the spontaneous decay of a two-level atom inside an unstable cavity.
We find that atomic transitions leading to cavity non-Hermitean mode photon absorption are associated with a different coupling constant from that for transitions leading to photon emission, a feature consequent on the use of non-Hermitean mode functions. We show that under certain conditions the spontaneous decay rate is enhanced by the Petermann factor.
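The bi-orthogonality relations invoked in this abstract take, for a non-Hermitean mode set {uₙ} and its adjoint set {uₙ‡}, the generic form below (a sketch in our notation, not the paper's specific normalization):

```latex
\int u_m^{\ddagger\,*}(\mathbf{r})\, u_n(\mathbf{r})\, d^{3}r = \delta_{mn},
```

in contrast with the ordinary orthogonality relation \(\int u_m^{*} u_n \, d^{3}r = \delta_{mn}\) that holds for a Hermitean mode set.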
Abstract:
The numerical implementation of the complex image approach for the Green's function of a mixed-potential integral-equation formulation is examined and is found to be limited to low values of k₀ρ (in this context k₀ρ = 2πρ/λ₀, where ρ is the distance between the source and the field points of the Green's function and λ₀ is the free-space wavelength). This is a clear limitation for problems of large dimension or high frequency, where this limit is easily exceeded. This paper examines the various strategies and proposes a hybrid method whereby most of the above problems can be avoided. An efficient integral method that is valid for large k₀ρ is combined with the complex image method in order to take advantage of the relative merits of both schemes. It is found that a wide overlapping region exists between the two techniques, allowing a very efficient and consistent approach for accurately calculating the Green's functions. In this paper, the method developed for the computation of the Green's function is used for planar structures containing both lossless and lossy media.
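The parameter governing which scheme applies can be computed directly from the abstract's definition; a minimal sketch (function name and example values are ours):

```python
import math

def k0_rho(rho, lambda0):
    """Normalized source-field separation k0*rho = 2*pi*rho / lambda0,
    the parameter that limits the complex image approach to small values."""
    return 2.0 * math.pi * rho / lambda0

# A separation of half a free-space wavelength gives k0*rho = pi
print(k0_rho(0.5, 1.0))  # -> 3.141592653589793
```

Large-dimension or high-frequency problems drive this quantity up, which is where the abstract's hybrid method hands over to the large-k₀ρ integral scheme.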
Abstract:
The neuropathological changes associated with Huntington's disease (HD) are most marked in the head of the caudate nucleus and, to a lesser extent, in the putamen and globus pallidus, suggesting that at least part of the language impairments found in patients with HD may result from non-thalamic subcortical (NTS) pathology. The present study aimed to test the hypothesis that a signature profile of impaired language functions is found in patients who have sustained damage to the non-thalamic subcortex, either focally induced or resulting from neurodegenerative pathology. The language abilities of a group of patients with Huntington's disease (n=13) were compared with those of an age- and education-matched group of patients with chronic NTS lesions following stroke (n=13) and a non-neurologically impaired control group (n=13). The three groups were compared on language tasks that assessed both primary and more complex language abilities. The primary language battery consisted of The Western Aphasia Battery and The Boston Naming Test, whilst the more complex cognitive-linguistic battery employed selected subtests from The Test of Language Competence-Expanded, The Test of Word Knowledge and The Word Test-Revised. On many of the tests of primary language function from the Western Aphasia Battery, both the HD and NTS participants performed in a similar manner to the control participants. The language performances of the HD participants were significantly more impaired (p<0.05 using modified Bonferroni adjustments) than the control group, however, on various lexico-semantic tasks (e.g. the Boston Naming Test and providing definitions), on both single-word and sentence-level generative tasks (e.g. category fluency and formulating sentences), and on tasks which required interpretation of ambiguous, figurative and inferential meaning.
The difficulties that patients with HD experienced with tasks assessing complex language abilities were strikingly similar, both qualitatively and quantitatively, to the language profile produced by NTS participants. The results provide evidence to suggest that a signature language profile is associated with damage to the non-thalamic subcortex resulting from either focal neurological insult or a degenerative disease.
Abstract:
This paper characterizes when a Delone set X in ℝⁿ is an ideal crystal in terms of restrictions on the number of its local patches of a given size or on the heterogeneity of their distribution. For a Delone set X, let N_X(T) count the number of translation-inequivalent patches of radius T in X, and let M_X(T) be the minimum radius such that every closed ball of radius M_X(T) contains the center of a patch of every one of these kinds. We show that for each of these functions there is a gap in the spectrum of possible growth rates between being bounded and having linear growth, and that having sufficiently slow linear growth is equivalent to X being an ideal crystal. Explicitly, for N_X(T): if R is the covering radius of X, then either N_X(T) is bounded or N_X(T) ≥ T/(2R) for all T > 0. The constant 1/(2R) in this bound is best possible in all dimensions. For M_X(T): either M_X(T) is bounded or M_X(T) ≥ T/3 for all T > 0. Examples show that the constant 1/3 in this bound cannot be replaced by any number exceeding 1/2. We also show that every aperiodic Delone set X has M_X(T) ≥ c_n T for all T > 0, for a certain constant c_n which depends on the dimension n of X and is > 1/3 when n > 1.
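The two growth-rate dichotomies stated in the abstract can be displayed side by side (same notation as the abstract):

```latex
\text{either } N_X(T) \text{ is bounded} \quad\text{or}\quad N_X(T) \ge \frac{T}{2R} \;\; \text{for all } T > 0;
\qquad
\text{either } M_X(T) \text{ is bounded} \quad\text{or}\quad M_X(T) \ge \frac{T}{3} \;\; \text{for all } T > 0.
```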
Abstract:
Existing refinement calculi provide frameworks for the stepwise development of imperative programs from specifications. This paper presents a refinement calculus for deriving logic programs. The calculus contains a wide-spectrum logic programming language, including executable constructs such as sequential conjunction, disjunction, and existential quantification, as well as specification constructs such as general predicates, assumptions and universal quantification. A declarative semantics is defined for this wide-spectrum language based on executions. Executions are partial functions from states to states, where a state is represented as a set of bindings. The semantics is used to define the meaning of programs and specifications, including parameters and recursion. To complete the calculus, a notion of correctness-preserving refinement over programs in the wide-spectrum language is defined and refinement laws for developing programs are introduced. The refinement calculus is illustrated using example derivations and prototype tool support is discussed.
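The semantic domain described above, executions as partial functions on states, with a state as a set of bindings, can be sketched in a few lines. This is a hypothetical illustration of the idea, not the paper's formal semantics: states are dictionaries, undefinedness is modeled by None, and sequential conjunction is composition of partial functions.

```python
from typing import Callable, Dict, Optional

State = Dict[str, object]                        # a state: a set of variable bindings
Execution = Callable[[State], Optional[State]]   # partial function; None = undefined

def seq(p: Execution, q: Execution) -> Execution:
    """Sequential conjunction as composition: run p, then q on its result."""
    def run(s: State) -> Optional[State]:
        t = p(s)
        return None if t is None else q(t)
    return run

# Two toy executions that each add one binding
bind_x: Execution = lambda s: {**s, "x": 1}
bind_y: Execution = lambda s: {**s, "y": 2}

print(seq(bind_x, bind_y)({}))  # -> {'x': 1, 'y': 2}
```

Disjunction and quantification would need a richer domain (sets of outcomes); the sketch only shows why partiality suffices to model assumptions that fail.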
Abstract:
Ab initio calculations have been performed to determine the energetics of oxygen atoms adsorbed onto graphene planes and the possible reaction path extracting carbon atoms in the form of carbon monoxide. From the energetics it is confirmed that this reaction path will not significantly contribute to the gasification of well-ordered carbonaceous chars. Modelling results which explore this limit are presented. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Objective: To study student and staff views of the role and use of handouts, note-taking and overhead transparencies in veterinary science lectures at the University of Queensland. Methods: The Nominal Group Technique was used to help develop a questionnaire, which was completed by 351 students (a response rate of 84%) and 35 staff (76%) from the 5 years of the veterinary course. The data were analysed using the SAS statistical computer package. Results: Staff and students held different views as to the frequency with which handouts should be used, their educational value, and whether they should be complete or partial. Fewer students than staff agreed that handouts discourage further reading in a subject. Almost all staff and students saw the central functions of note-taking to be provision of notes for subsequent revision and encoding information given by the lecturer. More students than staff, however, considered that note-taking in lectures interferes with understanding. Staff and students held similar views as to the uses of overheads in lectures. Interestingly, however, more staff than students agreed that overheads often contain too much information. Conclusion: Both students and staff saw the central role of note-taking as providing a set of good notes for revision. Generally students preferred that this information be provided in the form of partial or complete handouts, while staff preferred students to take notes and to read outside lectures. Surprisingly, more staff than students felt that overhead transparencies often contained too much information. Note-taking, handouts and overhead transparencies need to be linked in a coherent educational strategy to promote effective learning.
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
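For g competing failure types and a covariate vector x, the two model components named in the abstract can be sketched in standard notation (symbols ours; the paper's exact parameterization may differ):

```latex
\pi_i(\mathbf{x}) = \frac{\exp(\mathbf{x}^{\top}\boldsymbol{\beta}_i)}
                         {\sum_{j=1}^{g}\exp(\mathbf{x}^{\top}\boldsymbol{\beta}_j)},
\qquad
\lambda_i(t \mid \mathbf{x}) = \lambda_{0i}(t)\,\exp(\mathbf{x}^{\top}\boldsymbol{\gamma}_i),
```

where π_i(x) is the probability that failure is of type i (logistic/multinomial component) and λ_i(t | x) the conditional hazard (proportional hazards component); in the semi-parametric method the component-baseline hazards λ_{0i}(t) are left completely unspecified.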
Abstract:
Improvement in analysis and reporting results of osteoarthritis (OA) clinical trials has been recently obtained because of harmonization and standardization of the selection of outcome variables (OMERACT 3 and OARSI). Moreover, OARSI has recently proposed the OARSI responder criteria. This composite index permits presentation of results of symptom-modifying clinical trials in OA based on individual patient responses (responder yes/no). The two organizations (OMERACT and OARSI) established a task force aimed at evaluating: (1) the variability of observed placebo and active treatment effects using the OARSI responder criteria; and (2) the possibility of proposing a simplified set of criteria. The conclusions of the task force were presented and discussed during the OMERACT 6 conference, where a simplified set of responder criteria (OMERACT-OARSI set of criteria) was proposed.
Abstract:
Let X and Y be Hausdorff topological vector spaces, K a nonempty, closed, and convex subset of X, and C: K → 2^Y a point-to-set mapping such that for any x ∈ K, C(x) is a pointed, closed, and convex cone in Y with int C(x) ≠ ∅. Given a mapping g: K → K and a vector-valued bifunction f: K × K → Y, we consider the implicit vector equilibrium problem (IVEP) of finding x* ∈ K such that f(g(x*), y) ∉ −int C(x*) for all y ∈ K. This problem generalizes the (scalar) implicit equilibrium problem and the implicit variational inequality problem. We propose the dual of the implicit vector equilibrium problem (DIVEP) and establish the equivalence between (IVEP) and (DIVEP) under certain assumptions. Also, we give characterizations of the set of solutions for (IVEP) in the cases of nonmonotonicity, weak C-pseudomonotonicity, C-pseudomonotonicity, and strict C-pseudomonotonicity, respectively. Under these assumptions, we conclude that the sets of solutions are nonempty, closed, and convex. Finally, we give some applications of (IVEP) to vector variational inequality problems and vector optimization problems. (C) 2003 Elsevier Science Ltd. All rights reserved.
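In display form, the implicit vector equilibrium problem described above reads (with the cone evaluated at the solution point x*):

```latex
\text{find } x^{*} \in K \ \text{such that}\ f\big(g(x^{*}), y\big) \notin -\operatorname{int} C(x^{*}) \quad \text{for all } y \in K.
```

Taking Y = ℝ and C(x) = ℝ₊ recovers the scalar implicit equilibrium problem, which is the generalization the abstract refers to.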
Abstract:
This paper deals with an n-fold Weibull competing risk model. A characterisation of the WPP plot is given along with estimation of model parameters when modelling a given data set. These are illustrated through two examples. A study of the different possible shapes for the density and failure rate functions is also presented. (C) 2003 Elsevier Ltd. All rights reserved.
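In an n-fold Weibull competing risk model, the item fails at the earliest of n latent Weibull failure times, so the survival function is a product and the failure rate a sum; a standard sketch (notation ours, with η_i and β_i the scale and shape parameters of the i-th risk):

```latex
R(t) = \prod_{i=1}^{n} \exp\!\left[-\left(\frac{t}{\eta_i}\right)^{\beta_i}\right],
\qquad
h(t) = \sum_{i=1}^{n} \frac{\beta_i}{\eta_i}\left(\frac{t}{\eta_i}\right)^{\beta_i - 1}.
```

The mixing of shape parameters in h(t) is what produces the variety of density and failure-rate shapes the abstract studies.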
Abstract:
Signal peptides and transmembrane helices both contain a stretch of hydrophobic amino acids. This common feature makes it difficult for signal peptide and transmembrane helix predictors to correctly assign identity to stretches of hydrophobic residues near the N-terminal methionine of a protein sequence. The inability to reliably distinguish between N-terminal transmembrane helix and signal peptide is an error with serious consequences for the prediction of protein secretory status or transmembrane topology. In this study, we report a new method for differentiating protein N-terminal signal peptides and transmembrane helices. Based on the sequence features extracted from hydrophobic regions (amino acid frequency, hydrophobicity, and the start position), we set up discriminant functions and examined them on non-redundant datasets with jackknife tests. This method can incorporate other signal peptide prediction methods and achieve higher prediction accuracy. For Gram-negative bacterial proteins, 95.7% of N-terminal signal peptides and transmembrane helices can be correctly predicted (coefficient 0.90). Given a sensitivity of 90%, transmembrane helices can be identified from signal peptides with a precision of 99% (coefficient 0.92). For eukaryotic proteins, 94.2% of N-terminal signal peptides and transmembrane helices can be correctly predicted with coefficient 0.83. Given a sensitivity of 90%, transmembrane helices can be identified from signal peptides with a precision of 87% (coefficient 0.85). The method can be used to complement current transmembrane protein prediction and signal peptide prediction methods to improve their prediction accuracies. (C) 2003 Elsevier Inc. All rights reserved.
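A toy linear discriminant of the kind the abstract describes, built on the feature classes it names (hydrophobicity of the stretch and its start position), can be sketched as follows. All weights and the threshold are invented for illustration; the paper's actual discriminant functions are not reproduced here. The hydropathy values are from the standard Kyte-Doolittle scale.

```python
# Kyte-Doolittle hydropathy values for a handful of residues (real scale values)
KD = {"A": 1.8, "I": 4.5, "L": 3.8, "V": 4.2, "F": 2.8,
      "S": -0.8, "R": -4.5, "K": -3.9, "D": -3.5, "G": -0.4}

def mean_hydrophobicity(segment):
    """Average hydropathy of a hydrophobic stretch (unknown residues score 0)."""
    return sum(KD.get(a, 0.0) for a in segment) / len(segment)

def is_tm_helix(segment, start_pos, w_h=1.0, w_s=-0.02, bias=-2.5):
    """Toy discriminant: True = transmembrane helix, False = not (e.g. signal
    peptide).  Weights w_h, w_s and bias are hypothetical."""
    score = w_h * mean_hydrophobicity(segment) + w_s * start_pos + bias
    return score > 0.0

print(is_tm_helix("ILLVVAILLF", 30))  # strongly hydrophobic stretch -> True
print(is_tm_helix("SRKDG", 2))       # charged N-terminal stretch  -> False
```

A real predictor would combine such per-feature scores with the composition statistics and signal peptide predictors the abstract mentions; the sketch only illustrates the discriminant-function form.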
Abstract:
Cropp and Gabric [Ecosystem adaptation: do ecosystems maximise resilience? Ecology. In press] used a simple phytoplankton-zooplankton-nutrient model and a genetic algorithm to determine the parameter values that would maximize the value of certain goal functions. These goal functions were to maximize biomass, maximize flux, maximize the flux-to-biomass ratio, and maximize resilience. It was found that maximizing goal functions maximized resilience. The objective of this study was to investigate whether the Cropp and Gabric result was indicative of a general ecosystem principle, or peculiar to the model and parameter ranges used. This study successfully replicated the Cropp and Gabric experiment for a number of different model types; however, a different interpretation of the results is made. A new metric, concordance, was devised to describe the agreement between goal functions. It was found that resilience has the highest concordance of all goal functions trialled for most model types. This implies that resilience offers a compromise between the established ecological goal functions. The parameter value range used is found to affect the parameter versus goal function relationships. Local maxima and minima affected the relationship between parameters and goal functions, and between goal functions. (C) 2003 Elsevier B.V. All rights reserved.