915 results for Hilbert schemes of points Poincaré polynomial Betti numbers Goettsche formula
Abstract:
Economic evaluation in health care is the comparative analysis of alternative services with respect to both their costs and their consequences. It is a decision-support tool. The vast majority of resource-allocation decisions are made in clinical settings, particularly in primary care. Since every decision carries an opportunity cost, failing to take economic considerations into account in family physicians' practices can have a major impact on the efficiency of the health care system. Little is known about the influence of economic evaluations on clinical practice. The purpose of this thesis is to understand the role of economic evaluation in family physicians' practice. Its contributions are presented in four original articles (philosophical, theoretical, methodological and empirical). The philosophical article highlights the importance of questions of complexity and reflexivity in economic evaluation. Complexity is the philosophical perspective (general epistemological approach) underlying the thesis. This worldview focuses attention on explanation and understanding, and on relations and interactions (interactive causality). This emphasis on context and on the process of data production underscores the importance of reflexivity in the research process. The theoretical article develops a new and different conception of the research problem. The originality of the thesis also lies in its approach, which draws on the perspective of Pierre Bourdieu's sociological theory, a theoretical approach consistent with complexity.
In opposition to individualist models of rational action, Bourdieu advocates a sociological approach that seeks a fuller and more complex understanding of social phenomena by bringing to light the often implicit influences that exert daily pressure on individuals and their practices. The methodological article presents the protocol of a qualitative multiple-case study with nested levels of analysis: family physicians (micro-individual level) and the field of family medicine (macro-structural level). Eight case studies were conducted, with the family physician as the primary unit of analysis. At the micro level, data were collected through life-history interviews, documents and observation. At the macro level, data were collected through documents and semi-structured interviews with eight key informants from nine medical organizations. Analytic induction was used. The empirical article presents the empirical results of the thesis as a whole. The results show a growing integration of economic concepts into the official discourse of family medicine organizations. At the level of practice, however, this economization of discourse does not appear to be a faithful representation of reality, since the vast majority of participants do not embody it. Contributions include an in-depth understanding of the social processes that shape family physicians' schemes of perception, thought, appreciation and action regarding the role of economic evaluation in clinical practice, and family physicians' willingness to contribute to an efficient, equitable and legitimate allocation of resources.
Abstract:
The great potential for the culture of non-penaeid prawns, especially Macrobrachium rosenbergii, in the brackish and low-saline areas of the Indian coastal zone has not yet been fully exploited, owing to the non-availability of healthy seed in adequate numbers at the appropriate time. In spite of the setting up of several prawn hatcheries around the country to satiate the ever-growing demand for seed of the giant freshwater prawn, supply still remains far below requirement, mainly because of mortality of the larvae at different stages of the larval cycle. In a larval rearing system of Macrobrachium rosenbergii, members of the family Vibrionaceae were found to be the dominant flora, and this was especially pronounced during times of mortality. However, to develop any sort of prophylactic and therapeutic measures, the pathogenic strains have to be segregated from the lot. This would never be possible unless they were clustered based on the principles of numerical taxonomy. It is with these objectives and requirements that the present work was carried out, involving phenotypic characterization of the isolates belonging to the family Vibrionaceae, working out their numerical taxonomy, determination of the mole % G+C ratio, segregation of the pathogenic strains, and screening of antibiotics as therapeutics for times of emergency.
Abstract:
In this work, we have mainly achieved the following: 1. we provide a review of the main methods used for the computation of the connection and linearization coefficients between orthogonal polynomials of a continuous variable and, using a new approach, solve the duplication problem for these polynomial families; 2. we review the main methods used for the computation of the connection and linearization coefficients of orthogonal polynomials of a discrete variable, and solve the duplication and linearization problems for all orthogonal polynomials of a discrete variable; 3. we propose a method to generate the connection, linearization and duplication coefficients for q-orthogonal polynomials; 4. we propose a unified method to obtain these coefficients in a generic way for orthogonal polynomials on quadratic and q-quadratic lattices. Our algorithmic approach to computing linearization, connection and duplication coefficients is based on the one used by Koepf and Schmersau and on the NaViMa algorithm. Our main technique is to use explicit formulas for structural identities of classical orthogonal polynomial systems. We obtain our results by an application of computer algebra. The major algorithmic tools for our development are Zeilberger's algorithm, the q-Zeilberger algorithm, the Petkovšek-van Hoeij algorithm, the q-Petkovšek-van Hoeij algorithm, and Algorithm 2.2, p. 20 of Koepf's book "Hypergeometric Summation" and its q-analogue.
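As a minimal illustration of the connection problem this abstract refers to (my own sketch in SymPy, not the authors' algorithm; the function name is mine), one can expand a monomial in the Legendre basis using orthogonality on [-1, 1], where the connection coefficients are c_k = (2k+1)/2 ∫ x^n P_k(x) dx:

```python
# Hypothetical illustration of a connection problem: expand x**n in the
# Legendre basis P_k via the orthogonality relation on [-1, 1].
import sympy as sp

x = sp.symbols('x')

def legendre_connection(n):
    """Return [c_0, ..., c_n] with x**n = sum_k c_k * P_k(x)."""
    return [sp.Rational(2 * k + 1, 2)
            * sp.integrate(x**n * sp.legendre(k, x), (x, -1, 1))
            for k in range(n + 1)]

coeffs = legendre_connection(3)
# x**3 = (3/5) P_1(x) + (2/5) P_3(x)
```

The structural-identity and hypergeometric approaches described in the abstract compute such coefficients in closed form rather than by integration; this brute-force version only shows what the coefficients mean.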
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined, and the probability density would then represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
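The finite-dimensional Aitchison geometry mentioned above can be sketched numerically (a hypothetical illustration; the function names are mine): the Aitchison distance between two compositions is the ordinary Euclidean distance between their centred log-ratio (clr) transforms, which is what supplies the simplex with its Euclidean structure.

```python
# Sketch of the Aitchison geometry on the simplex: the clr transform maps a
# composition to a vector whose Euclidean distance is the Aitchison distance.
import math

def clr(x):
    """Centred log-ratio transform: log of each part over the geometric mean."""
    g = math.exp(sum(math.log(v) for v in x) / len(x))
    return [math.log(v / g) for v in x]

def aitchison_distance(x, y):
    """Aitchison distance = Euclidean distance between clr transforms."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(clr(x), clr(y))))

d = aitchison_distance([0.2, 0.3, 0.5], [0.1, 0.4, 0.5])
```

Because clr is invariant under rescaling of a composition, the distance depends only on the ratios of the parts, consistent with treating densities up to normalization.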
Abstract:
Exam questions and solutions in LaTeX. Diagrams for the questions are all together in the support.zip file, as .eps files.
Abstract:
Exam questions and solutions in PDF
Abstract:
The response of a uniform horizontal temperature gradient to prescribed fixed heating is calculated in the context of an extended version of surface quasigeostrophic dynamics. It is found that for zero mean surface flow and weak cross-gradient structure the prescribed heating induces a mean temperature anomaly proportional to the spatial Hilbert transform of the heating. The interior potential vorticity generated by the heating enhances this surface response. The time-varying part is independent of the heating and satisfies the usual linearized surface quasigeostrophic dynamics. It is shown that the surface temperature tendency is a spatial Hilbert transform of the temperature anomaly itself. It then follows that the temperature anomaly is periodically modulated with a frequency proportional to the vertical wind shear. A strong local bound on wave energy is also found. Reanalysis diagnostics are presented that indicate consistency with key findings from this theory.
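The role the spatial Hilbert transform plays above can be checked numerically (my own sketch using SciPy, not the paper's diagnostics): the Hilbert transform shifts each Fourier mode by 90°, mapping cos(kx) to sin(kx), which is why a tendency equal to the Hilbert transform of the anomaly produces a periodic phase modulation rather than growth.

```python
# Numerical check that the spatial Hilbert transform of cos(kx) is sin(kx).
import numpy as np
from scipy.signal import hilbert

x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
theta = np.cos(3 * x)               # a single-mode "temperature anomaly"
H_theta = np.imag(hilbert(theta))   # imaginary part of the analytic signal
err = np.max(np.abs(H_theta - np.sin(3 * x)))
```

Iterating this 90° rotation returns the original field with opposite sign after two steps, the hallmark of the oscillatory modulation described in the abstract.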
Abstract:
The technique of rapid acidification and alkylation can be used to characterise the redox status of oxidoreductases, and to determine the number of free cysteine residues within substrate proteins. We have previously used this method to analyse interacting components of the MHC class I pathway, namely ERp57 and tapasin. Here, we have applied rapid acidification and alkylation as a novel approach to analysing the redox status of MHC class I molecules. This analysis of the redox status of the MHC class I molecules HLA-A2 and HLA-B27 (the latter strongly associated with a group of inflammatory arthritic disorders referred to as spondyloarthropathies) revealed structural and conformational information. We propose that this assay provides a useful tool in the study of in vivo MHC class I structure. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
The concept of an organism's niche is central to ecological theory, but an operational definition is needed that allows both its experimental delineation and interpretation of field distributions of the species. Here we use population growth rate (hereafter, pgr) to define the niche as the set of points in niche space where pgr > 0. If there are just two axes to the niche space, their relationship to pgr can be pictured as a contour map in which pgr varies along the axes in the same way that the height of land above sea level varies with latitude and longitude. In laboratory experiments we measured the pgr of Daphnia magna over a grid of values of pH and Ca2+, and so defined its "laboratory niche" in pH-Ca2+ space. The position of the laboratory niche boundary suggests that population persistence is only possible above 0.5 mg Ca2+/L and between pH 5.75 and pH 9, though more Ca2+ is needed at lower pH values. To see how well the measured niche predicts the field distribution of D. magna, we examined relevant field data from 422 sites in England and Wales. Of the 58 colonized water bodies, 56 lay within the laboratory niche. Very few of the sites near the niche boundary were colonized, probably because pgr there is so low that populations are vulnerable to extinction by other factors. Our study shows how the niche can be quantified and used to predict field distributions successfully.
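The operational definition used above, the niche as the set of points where pgr > 0, can be sketched with a toy response surface (the numbers below are entirely hypothetical and are not the fitted Daphnia surface; the function name is mine):

```python
# Toy sketch: classify grid points in (pH, Ca2+) space as inside or outside
# the niche according to the sign of a hypothetical growth rate.
def toy_pgr(ph, ca):
    """Hypothetical pgr sign: positive inside a pH window given enough Ca2+,
    with the Ca2+ requirement rising at low pH (qualitatively as in the text)."""
    ph_ok = 5.75 < ph < 9.0
    ca_ok = ca > 0.5 + max(0.0, 6.5 - ph)  # more Ca2+ needed at low pH
    return 1.0 if (ph_ok and ca_ok) else -1.0

grid = [(p / 4, c / 2) for p in range(16, 40) for c in range(0, 20)]
niche = [(ph, ca) for ph, ca in grid if toy_pgr(ph, ca) > 0]
```

Plotting `niche` against the full grid would reproduce the contour-map picture: the niche boundary is simply the zero contour of pgr.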
Abstract:
Here we describe results which teach us much about the mechanism of the reduction and oxidation of TiO2(110) by the application of scanning tunnelling microscopy imaging at high temperatures. Titania reduces at high temperature by thermal oxygen loss to leave localized (i.e. Ti3+) and delocalized electrons on the lattice Ti, and a reduced titania interstitial that diffuses into the bulk of the crystal. The interstitial titania can be recalled to the surface by treatment in very low pressures of oxygen, occurring at a significant rate even at 573 K. This re-oxidation occurs by re-growth of titania layers in a Volmer-Weber manner, by a repeating sequence in which in-growth of extra titania within the cross-linked (1 x 2) structure completes the (1 x 1) bulk termination. The next layer then initiates with the nucleation of points and strings which extend to form islands of cross-linked (1 x 2), which once again grow and fill in to reform the (1 x 1). This process continues in a cyclical manner to form many new layers of well-ordered titania. The details of the mechanism and kinetics of the process are considered.
Abstract:
Purpose – For many academics in UK universities the nature and orientation of their research is overwhelmingly determined by considerations of how that work will be graded in research assessment exercises (RAEs). The grades awarded to work in a particular subject area can have a considerable impact on individuals and their universities. There is a need to better understand the factors that may influence these grades; the paper seeks to address this issue.
Design/methodology/approach – The paper considers relationships between the grades awarded and the quantitative information provided to the assessment panels for the 1996 and 2001 RAEs for two subject areas, built environment and town and country planning, and for three other subject areas, civil engineering, geography and archaeology, in the 2001 RAE.
Findings – A simple model demonstrating strong and consistent relationships is established. RAE performance relates to numbers of research-active staff, the production of books and journal papers, numbers of research studentships and graduations, and research income. Important differences between subject areas are identified.
Research limitations/implications – Important issues are raised about the extent to which the new assessment methodology to be adopted for the 2008 RAE will capture the essence of good-quality research in architecture and the built environment.
Originality/value – The findings provide a developmental perspective on RAEs and show how, despite a changed methodology, various research activities might be valued in the 2008 RAE. The basis for a methodology for reviewing the credibility of panels' judgements is proposed.
Abstract:
Quality control on fruits requires reliable methods, able to assess their physical and chemical characteristics with reasonable accuracy and, ideally, in a non-destructive way. More specifically, decreased firmness indicates the presence of damage or defects in the fruit, or that the fruit has exceeded its "best before date" and become unsuitable for consumption. In high-value exotic fruits, such as mangoes, where firmness cannot easily be judged from simple observation of texture, colour changes and unevenness of the fruit surface, the use of non-destructive techniques is highly advisable. In particular, laser vibrometry, based on the Doppler effect, a non-contact technique sensitive to displacements smaller than a nanometre, appears ideal for possible on-line control of food. Previous results indicated that a phase shift can be repeatably associated with the presence of damage on the fruit, whilst decreased firmness results in significant differences in the displacement of the fruits under the same excitation signal. In this work, frequency ranges for quality control via the application of a sound chirp are suggested, based on measurement of the signal coherence. The variations of the average vibration spectrum over a grid of points, or of the point-by-point signal velocity, allow the go/no-go recognition of "firm" and "over-ripe" fruits, with notable success in the particular case of mangoes. Future exploitation of this work will include applying the method to on-line control during conveyor-belt distribution of fruits.
Abstract:
The effect of pH and substrate dose on the fermentation profile of a number of commercial prebiotics was analysed in triplicate using stirred, pH and temperature controlled anaerobic batch culture fermentations, inoculated with a fresh faecal slurry from one of three healthy volunteers. Bacterial numbers were enumerated using fluorescence in situ hybridisation. The commercial prebiotics investigated were fructooligosaccharides (FOS), inulin, galactooligosaccharides (GOS), isomaltooligosaccharides (IMO) and lactulose. Two pH values were investigated, i.e. pH 6 and 6.8. Doses of 1% and 2% (w/v) were investigated, equivalent to approximately 4 and 8 g per day, respectively, in an adult diet. It was found that both pH and dose altered the bacterial composition. It was observed that FOS and inulin demonstrated the greatest bifidogenic effect at pH 6.8 and 1% (w/v) carbohydrate, whereas GOS, IMO and lactulose demonstrated their greatest bifidogenic effect at pH 6 and 2% (w/v) carbohydrate. From this we can conclude that various prebiotics demonstrate differing bifidogenic effects at different conditions in vitro. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
A method is described for the analysis of deuterated and undeuterated alpha-tocopherol in blood components using liquid chromatography coupled to an orthogonal acceleration time-of-flight (TOF) mass spectrometer. Optimal ionisation conditions for undeuterated (d0) and tri- and hexadeuterated (d3 or d6) alpha-tocopherol standards were found with negative ion mode electrospray ionisation. Each species produced an isotopically resolved single ion of exact mass. Calibration curves of pure standards were linear in the range tested (0-1.5 µM, 0-15 pmol injected). For quantification of d0 and d6 in blood components following a standard solvent extraction, a stable-isotope-labelled internal standard (d3-alpha-tocopherol) was employed. To counter matrix ion suppression effects, standard response curves were generated following identical solvent extraction procedures to those of the samples. Within-day and between-day precision were determined for quantification of d0- and d6-labelled alpha-tocopherol in each blood component, and both averaged 3-10%. Accuracy was assessed by comparison with a standard high-performance liquid chromatography (HPLC) method, achieving good correlation (r² = 0.94), and by spiking with known concentrations of alpha-tocopherol (98% accuracy). Limits of detection and quantification were determined to be 5 and 50 fmol injected, respectively. The assay was used to measure the appearance and disappearance of deuterium-labelled alpha-tocopherol in human blood components following deuterium-labelled (d6) RRR-alpha-tocopheryl acetate ingestion. The new LC/TOFMS method was found to be sensitive, required small sample volumes, was reproducible and robust, and was capable of high throughput when large numbers of samples were generated. Copyright (C) 2003 John Wiley & Sons, Ltd.
Abstract:
This paper describes a method for reconstructing 3D frontier points, contour generators and surfaces of anatomical objects or smooth surfaces from a small number, e.g. 10, of conventional 2D X-ray images. The X-ray images are taken at different viewing directions with full prior knowledge of the X-ray source and sensor configurations. Unlike previous works, we empirically demonstrate that if the viewing directions are uniformly distributed around the object's viewing sphere, then the reconstructed 3D points automatically cluster closely on highly curved parts of the surface and are widely spread on smooth or flat parts. The advantage of this property is that the reconstructed points along a surface or a contour generator are not under-sampled or under-represented, because surfaces and contours should be sampled more densely where their curvature is high. The more complex the contour's shape, the greater the number of points required, and the greater the number of points automatically generated by the proposed method. Given that the number of viewing directions is fixed and the viewing directions are uniformly distributed, the number and distribution of the reconstructed points depend on the shape, or curvature, of the surface, regardless of the size of the surface or of the object. The technique may be used not only in medicine but also in industrial applications.