880 results for one-to-many mapping
Abstract:
Gene expression data can provide a very rich source of information for elucidating biological function at the pathway level, provided the experimental design takes the needs of the statistical analysis methods into account. The purpose of this paper is to provide a comparative analysis of statistical methods for detecting differentially expressed pathways (DEP). In contrast to many other studies conducted so far, we use three novel simulation types that produce a more realistic correlation structure than previous simulation methods. This also includes the generation of surrogate data from two large-scale microarray experiments on prostate cancer and ALL. As a result of our comprehensive analysis of 41,004 parameter configurations, we find that each method should only be applied if certain conditions on the data from a pathway are met. Further, we provide method-specific estimates of the optimal sample size for microarray experiments aiming to identify DEP, in order to avoid an underpowered design. Our study highlights the sensitivity of the studied methods to the parameters of the system. © 2012 Tripathi and Emmert-Streib.
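The abstract does not name the individual methods compared, so as a minimal illustration of what a pathway-level differential expression test can look like, the sketch below scores a gene set by the mean absolute t-statistic of its genes and calibrates it with a label-permutation null. The statistic, the toy data and all names are assumptions for illustration, not the paper's methods.

```python
import numpy as np

rng = np.random.default_rng(0)

def pathway_stat(X, labels):
    """Score a pathway by the mean absolute two-sample t-statistic
    over its genes (rows of X; columns are samples)."""
    a, b = X[:, labels == 0], X[:, labels == 1]
    se = np.sqrt(a.var(axis=1, ddof=1) / a.shape[1]
                 + b.var(axis=1, ddof=1) / b.shape[1])
    return np.abs((a.mean(axis=1) - b.mean(axis=1)) / se).mean()

def permutation_pvalue(X, labels, n_perm=1000):
    """Calibrate the pathway score against label permutations."""
    observed = pathway_stat(X, labels)
    null = np.array([pathway_stat(X, rng.permutation(labels))
                     for _ in range(n_perm)])
    return (1 + np.sum(null >= observed)) / (1 + n_perm)

# Toy pathway: 20 genes x 12 samples, all genes shifted in condition 1
labels = np.array([0] * 6 + [1] * 6)
X = rng.normal(size=(20, 12))
X[:, labels == 1] += 1.0
p = permutation_pvalue(X, labels)
```

With the toy data the shifted pathway yields a small p-value; in practice the choice of statistic and null model is exactly where the compared methods differ.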
Abstract:
The management of water resources in Ireland prior to the Water Framework Directive (WFD) focused on surface water and groundwater as separate entities. A critical element of the successful implementation of the WFD is to improve our understanding of the interaction between the two, and of the flow mechanisms by which groundwater discharges to surface waters. An improved understanding of the contribution of groundwater to surface water is required for the classification of groundwater body status and the determination of groundwater quality thresholds. The results of the study will also have a wider application to many areas of the WFD.
A subcommittee of the WFD Groundwater Working Group (GWWG) has been formed to develop a methodology for estimating the groundwater contribution to Irish rivers. The group has selected a number of analytical techniques to quantify components of stream flow in an Irish context (Master Recession Curve, Unit Hydrograph, Flood Studies Report methodologies and hydrogeological analytical modelling). The components of stream flow that can be identified include deep groundwater, intermediate flow and overland flow. These analyses have been tested on seven pilot catchments covering a variety of hydrogeological settings and have been used to inform and constrain a mathematical model. The mathematical model used was the NAM (Nedbør-Afstrømnings-Model) rainfall-runoff model, a module of DHI's MIKE 11 modelling suite. The results from these pilot catchments have been used to develop a decision model, based on catchment descriptors from GIS datasets, for the selection of NAM parameters. The datasets used include the mapping of aquifers, vulnerability and subsoils, soils, the Digital Terrain Model, CORINE and lakes. The national coverage of the GIS datasets has allowed the extrapolation of the mathematical model to regional catchments across Ireland.
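The details of the group's selected separation techniques are not given in the abstract; as a generic illustration of splitting a hydrograph into quickflow (overland/intermediate) and baseflow components, the sketch below applies the standard one-parameter Lyne-Hollick digital filter to a synthetic hydrograph. The filter constant and the hydrograph are assumptions, not the pilot-catchment data or methodology.

```python
import numpy as np

def baseflow_separation(q, alpha=0.925):
    """One-parameter Lyne-Hollick digital filter: split a stream flow
    series q into quickflow (overland/intermediate) and baseflow."""
    q = np.asarray(q, dtype=float)
    qf = np.zeros_like(q)                       # filtered quickflow
    for i in range(1, len(q)):
        qf[i] = alpha * qf[i - 1] + (1 + alpha) / 2 * (q[i] - q[i - 1])
        qf[i] = min(max(qf[i], 0.0), q[i])      # keep components physical
    return q - qf, qf                           # (baseflow, quickflow)

# Synthetic daily hydrograph: rising limb to a storm peak, then recession
t = np.arange(30)
q = 2.0 + 8.0 * np.exp(-0.3 * np.abs(t - 5))    # discharge, m^3/s
bf, qf = baseflow_separation(q)
bfi = bf.sum() / q.sum()                        # baseflow index
```

The baseflow index (fraction of total flow attributed to baseflow) is one simple way such a separation can be summarised per catchment.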
Abstract:
The second harmonic generation (SHG) intensity spectrum of two-dimensional hexagonal SiC, ZnO and GaN crystals is calculated using a real-time first-principles approach based on Green's function theory [Attaccalite et al., Phys. Rev. B: Condens. Matter Mater. Phys., 2013, 88, 235113]. This approach allows one to go beyond the independent-particle description used in standard first-principles nonlinear optics calculations by including quasiparticle corrections (by means of the GW approximation), crystal local field effects and excitonic effects. Our results show that the SHG spectra obtained using the latter approach differ significantly from their independent-particle counterparts. In particular, they show strong excitonic resonances at which the SHG intensity is about two times stronger than within the independent-particle approximation. All the systems studied (whose stabilities have been predicted theoretically) are transparent and at the same time exhibit a remarkable SHG intensity in the range of frequencies at which Ti:sapphire and Nd:YAG lasers operate; thus they may be of interest for nanoscale nonlinear frequency conversion devices. Specifically, the SHG intensity at 800 nm (1.55 eV) ranges from about 40-80 pm V^-1 in ZnO and GaN to 0.6 nm V^-1 in SiC. The latter value in particular is one order of magnitude larger than values in standard nonlinear crystals.
Abstract:
BACKGROUND: Sleep-disordered breathing is a common and serious feature of many paediatric conditions and is a particular problem in children with Down syndrome. Overnight pulse oximetry is recommended as an initial screening test, but it is unclear how overnight oximetry results should be interpreted and how many nights should be recorded.
METHODS: This retrospective observational study evaluated night-to-night variation using statistical measures of repeatability for 214 children referred to a paediatric respiratory clinic, who required overnight oximetry measurements. This included 30 children with Down syndrome. We measured length of adequate trace, basal SpO2, number of desaturations (>4% SpO2 drop for >10 s) per hour ('adjusted index') and time with SpO2<90%. We classified oximetry traces into normal or abnormal based on physiology.
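A minimal sketch of how the 'adjusted index' described above (desaturations of >4% SpO2 for >10 s, per hour) could be computed from a raw SpO2 trace. A 1 Hz trace and a 90th-percentile baseline are assumptions of this sketch, not the study's exact definitions.

```python
import numpy as np

def adjusted_index(spo2, sample_interval_s=1.0, drop=4.0, min_dur_s=10.0):
    """Desaturation events (SpO2 at least `drop` % below baseline for at
    least `min_dur_s` seconds) per hour of recording. Using the trace's
    90th percentile as the baseline is an assumption for this sketch."""
    spo2 = np.asarray(spo2, dtype=float)
    below = spo2 <= np.percentile(spo2, 90) - drop
    min_samples = int(min_dur_s / sample_interval_s)
    events = run = 0
    for flag in below:
        run = run + 1 if flag else 0
        if run == min_samples:                  # count each dip once
            events += 1
    return events / (len(spo2) * sample_interval_s / 3600.0)

# One hour at 1 Hz: stable 97% with two 15-second dips to 91%
trace = np.full(3600, 97.0)
trace[600:615] = 91.0
trace[2000:2015] = 91.0
idx = adjusted_index(trace)
```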
RESULTS: 132 out of 214 (62%) children had three technically adequate nights' oximetry, including 13 out of 30 (43%) children with Down syndrome. The intraclass correlation coefficient for the adjusted index was 0.54 (95% CI 0.20 to 0.81) among children with Down syndrome and 0.88 (95% CI 0.84 to 0.91) among children with other diagnoses. The negative predictive value of a negative first night for predicting two subsequent negative nights was 0.2 in children with Down syndrome and 0.55 in children with other diagnoses.
CONCLUSIONS: There is substantial night-to-night variation in overnight oximetry readings among children in all clinical groups undergoing overnight oximetry. This is a more pronounced problem in children with Down syndrome. Increasing the number of attempted nights' recording from one to three provides useful additional clinical information.
Abstract:
Purpose: This paper explores the impact of academic scholarship on the development and practice of experienced managers. Design / Methodology: Semi-structured interviews with experienced managers, modelled on the critical incident technique. ‘Intertextuality’ and framework analysis technique are used to examine whether the use of academic scholarship is a sub-conscious phenomenon. Findings: Experienced managers make little direct use of academic scholarship, using it only occasionally to provide retrospective confirmation of decisions or a technique they can apply. However, academic scholarship informs their practice in an indirect way, their understanding of the ‘gist’ of scholarship comprising one of many sources which they synthesise and evaluate as part of their development process. Practical implications: Managers and management development practitioners should focus upon developing skills of synthesising the ‘gist’ of academic scholarship with other sources of data, rather than upon the detailed remembering, understanding and application of specific scholarship, and upon finding / providing the time and space for that ‘gisting’ and synthesis to take place. Originality / Value: The paper addresses contemporary concerns about the appropriateness of the material delivered on management education programmes for management development. It is original in doing this from the perspective of experienced managers, and in using intertextual analysis to reveal not only the direct but also the indirect use they make of such scholarship. The finding of the importance of understanding the ‘gist’ rather than the detail of academic theory represents a key conceptual innovation.
Abstract:
Dissertation presented to obtain a Ph.D. degree in Biochemistry by Instituto de Tecnologia Química e Biológica Universidade Nova de Lisboa.
Abstract:
Heterogeneous multicore platforms are becoming an attractive alternative for embedded computing systems with a limited power supply, as they can execute specific tasks in an efficient manner. Nonetheless, one of the main challenges of such platforms is optimising the energy consumption in the presence of temporal constraints. This paper addresses the problem of task-to-core allocation onto heterogeneous multicore platforms such that the overall energy consumption of the system is minimised. To this end, we propose a two-phase approach that considers both dynamic and leakage energy consumption: (i) the first phase allocates tasks to cores such that the dynamic energy consumption is reduced; (ii) the second phase refines the allocation performed in the first phase in order to achieve better sleep states, by trading off dynamic energy consumption against the reduction in leakage energy consumption. This hybrid approach considers core frequency set-points, task energy consumption and the sleep states of the cores to reduce the energy consumption of the system. Major value has been placed on a realistic power model, which increases the practical relevance of the proposed approach. Finally, extensive simulations have been carried out to demonstrate the effectiveness of the proposed algorithm. In the best case, energy savings of up to 18% are achieved over the first-fit algorithm, which has been shown in previous work to perform better than other bin-packing heuristics for the target heterogeneous multicore platform.
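A toy sketch of the two-phase idea, not the authors' algorithm: phase 1 greedily places each task on the feasible core with the lowest dynamic energy for that task; phase 2 tries to vacate lightly loaded cores so they can sleep whenever the leakage saving outweighs the extra dynamic energy. The capacity model, the energy numbers and the `leak_saving` constant are all assumptions for illustration.

```python
def allocate(tasks, cores, capacity=1.0, leak_saving=0.4):
    """tasks: [(name, load)]; cores: [(name, dynamic energy per unit load)].
    Returns {core: [(task, load), ...]} -- a toy two-phase allocation."""
    power = dict(cores)
    alloc = {c: [] for c in power}
    load = {c: 0.0 for c in power}

    # Phase 1: place each task on the feasible core with the lowest
    # dynamic energy for that task (largest tasks first).
    for tname, tload in sorted(tasks, key=lambda t: -t[1]):
        feasible = [c for c in power if load[c] + tload <= capacity]
        best = min(feasible, key=lambda c: tload * power[c])
        alloc[best].append((tname, tload))
        load[best] += tload

    # Phase 2: try to empty lightly loaded cores so they can sleep,
    # accepting a dynamic-energy increase smaller than the leakage saving.
    for src in sorted(power, key=lambda c: load[c]):
        if not alloc[src]:
            continue
        for dst in power:
            if dst == src or load[dst] + load[src] > capacity:
                continue
            extra = sum(l * (power[dst] - power[src]) for _, l in alloc[src])
            if extra < leak_saving:             # sleeping src pays off
                alloc[dst] += alloc[src]
                load[dst] += load[src]
                alloc[src], load[src] = [], 0.0
                break
    return alloc

tasks = [("t1", 0.5), ("t2", 0.3), ("t3", 0.1)]
cores = [("big", 2.0), ("little", 1.0)]
plan = allocate(tasks, cores)
```

The paper's approach additionally models frequency set-points and concrete sleep states; this sketch only captures the dynamic-versus-leakage trade-off that motivates the second phase.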
Abstract:
Quantum information theory studies the fundamental limits that the laws of physics impose on data-processing tasks such as compressing data and transmitting it over a noisy channel. This thesis presents general techniques that allow several fundamental problems of quantum information theory to be solved within a single common framework. The central theorem of this thesis establishes the existence of a protocol for transmitting quantum data that the receiver already partially knows, using a single instance of a noisy quantum channel. Several central theorems of quantum information theory follow from it as immediate corollaries. The subsequent chapters use this theorem to prove the existence of new protocols for two other types of quantum channels, namely quantum broadcast channels and quantum channels with side information at the transmitter. These protocols also deal with the transmission of quantum data partially known to the receiver using a single channel use, and yield as corollaries asymptotic versions with and without auxiliary entanglement. In both cases, the asymptotic versions with auxiliary entanglement can be regarded as quantum versions of the best coding theorems known for the classical versions of these problems. The final chapter deals with a purely quantum phenomenon called locking: a classical message can be encoded in a quantum state in such a way that, by removing a subsystem whose size is logarithmic in the total size, one can ensure that no measurement has any significant correlation with the message. The message is thus "locked" by a key of logarithmic size. This thesis presents the first locking protocol whose success criterion is that the trace distance between the joint distribution of the message and the measurement outcome and the product of their marginals be sufficiently small.
Abstract:
The aim of this thesis is to extend bootstrap theory to panel data models. Panel data are obtained by observing several statistical units over several time periods. Their double dimension, individual and temporal, makes it possible to control for unobservable heterogeneity across individuals and across time periods, and hence to carry out richer studies than with time series or cross-sectional data. The advantage of the bootstrap is that it can deliver inference more precise than that of classical asymptotic theory, or inference that would otherwise be impossible in the presence of nuisance parameters. The method consists of drawing random samples that resemble the analysis sample as closely as possible. The statistical object of interest is estimated on each of these random samples, and the set of estimated values is used for inference. The literature contains some applications of the bootstrap to panel data, but without rigorous theoretical justification or only under strong assumptions. This thesis proposes a bootstrap method better suited to panel data. Its three chapters analyse the method's validity and application. The first chapter posits a simple model with a single parameter and tackles the theoretical properties of the estimator of the mean. We show that the double resampling we propose, which accounts for both the individual and the temporal dimension, is valid in these models. Resampling in the individual dimension alone is not valid in the presence of temporal heterogeneity, and resampling in the temporal dimension alone is not valid in the presence of individual heterogeneity. The second chapter extends the first to the linear panel regression model.
Three types of regressors are considered: individual characteristics, temporal characteristics, and regressors that vary across both individuals and time. Using a two-way error components model, the ordinary least squares estimator and the residual bootstrap, we show that resampling in the individual dimension alone is valid for inference on the coefficients associated with regressors that vary only across individuals. Resampling in the temporal dimension alone is valid only for the subvector of parameters associated with regressors that vary only over time. Double resampling, for its part, is valid for inference on the whole parameter vector. The third chapter re-examines the difference-in-differences exercise of Bertrand, Duflo and Mullainathan (2004). This estimator is commonly used in the literature to evaluate the impact of public policies. The empirical exercise uses panel data from the Current Population Survey on women's wages in the 50 states of the United States from 1979 to 1999. Placebo state-level policy interventions are generated, and the tests are expected to conclude that these placebo policies have no effect on women's wages. Bertrand, Duflo and Mullainathan (2004) show that failing to account for heterogeneity and temporal dependence leads to serious test size distortions when evaluating the impact of public policies using panel data. One of the recommended remedies is the bootstrap. The double resampling method developed in this thesis corrects the test size problem and thus allows the impact of public policies to be evaluated correctly.
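The double resampling scheme described above can be sketched as follows: each bootstrap replication draws individuals and time periods with replacement independently, then recomputes the statistic on the crossed resample. This is a minimal illustration for the mean of a toy panel, not the thesis's formal construction; the data-generating values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def double_bootstrap_mean(panel, n_boot=2000):
    """panel: (N individuals x T periods). Each replication resamples
    individuals and time periods with replacement *independently*, then
    recomputes the overall mean on the crossed resample."""
    N, T = panel.shape
    stats = np.empty(n_boot)
    for b in range(n_boot):
        rows = rng.integers(0, N, size=N)       # resample individuals
        cols = rng.integers(0, T, size=T)       # resample time periods
        stats[b] = panel[np.ix_(rows, cols)].mean()
    return stats.mean(), stats.std(ddof=1)

# Toy panel with individual effects, time effects and noise
N, T = 50, 10
panel = (5.0 + rng.normal(size=(N, 1))          # individual heterogeneity
             + rng.normal(size=(1, T))          # temporal heterogeneity
             + 0.5 * rng.normal(size=(N, T)))   # idiosyncratic noise
boot_mean, boot_se = double_bootstrap_mean(panel)
```

Resampling only `rows` or only `cols` would correspond to the one-dimensional schemes the first chapter shows to be invalid under heterogeneity in the other dimension.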
Abstract:
Fish and fishery products occupy a unique place in the global food market due to their distinctive taste and flavour; moreover, the presence of easily digestible proteins, lipids, vitamins and minerals makes them a highly demanded food commodity. Fishery products constitute a major portion of international trade and are a valuable source of foreign exchange for many developing countries. Several new technologies are emerging to produce value-added food products; extrusion technology is one among them. A food extruder is a good choice for producing a wide variety of high-value products at low volume because of its versatility. Extruded products are shelf-stable at ambient temperature. Extrusion cooking is used in the manufacture of food products such as ready-to-eat breakfast cereals, expanded snacks, pasta, flat-bread, and soup and drink bases. The raw material, in the form of a powder at ambient temperature, is fed into the extruder at a known feeding rate. The material is first compacted, then softens and gelatinizes and/or melts to form a plasticized mass that flows downstream in the extruder channel; the final quality of the end products depends on the characteristics of the starch and protein ingredients in the cereals as affected by the extrusion process. The advantages of extrusion are that the process is thermodynamically efficient; the high-temperature, short-time treatment destroys bacteria and anti-nutritional factors; and the one-step cooking process minimizes wastage while destroying fat-hydrolysing enzymes and other enzymes associated with rancidity.
Abstract:
Private governance is currently being evoked as a viable solution to many public policy goals. However, in some circumstances it has been shown to produce more harm than good, and even disastrous consequences, as in the case of the financial crisis raging in most advanced economies. Although the current track record of private regulatory schemes is mixed, policy guidance documents around the world still require that policy-makers give priority to self- and co-regulation, with little or no additional guidance to help policy-makers determine when, and under what circumstances, these solutions can prove viable from a public policy perspective. With an array of examples from several policy fields, this paper approaches regulation as a public-private collaborative form and attempts to identify possible policy tools that public policy-makers can apply to approach private governance efficiently and effectively as a solution, rather than a problem. We propose a six-step theoretical framework and argue that IA techniques should: i) define an integrated framework that includes the possibility of private regulation being used either as an alternative or as a complement to public legislation; ii) involve private parties in public IAs in order to define the strategy or strategies that would best ensure achievement of the regulatory objectives; and iii) contemplate the deployment of indicators related to the governance and activities of the regulators and their ability to coordinate and resolve disputes with other regulators.
Abstract:
Purpose: Acquiring details of the kinetic parameters of enzymes is crucial to biochemical understanding, drug development and clinical diagnosis in ocular diseases. The correct design of an experiment is critical to collecting data suitable for analysis, modelling and deriving the correct information. As classical design methods are not targeted at the more complex kinetics now frequently studied, attention is needed to estimate the parameters of such models with low variance. Methods: We have developed Bayesian utility functions to minimise kinetic parameter variance, involving differentiation of model expressions and matrix inversion. These have been applied to the simple kinetics of the enzymes of the glyoxalase pathway (of importance in post-translational modification of proteins in cataract) and the complex kinetics of lens aldehyde dehydrogenase (also of relevance to cataract). Results: Our successful application of Bayesian statistics has allowed us to identify a set of rules for designing optimum kinetic experiments iteratively. Most importantly, the distribution of points in the range is critical; it is not simply a matter of even or multiple increases. At least 60% must be below the KM (or KMs, if there is more than one dissociation constant) and 40% above. This choice halves the variance found using a simple even spread across the range. With both the glyoxalase system and lens aldehyde dehydrogenase we have significantly improved the variance of kinetic parameter estimation while reducing the number and cost of experiments. Conclusions: We have developed an optimal, iterative method for selecting features of design such as substrate range, number of measurements and choice of intermediate points. Our novel approach minimises parameter error and costs, and maximises experimental efficiency. It is applicable to many areas of ocular drug design, including receptor-ligand binding and immunoglobulin binding, and should be an important tool in ocular drug discovery.
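The Bayesian utility functions themselves are not given in the abstract. As a rough illustration of why concentrating design points below the KM helps, the sketch below compares the Fisher-information lower bound on the variance of the KM estimate for an even spread of substrate concentrations against a design with about 60% of the points below KM, for a simple Michaelis-Menten model. The parameter values, ranges and error model are assumptions, not the paper's data.

```python
import numpy as np

def km_variance(S_points, Vmax, Km, sigma=1.0):
    """Fisher-information lower bound on Var(Km) for the Michaelis-Menten
    model v = Vmax*S/(Km+S) with i.i.d. Gaussian rate errors."""
    S = np.asarray(S_points, dtype=float)
    J = np.column_stack([S / (Km + S),                  # dv/dVmax
                         -Vmax * S / (Km + S) ** 2])    # dv/dKm
    cov = np.linalg.inv(J.T @ J / sigma**2)             # parameter covariance
    return cov[1, 1]

Vmax, Km = 100.0, 2.0
even = np.linspace(0.2, 10.0, 10)                       # even spread
skewed = np.concatenate([np.linspace(0.2, Km, 6),       # ~60% below KM
                         np.linspace(Km + 1.0, 10.0, 4)])
var_even = km_variance(even, Vmax, Km)
var_skewed = km_variance(skewed, Vmax, Km)
```

With these assumed values the skewed design gives a noticeably smaller KM variance than the even spread, in line with the 60/40 rule stated above.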
Abstract:
Understanding the onset of coronal mass ejections (CMEs) is surely one of the holy grails of solar physics today. Inspection of data from the Heliospheric Imagers (HI), part of the SECCHI instrument suite aboard the two NASA STEREO spacecraft, appears to have revealed pre-eruption signatures which may provide valuable evidence for identifying the CME onset mechanism. Specifically, an examination of the HI images has revealed narrow rays, composed of a series of outward-propagating plasma blobs, apparently forming near the edge of the streamer belt prior to many CME eruptions. In this pilot study, we inspect a limited dataset to explore the significance of this phenomenon, which we have termed a pre-CME ‘fuse’. Although the enhanced expulsion of blobs may be consistent with an increase in the release of outward-propagating blobs from the streamers themselves, it could also be interpreted as evidence for interchange reconnection in the period leading up to a CME onset. Indeed, it is argued that the latter could even have implications for the end-of-life of CMEs. Thus, the presence of these pre-CME fuses provides evidence that the CME onset mechanism is either related to streamer reconnection processes or to reconnection between closed field lines in the streamer belt and adjacent open field lines. We investigate the nature of these fuses, including their timing and location with respect to CME launch sites, as well as their speed and topology.
Abstract:
In this paper we consider the case of a Bose gas in low dimension in order to illustrate the applicability of a method that allows us to construct analytical relations, valid for a broad range of coupling parameters, for a function whose asymptotic expansions are known. The method is well suited to investigating the stability of a collection of Bose particles trapped in a one-dimensional configuration in the case where the scattering length is negative. The eigenvalues of this interacting one-dimensional quantum many-particle system become negative when the interactions overcome the trapping energy and, in this case, the system becomes unstable. Here we calculate the critical coupling parameter and apply the result to the case of lithium atoms, obtaining the critical number of particles at the limit of stability.
Abstract:
This study aimed to explore perceptions and experiences concerning sexuality, contraceptives, unwanted pregnancy and unsafe abortion among young people in Kisumu, Kenya. The design of the study was inductive, with a qualitative approach using personal in-depth interviews. Eight participants (four female and four male) were asked to describe their perceptions and experiences concerning sexuality, contraceptives, unwanted pregnancies and unsafe abortion. The results showed that culture and norms, misconceptions and gender-based power in sexuality are factors that impact sexual and reproductive health among young people in Kisumu today. Unwanted pregnancy was described as a shame, a burden and a destroyed life, which leads to many unsafely induced abortions. The findings indicate that youth interventions are important, such as engaging young men in the prevention of unwanted pregnancy, and thus unsafe abortion, and empowering young women.