886 results for Gaussian complexities
Abstract:
Background: The cooperative interaction between transcription factors has a decisive role in the control of the fate of the eukaryotic cell. Computational approaches for characterizing cooperative transcription factors in yeast, however, are based on different rationales and provide a low overlap between their results. Because the wealth of information contained in protein interaction networks and regulatory networks has proven highly effective in elucidating functional relationships between proteins, we compared different sets of cooperative transcription factor pairs (predicted by four different computational methods) within the frame of those networks. Results: Our results show that the overlap between the sets of cooperative transcription factors predicted by the different methods is low yet significant. Cooperative transcription factors predicted by all methods are closer and more clustered in the protein interaction network than expected by chance. On the other hand, members of a cooperative transcription factor pair neither seemed to regulate each other nor shared similar regulatory inputs, although they do regulate similar groups of target genes. Conclusion: Despite the different definitions of transcriptional cooperativity and the different computational approaches used to characterize cooperativity between transcription factors, the analysis of their roles in the framework of the protein interaction network and the regulatory network indicates a common denominator for the predictions under study. The knowledge of the shared topological properties of cooperative transcription factor pairs in both networks can be useful not only for designing better prediction methods but also for better understanding the complexities of transcriptional control in eukaryotes.
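The distance comparison described above can be sketched as a permutation test: compute the mean shortest-path distance between predicted cooperative transcription factor pairs in the protein interaction network and compare it against randomly drawn node pairs. A minimal pure-Python sketch, assuming an adjacency-list network; the BFS helper and z-score summary are illustrative choices, not the paper's actual pipeline:

```python
import random
from collections import deque

def bfs_dist(adj, src, dst):
    """Shortest-path length between two nodes via breadth-first search."""
    seen, frontier = {src: 0}, deque([src])
    while frontier:
        node = frontier.popleft()
        if node == dst:
            return seen[node]
        for nb in adj.get(node, ()):
            if nb not in seen:
                seen[nb] = seen[node] + 1
                frontier.append(nb)
    return None  # no path between src and dst

def proximity_zscore(adj, pairs, n_random=500, seed=0):
    """Mean network distance of predicted cooperative pairs vs. random pairs."""
    rng = random.Random(seed)
    nodes = list(adj)

    def mean_dist(pair_list):
        d = [bfs_dist(adj, a, b) for a, b in pair_list]
        d = [x for x in d if x is not None]
        return sum(d) / len(d)

    obs = mean_dist(pairs)
    rand = [mean_dist([tuple(rng.sample(nodes, 2)) for _ in pairs])
            for _ in range(n_random)]
    mu = sum(rand) / n_random
    sd = (sum((m - mu) ** 2 for m in rand) / n_random) ** 0.5 or 1.0
    return obs, (obs - mu) / sd
```

A strongly negative z-score would indicate that cooperative pairs sit closer in the network than expected by chance, matching the abstract's finding.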
Abstract:
Expressions relating spectral efficiency, power, and Doppler spectrum are derived for Rayleigh-faded wireless channels with Gaussian signal transmission. No side information on the state of the channel is assumed at the receiver. Rather, periodic reference signals are postulated, in accordance with the functioning of most wireless systems. The analysis relies on a well-established lower bound, generally tight and asymptotically exact at low SNR. In contrast with most previous studies, which relied on block-fading channel models, a continuous-fading model is adopted. This embeds the Doppler spectrum directly in the derived expressions, imbuing them with practical significance. Closed-form relationships are obtained for the popular Clarke-Jakes spectrum, and informative expansions, valid for arbitrary spectra, are found for the low- and high-power regimes. While the paper focuses on scalar channels, the extension to multiantenna settings is also discussed.
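For concreteness, the Clarke-Jakes Doppler spectrum mentioned above is S(f) = 1/(π f_m √(1 − (f/f_m)²)) for |f| < f_m, where f_m is the maximum Doppler shift. A small sketch evaluating it and numerically checking that it integrates to one; the 100 Hz value of f_m is purely illustrative:

```python
import math

def clarke_jakes(f, f_m):
    """Clarke-Jakes Doppler spectrum, unit area over [-f_m, f_m]."""
    if abs(f) >= f_m:
        return 0.0
    return 1.0 / (math.pi * f_m * math.sqrt(1.0 - (f / f_m) ** 2))

# Midpoint-rule check that the spectrum integrates to (roughly) one.
f_m = 100.0  # maximum Doppler shift in Hz (illustrative)
n = 200_000
h = 2 * f_m / n
area = sum(clarke_jakes(-f_m + (k + 0.5) * h, f_m) * h for k in range(n))
```

The characteristic "bathtub" shape, with integrable singularities at ±f_m, is what the closed-form relationships in the paper specialize to.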
Abstract:
This paper formulates power allocation policies that maximize the region of mutual informations achievable in multiuser downlink OFDM channels. Arbitrary partitioning of the available tones among users and arbitrary modulation formats, possibly different for every user, are considered. Two distinct policies are derived, respectively for slow fading channels tracked instantaneously by the transmitter and for fast fading channels known only statistically thereby. With instantaneous channel tracking, the solution adopts the form of a multiuser mercury/waterfilling procedure that generalizes the single-user mercury/waterfilling introduced in [1, 2]. With only statistical channel information, in contrast, the mercury/waterfilling interpretation is lost. For both policies, a number of limiting regimes are explored and illustrative examples are provided.
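Mercury/waterfilling weighs each tone by the MMSE of its actual modulation; for Gaussian inputs it collapses to classical waterfilling. A minimal sketch of that Gaussian-input special case, solved by bisection on the water level (the mercury correction for discrete constellations is not implemented here):

```python
def waterfilling(gains, total_power, tol=1e-9):
    """Gaussian-input waterfilling: allocate p_k = max(0, mu - 1/g_k)
    so that sum(p_k) == total_power, via bisection on water level mu."""
    lo, hi = 0.0, total_power + max(1.0 / g for g in gains)
    while hi - lo > tol:
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - 1.0 / g) for g in gains)
        if used > total_power:
            hi = mu
        else:
            lo = mu
    mu = (lo + hi) / 2
    return [max(0.0, mu - 1.0 / g) for g in gains]
```

With gains [2.0, 0.5] and unit total power, all power goes to the stronger tone; as the power budget grows, weaker tones are progressively filled.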
Abstract:
PURPOSE: In the radiopharmaceutical therapy approach to the fight against cancer, in particular when it comes to translating laboratory results to the clinical setting, modeling has served as an invaluable tool for guidance and for understanding the processes operating at the cellular level and how these relate to macroscopic observables. Tumor control probability (TCP) is the dosimetric end point quantity of choice that relates to experimental and clinical data: it requires knowledge of individual cellular absorbed doses, since it depends on the assessment of the treatment's ability to kill each and every cell. Macroscopic tumors, seen in both clinical and experimental studies, contain too many cells to be modeled individually in Monte Carlo simulation; yet, in particular for low ratios of decays to cells, a cell-based model that does not smooth away the statistical considerations associated with low activity is a necessity. The authors present here an adaptation of the simple sphere-based model from which cellular-level dosimetry for macroscopic tumors and their end point quantities, such as TCP, may be extrapolated more reliably. METHODS: Ten homogeneous spheres representing tumors of different sizes were constructed in GEANT4. The radionuclide 131I was randomly allowed to decay for each model size and for seven different ratios of number of decays to number of cells, N_r: 1000, 500, 200, 100, 50, 20, and 10 decays per cell. The deposited energy was collected in radial bins and divided by the bin mass to obtain the average bin absorbed dose. To simulate a cellular model, the number of cells present in each bin was calculated and each cell was attributed an absorbed dose equal to the bin average absorbed dose with a randomly determined adjustment based on a Gaussian probability distribution whose width equals the statistical uncertainty consistent with the ratio of decays to cells, i.e., N_r^(-1/2).
From dose volume histograms, the surviving fraction of cells, the equivalent uniform dose (EUD), and the TCP were calculated for the different scenarios. Comparably sized spherical models containing individual spherical cells (15 μm diameter) in hexagonal lattices were constructed, and Monte Carlo simulations were executed for all the same scenarios. The dosimetric quantities were calculated and compared to the adjusted simple sphere model results. The model was then applied to the Bortezomib-induced enzyme-targeted radiotherapy (BETR) strategy of targeting Epstein-Barr virus (EBV)-expressing cancers. RESULTS: The TCP values agreed to within 2% between the adjusted simple sphere and full cellular models. Models were also generated for a nonuniform distribution of activity, and the adjusted spherical and cellular models showed similar agreement. The TCP values predicted for macroscopic tumors were consistent with the experimental observations for BETR-treated 1 g EBV-expressing lymphoma tumors in mice. CONCLUSIONS: The adjusted spherical model presented here provides more accurate TCP values than simple spheres, on par with full cellular Monte Carlo simulations, while maintaining the simplicity of the simple sphere model. This model provides a basis for complementing and understanding laboratory and clinical results pertaining to radiopharmaceutical therapy.
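The per-cell dose adjustment and the resulting TCP can be sketched as follows. The Gaussian width N_r^(-1/2), relative to the bin dose, follows the abstract; the single-parameter exponential survival model and the Poisson form of TCP are standard textbook assumptions supplied here for illustration, not values from the paper:

```python
import math
import random

def cell_doses(bin_dose, n_cells, decays_per_cell, rng):
    """Per-cell doses: the bin-average dose plus a Gaussian adjustment
    whose relative width is N_r**-0.5, as in the adjusted sphere model."""
    sigma = bin_dose / math.sqrt(decays_per_cell)
    return [max(0.0, rng.gauss(bin_dose, sigma)) for _ in range(n_cells)]

def tcp(doses, alpha=0.3):
    """Poisson TCP with an exponential survival model:
    TCP = exp(-sum_i exp(-alpha * D_i)); alpha (Gy^-1) is illustrative."""
    expected_survivors = sum(math.exp(-alpha * d) for d in doses)
    return math.exp(-expected_survivors)
```

The point of the adjustment is visible here: at low decays-per-cell ratios the spread in per-cell dose widens, leaving more cells underdosed and pulling the TCP below what a uniform bin dose would predict.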
Abstract:
Exact closed-form expressions are obtained for the outage probability of maximal ratio combining in η-μ fading channels with antenna correlation and co-channel interference. The scenario considered in this work assumes the joint presence of background white Gaussian noise and independent Rayleigh-faded interferers with arbitrary powers. Outage probability results are obtained through an appropriate generalization of the moment-generating function of the η-μ fading distribution, for which new closed-form expressions are provided.
Abstract:
Multiple-input multiple-output (MIMO) techniques have become an essential part of broadband wireless communications systems. For example, the recently developed IEEE 802.16e specifications for broadband wireless access include three MIMO profiles employing 2×2 space-time codes (STCs), and two of these MIMO schemes are mandatory on the downlink of Mobile WiMAX systems. One of these has full rate and the other has full diversity, but neither has both of the desired features. The third profile, namely Matrix C, which is not mandatory, is both a full-rate and a full-diversity code, but it has high decoder complexity. Recently, attention has turned to the decoder complexity issue and, with this included in the design criteria, several full-rate STCs were proposed as alternatives to Matrix C. In this paper, we review these different alternatives and compare them to Matrix C in terms of performance and the corresponding receiver complexities.
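Of the two mandatory profiles, the full-diversity one is the Alamouti scheme (Matrix A in 802.16e). A minimal sketch of its encoder and the linear combiner at a single receive antenna, assuming perfect channel knowledge and omitting noise for clarity:

```python
def alamouti_encode(s1, s2):
    """Alamouti space-time block code: two symbols over two antennas
    and two time slots (rows = time slots, columns = antennas)."""
    return [[s1, s2],
            [-s2.conjugate(), s1.conjugate()]]

def alamouti_decode(r1, r2, h1, h2):
    """Linear combining of the two received samples with perfect CSI."""
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
    s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
    return s1_hat, s2_hat
```

Because the combiner output scales each symbol by |h1|² + |h2|², every symbol sees both channel paths: this is the "full diversity" property, obtained at the cost of sending only two symbols per two slots (half the rate of spatial multiplexing).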
Abstract:
Business organisations are excellent representations of what in physics and mathematics are designated "chaotic" systems. Because a culture of innovation will be vital for organisational survival in the 21st century, the present paper proposes that viewing organisations in terms of "complexity theory" may assist leaders in fine-tuning managerial philosophies that provide orderly management emphasizing stability within a culture of organised chaos, for it is on the "boundary of chaos" that the greatest creativity occurs. It is argued that 21st century companies, as chaotic social systems, will no longer be effectively managed by rigid objectives (MBO) nor by instructions (MBI). Their capacity for self-organisation will be derived essentially from how their members accept a shared set of values or principles for action (MBV). Complexity theory deals with systems that show complex structures in time or space, often hiding simple deterministic rules. This theory holds that once these rules are found, it is possible to make effective predictions and even to control the apparent complexity. The state of chaos that self-organises, thanks to the appearance of the "strange attractor", is the ideal basis for creativity and innovation in the company. In this self-organised state of chaos, members are not confined to narrow roles, and gradually develop their capacity for differentiation and relationships, growing continuously toward their maximum potential contribution to the efficiency of the organisation. In this way, values act as organisers or "attractors" of disorder, which in the theory of chaos are equations represented by unusually regular geometric configurations that predict the long-term behaviour of complex systems. In business organisations (as in all kinds of social systems) the starting principles end up as the final principles in the long term. An attractor is a model representation of the behavioral results of a system. 
The attractor is not a force of attraction or a goal-oriented presence in the system; it simply depicts where the system is headed based on its rules of motion. Thus, in a culture that cultivates or shares values of autonomy, responsibility, independence, innovation, creativity, and proaction, the risk of short-term chaos is mitigated by an overall long-term sense of direction. A more suitable approach to manage the internal and external complexities that organisations are currently confronting is to alter their dominant culture under the principles of MBV.
Abstract:
Enterprise-wide architecture has become a necessity for organizations to (re)align information technology (IT) with changing business requirements. Since a city planning metaphor inspired enterprise-wide architecture, this dissertation's research axes can be outlined by similarities between cities and enterprises. Both are characterized as dynamic super-systems that need to address the evolving interests of various architecture stakeholders. Further, both should simultaneously adhere to a set of principles to guide the evolution of architecture towards the expected benefits. The extant literature on enterprise-wide architecture not only disregards the complexities of architecture adoption but also remains vague about how principles guide architecture evolution. To bridge this gap, this dissertation contains three interrelated research streams examining the principles and adoption of enterprise-wide architecture. The first research stream investigates the organizational intricacies inherent in architecture adoption. It characterizes architecture adoption as an ongoing organizational adaptation process. By analyzing organizational response behaviors in this adaptation process, it also identifies four archetypes that represent very diverse architecture approaches. The second research stream ontologically clarifies the nature of architecture principles and outlines new avenues for theoretical contributions. This research stream also provides an empirically validated set of principles and proposes a research model illustrating how principles can be applied to generate the expected architecture benefits. The third research stream examines architecture adoption in multinational corporations (MNCs). MNCs are distinguished by unique organizational characteristics and constantly strive to balance global integration and local responsiveness. This research stream characterizes MNCs' architecture adoption as a continuous endeavor.
This endeavor tries to constantly synchronize architecture with stakeholders' beliefs about how to balance global integration and local responsiveness. To conclude, this dissertation provides a thorough explanation of a long-term journey in which organizations learn over time to adopt an effective architecture approach. It also clarifies the role of principles in purposefully guiding the aforementioned learning process.
Abstract:
Leadership has been a core subject in the organizational literature, defined and operationalized in many different ways, based on functions performed and/or behaviours presented, grouped into numerous concepts, and often emerging as bipolar categories presented as contradictory. Owing to environmental complexity, the demand for innovative competencies, the development of societies, and the evolution of new technologies, in the context of the globalization sweeping today's world, new competencies are required for organizations to overcome the present crisis and consequently achieve success. There is a need to conceptualize leadership on the basis of this new paradigm. It becomes necessary to use reliable tools adapted to each country's cultural and organizational context. This paper's goal is to identify the leadership profiles existing in Cape Verde, also identifying the cultural attributes/values that influence organizational practices and the types of behaviour found among Cape Verdean leaders.
Abstract:
The vast territories that have been radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted that evaluated the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.
Abstract:
OBJECTIVES: There are some common occupational agents and exposure circumstances where evidence of carcinogenicity is substantial but not yet conclusive for humans. The objectives are to identify research gaps and needs for twenty agents prioritized for review based on evidence of widespread human exposures and potential carcinogenicity in animals or humans. DATA SOURCES: A systematic review was conducted of new data published since the most recent pertinent IARC monograph meeting. DATA EXTRACTION: Reviewers were charged with identifying data gaps and general and specific approaches to address them, focusing on research that would be important in resolving classification uncertainties. An expert meeting brought reviewers together to discuss each agent and the identified data gaps and approaches. DATA SYNTHESIS: Several overarching issues were identified that pertained to multiple agents; these included the importance of recognizing that carcinogenic agents can act through multiple toxicity pathways and mechanisms, including epigenetic mechanisms, oxidative stress and immuno- and hormonal modulation. CONCLUSIONS: Studies in occupational populations provide important opportunities to understand the mechanisms through which exogenous agents cause cancer and intervene to prevent human exposure and/or prevent or detect cancer among those already exposed. Scientific developments are likely to increase the challenges and complexities of carcinogen testing and evaluation in the future, and epidemiologic studies will be particularly critical to inform carcinogen classification and risk assessment processes.
Abstract:
We present simple procedures for the prediction of a real-valued sequence. The algorithms are based on a combination of several simple predictors. We show that if the sequence is a realization of a bounded stationary and ergodic random process, then the average of squared errors converges, almost surely, to that of the optimum, given by the Bayes predictor. We offer an analogous result for the prediction of stationary Gaussian processes.
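Combination schemes of this type typically form an exponentially weighted average of the simple predictors, with each weight decaying in that predictor's cumulative squared error. A toy sketch in that spirit; the learning rate eta and the constant "experts" in the usage below are illustrative, not the paper's construction:

```python
import math

def aggregate_forecast(experts, sequence, eta=0.5):
    """Exponentially weighted combination of simple predictors under
    squared loss; each expert maps the observed history to a forecast.
    Returns the average squared error of the combined forecaster."""
    losses = [0.0] * len(experts)
    total_sq_error = 0.0
    for t, y in enumerate(sequence):
        history = sequence[:t]
        preds = [expert(history) for expert in experts]
        weights = [math.exp(-eta * loss) for loss in losses]
        z = sum(weights)
        y_hat = sum(w * p for w, p in zip(weights, preds)) / z
        total_sq_error += (y_hat - y) ** 2
        for i, p in enumerate(preds):
            losses[i] += (p - y) ** 2
    return total_sq_error / len(sequence)
```

On a long sequence the weight mass concentrates on whichever simple predictor tracks the process best, which is the mechanism behind convergence results of this kind.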
Abstract:
An affine asset pricing model in which traders have rational but heterogeneous expectations about future asset prices is developed. We use the framework to analyze the term structure of interest rates and to perform a novel three-way decomposition of bond yields into (i) average expectations about short rates, (ii) common risk premia, and (iii) a speculative component due to heterogeneous expectations about the resale value of a bond. The speculative term is orthogonal to public information in real time and therefore statistically distinct from common risk premia. Empirically we find that the speculative component is quantitatively important, accounting for up to a percentage point of yields, even in the low yield environment of the last decade. Furthermore, allowing for a speculative component in bond yields results in estimates of historical risk premia that are more volatile than suggested by standard affine Gaussian term structure models, which our framework nests.
Abstract:
This paper presents 3-D brain tissue classification schemes using three recent promising energy minimization methods for Markov random fields: graph cuts, loopy belief propagation, and tree-reweighted message passing. The classification is performed using the well-known finite Gaussian mixture Markov random field model. Results from the above methods are compared with the widely used iterated conditional modes algorithm. The evaluation is performed on a dataset containing simulated T1-weighted MR brain volumes with varying noise and intensity non-uniformities. The comparisons are performed in terms of energies as well as based on ground truth segmentations, using various quantitative metrics.
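The baseline in that comparison, iterated conditional modes (ICM), is easy to sketch: each site greedily picks the label minimizing a Gaussian data term plus a Potts smoothness penalty over its neighbours. A 1-D toy version, assuming known class means and illustrative sigma and beta values (graph cuts, belief propagation, and tree-reweighted message passing minimize the same energy globally or near-globally rather than site by site):

```python
def icm_segment(values, means, sigma=1.0, beta=1.0, n_iter=10):
    """Iterated conditional modes for a 1-D Gaussian-mixture MRF:
    each site picks the label minimizing data cost plus a Potts
    penalty for disagreeing with its current neighbours."""
    labels = [min(range(len(means)), key=lambda k: abs(v - means[k]))
              for v in values]
    for _ in range(n_iter):
        for i, v in enumerate(values):
            def energy(k):
                data = (v - means[k]) ** 2 / (2 * sigma ** 2)
                smooth = sum(beta for j in (i - 1, i + 1)
                             if 0 <= j < len(values) and labels[j] != k)
                return data + smooth
            labels[i] = min(range(len(means)), key=energy)
    return labels
```

The smoothness term is what cleans up isolated noisy voxels; ICM's weakness, motivating the three newer solvers, is that this greedy sweep can stall in poor local minima of the same energy.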