987 results for Random real
Abstract:
Hajnal and Juhasz proved that under CH there is a hereditarily separable, hereditarily normal topological group without non-trivial convergent sequences that is countably compact and not Lindelof. The example constructed is a topological subgroup $H \subseteq 2^{\omega_1}$ that is an HFD with the following property (P): the projection of $H$ onto every partial product $2^{I}$ for $I \in [\omega_1]^{\omega}$ is onto. Any such group has the necessary properties. We prove that if $\kappa$ is a cardinal of uncountable cofinality, then in the model obtained by forcing over a model of CH with the measure algebra on $2^{\kappa}$, there is an HFD topological group in $2^{\omega_1}$ which has property (P).
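Stated in symbols, with $\pi_I$ denoting the coordinate projection onto the coordinates in $I$ (a restatement of the definition above; notation ours):

\[ H \text{ has property (P)} \iff \pi_I[H] = 2^{I} \ \text{ for every } I \in [\omega_1]^{\omega}. \]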
Abstract:
Confidence in decision making is an important dimension of managerial behavior. However, what is the relation between confidence, on the one hand, and the fact of receiving or expecting to receive feedback on decisions taken, on the other hand? To explore this and related issues in the context of everyday decision making, use was made of the ESM (Experience Sampling Method) to sample decisions taken by undergraduates and business executives. For several days, participants received 4 or 5 SMS messages daily (on their mobile telephones) at random moments, at which point they completed brief questionnaires about their current decision-making activities. Issues considered here include differences between the types of decisions faced by the two groups, their structure, feedback (received and expected), and confidence in decisions taken as well as in the validity of feedback. No relation was found between confidence in decisions and whether participants received or expected to receive feedback on those decisions. In addition, although participants are clearly aware that feedback can provide both confirming and disconfirming evidence, their ability to specify appropriate feedback is imperfect. Finally, difficulties experienced in using the ESM are discussed, as are possibilities for further research using this methodology.
Abstract:
The present research problem is to study existing encryption methods and to develop a new technique that outperforms existing techniques and, at the same time, can be readily incorporated into the communication channels of fault-tolerant hard real-time systems alongside existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
Abstract:
"July 1976."
Abstract:
Using the network random generation models from Gustedt (2009) [23], we simulate and analyze several characteristics (such as the number of components, the degree distribution, and the clustering coefficient) of the generated networks. This is done for a variety of distributions (fixed value, Bernoulli, Poisson, binomial) that are used to control the parameters of the generation process. These parameters are, in particular, the size of newly appearing sets of objects, the number of contexts in which new elements appear initially, the number of objects that are shared with 'parent' contexts, and the time period inside which a context may serve as a parent context (aging). The results show that these models make it possible to fine-tune the generation process so that the graphs adopt properties found in real-world graphs.
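As a rough illustration of how these parameters interact, here is a minimal, hypothetical sketch of such a generation process in Python; the function and parameter names (new_objects, shared_objects, aging) and the default distributions are ours, not those of [23]:

import random

def generate_contexts(steps=200,
                      new_objects=lambda: random.randint(1, 5),
                      shared_objects=lambda: random.randint(0, 3),
                      aging=50):
    # Each context is a set of object ids; at every step a new context
    # appears, inheriting some objects from a not-yet-aged parent context
    # and introducing some newly appearing objects.
    contexts = []
    next_object = 0
    for _ in range(steps):
        ctx = set()
        recent = contexts[-aging:]  # only recent contexts may act as parents
        if recent:
            parent = random.choice(recent)
            k = min(shared_objects(), len(parent))
            ctx.update(random.sample(sorted(parent), k))
        for _ in range(new_objects()):  # newly appearing objects
            ctx.add(next_object)
            next_object += 1
        contexts.append(ctx)
    return contexts

A network can then be derived from the output, e.g. by linking objects that appear together in at least one context, and the callables plugged into new_objects and shared_objects (fixed value, Bernoulli, Poisson, binomial) play the role of the control distributions discussed above.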
Abstract:
In distributed soft real-time systems, maximizing the aggregate quality of service (QoS) is a typical system-wide goal, and addressing the problem through distributed optimization is challenging. Subtasks are subject to unpredictable failures in many practical environments, which makes the problem much harder. In this paper, we present a robust optimization framework for maximizing the aggregate QoS in the presence of random failures. We introduce the notion of K-failure to bound the effect of random failures on schedulability. Using this notion we define the concept of K-robustness, which quantifies the degree of robustness of the QoS guarantee in a probabilistic sense. The parameter K helps to trade off achievable QoS against robustness. The proposed robust framework produces optimal solutions through distributed computations on the basis of Lagrangian duality, and we present some implementation techniques. Our simulation results show that the proposed framework can probabilistically guarantee sub-optimal QoS that remains feasible even in the presence of random failures.
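The abstract does not spell the framework out; as a generic illustration of the Lagrangian-duality pattern it relies on, here is a toy dual decomposition in Python, where the utility model, names, and constants are our assumptions rather than the paper's formulation:

def dual_decomposition(weights, capacity, iters=200, step=0.05):
    # Maximize sum_i w_i * log(1 + q_i) subject to sum_i q_i <= capacity.
    # Each subtask solves its local problem given the price lam; a master
    # updates lam by projected subgradient ascent on the dual.
    lam = 1.0
    for _ in range(iters):
        # local optimum of w*log(1+q) - lam*q  =>  q = w/lam - 1 (clipped at 0)
        q = [max(0.0, w / lam - 1.0) for w in weights]
        lam = max(1e-6, lam + step * (sum(q) - capacity))
    return q, lam

The same price-update skeleton is what allows the computation to be distributed: only the scalar lam travels between the master and the subtasks.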
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test hardware/products under development. Although this is an attractive solution (low cost, easy, and a fast way to carry out some coursework), it has major disadvantages. As everything is currently done with/in a computer, students are losing the "feel" for the real values of physical magnitudes. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithmics, mathematics, and physics. All of these areas can use numerical analysis software, simulation software, or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which computers support the teaching/learning process, giving students a more realistic feel by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light, and wind speed, which are connected to a central server that students access over an Ethernet protocol, or are connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet, or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools: sensors that are not available at a given school can be used by fetching values from other places that share them. Another remark is that students in more advanced years, with (theoretically) more know-how, can use courses that have some affinity with electronic development to build new sensor modules and expand the framework further. The final solution is very interesting: low cost, simple to develop, and flexible, reusing the same materials across several courses and bringing real-world data into students' computer work.
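A minimal sketch of the client side of such a framework, assuming a hypothetical HTTP endpoint and JSON layout (the server address, path, and record format below are illustrative, not the paper's actual interface):

import json
import statistics
import urllib.request

SERVER = "http://sensors.example.edu"  # hypothetical central sensor server

def read_sensor(name, n=24):
    # Fetch the last n readings of a named sensor as a list of floats;
    # the records are assumed to look like {"value": 21.5, ...}.
    with urllib.request.urlopen(f"{SERVER}/{name}?last={n}") as resp:
        return [float(r["value"]) for r in json.load(resp)]

temps = read_sensor("room-temperature")
print("mean:", statistics.mean(temps), "max:", max(temps))

The returned list can feed a spreadsheet, a numerical analysis exercise, or any programming assignment that needs a real dataset.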
Abstract:
First published online: December 16, 2014.
Abstract:
The paper investigates the role of real exchange rate misalignment in long-run growth for a set of ninety countries using time series data from 1980 to 2004. We first estimate a panel data model (using fixed and random effects) for the real exchange rate, with different model specifications, in order to produce estimates of the equilibrium real exchange rate, and these are then used to construct measures of real exchange rate misalignment. We also provide an alternative set of estimates of real exchange rate misalignment using panel cointegration methods. The variables used in our real exchange rate models are: real per capita GDP; net foreign assets; terms of trade; and government consumption. The results for the two-step System GMM panel growth models indicate that the coefficients for real exchange rate misalignment are positive for different model specifications and samples, which means that a more depreciated (appreciated) real exchange rate helps (harms) long-run growth. The estimated coefficients are higher for developing and emerging countries.
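Although the abstract does not write out the equations, the construction it describes takes the standard two-step form below (notation ours): fit an equilibrium relation for the log real exchange rate from the listed variables, then define misalignment as the deviation from the fitted value.

\[ \ln RER_{it} = \beta_1 \ln GDPpc_{it} + \beta_2\, NFA_{it} + \beta_3 \ln TOT_{it} + \beta_4\, GOV_{it} + u_i + \varepsilon_{it}, \qquad MIS_{it} = \ln RER_{it} - \ln \widehat{RER}_{it}. \]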
Abstract:
In this paper I analyze the effects of insider trading on real investment and the insurance role of financial markets. There is a single entrepreneur who, at a first stage, chooses the level of investment in a risky business. At the second stage, an asset with random payoff is issued, and the entrepreneur then receives some privileged information on the likely realization of the production return. At the third stage, trading occurs on the asset market, where the entrepreneur faces the aggregate demand coming from a continuum of rational uninformed traders and some noise traders. I compare the equilibrium with insider trading (when the entrepreneur trades on her inside information in the asset market) with the equilibrium in the same market without insider trading. I find that permitting insider trading tends to decrease the level of real investment. Moreover, the asset market is thinner, and the entrepreneur's net supply of the asset and the hedge ratio are lower, although the asset price is more informative and volatile.
Abstract:
In this study, we present a method designed to generate dynamic holograms in holographic optical tweezers. The approach combines our random mask encoding method with iterative high-efficiency algorithms. This hybrid method can be used to dynamically modify precalculated holograms, giving them new functionalities, temporarily or permanently, with a low computational cost. This allows the easy addition or removal of a single trap, or the independent control of groups of traps, for manipulating a variety of rigid structures in real time.
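A minimal numpy sketch of the random-mask-encoding step described above (the function name, parameters, and the choice of a simple prism phase for the new trap are ours):

import numpy as np

def add_trap(phase, kx, ky, fraction=0.1, seed=0):
    # Overwrite a random subset of hologram pixels with the linear
    # (prism) phase that steers the new trap, leaving the rest of the
    # precalculated hologram untouched.
    rng = np.random.default_rng(seed)
    ny, nx = phase.shape
    y, x = np.mgrid[0:ny, 0:nx]
    new_phase = (kx * x + ky * y) % (2 * np.pi)
    mask = rng.random(phase.shape) < fraction  # pixels given to the new trap
    out = phase.copy()
    out[mask] = new_phase[mask]
    return out

Because only a small random fraction of pixels changes, the update is cheap, which is what makes real-time modification of precalculated holograms feasible.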
Abstract:
We present a generator of random networks in which both the degree-dependent clustering coefficient and the degree distribution are tunable. Following the same philosophy as the configuration model, the degree distribution and the clustering coefficient for each class of nodes of degree k are fixed ad hoc and a priori. The algorithm generates the corresponding topologies by applying first a closure of triangles and second the classical closure of the remaining free stubs. The procedure unveils a universal relation between clustering and degree-degree correlations for all networks, in which the level of assortativity establishes an upper limit to the level of clustering. Maximum assortativity ensures no restriction on the decay of the clustering coefficient, whereas disassortativity sets a stronger constraint on its behavior. Correlation measures in real networks are seen to obey this structural bound.
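A toy Python version of the two phases, heavily simplified (a single global triangle count replaces the paper's per-degree-class clustering targets, and occasional repeated pairings are silently dropped):

import random

def generate(degrees, triangles=100, seed=0):
    rng = random.Random(seed)
    free = {v: d for v, d in enumerate(degrees)}  # stubs left per node
    edges = set()
    # phase 1: closure of triangles (two stubs consumed per member node)
    for _ in range(triangles):
        candidates = [v for v, d in free.items() if d >= 2]
        if len(candidates) < 3:
            break
        a, b, c = rng.sample(candidates, 3)
        for u, w in ((a, b), (b, c), (a, c)):
            edges.add((min(u, w), max(u, w)))
        for v in (a, b, c):
            free[v] -= 2
    # phase 2: classical closure of the remaining free stubs
    stubs = [v for v, d in free.items() for _ in range(d)]
    rng.shuffle(stubs)
    while len(stubs) >= 2:
        u, w = stubs.pop(), stubs.pop()
        if u != w:
            edges.add((min(u, w), max(u, w)))
    return edges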
Abstract:
By generalizing effective-medium theory to the case of orientationally ordered but positionally disordered two-component mixtures, it is shown that the anisotropic dielectric tensor of oxide superconductors can be extracted from microwave measurements on oriented crystallites of YBa2Cu3O7−x embedded in epoxy. Surprisingly, this technique appears to be the only one which can access the resistivity perpendicular to the copper-oxide planes in crystallites that are too small for depositing electrodes. This possibility arises in part because the real part of the dielectric constant of oxide superconductors has a large magnitude. The validity of the effective-medium approach for orientationally ordered mixtures is corroborated by simulations on two-dimensional anisotropic random resistor networks. Analysis of the experimental data suggests that the zero-temperature limit of the finite-frequency resistivity does not vanish along the c axis, a result which would imply the existence of states at the Fermi surface, even in the superconducting state.
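For orientation, the classical isotropic Bruggeman effective-medium condition that the anisotropic treatment generalizes reads as follows, with $f_i$ the volume fraction and $\epsilon_i$ the dielectric constant of component $i$ (the generalization replaces the scalars by tensor components along the ordered axes):

\[ \sum_i f_i\, \frac{\epsilon_i - \epsilon_{\mathrm{eff}}}{\epsilon_i + 2\,\epsilon_{\mathrm{eff}}} = 0. \]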
Abstract:
The scope of this work is the systematic study of the silicidation process affecting tungsten filaments at high temperature (1900 ºC) used for silane decomposition in the hot-wire chemical vapour deposition (HWCVD) technique. The correlation between the electrical resistance evolution of the filaments, Rfil(t), and the different stages of their silicidation process is presented. Said stages correspond to: the rapid formation of two WSi2 fronts at the cold ends of the filaments and their further propagation towards the middle of the filaments; and, regarding the hot central portion of the filaments, an initial stage of silicon dissolution into the tungsten bulk, with a random duration for as-manufactured filaments, followed by the inhomogeneous nucleation of W5Si3 (which is later replaced by WSi2) and its further growth towards the filaments' core. An electrical model is used to obtain real-time information about the current status of the filaments' silicidation process by simply monitoring their Rfil(t) evolution during the HWCVD process. It is shown that applying an annealing pre-treatment to the filaments leads to a clearly reproducible trend in the monitored Rfil(t) signatures. The influence of hydrogen dilution of silane on the filaments' silicidation process is also discussed.
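The abstract does not reproduce the electrical model; a minimal series approximation consistent with the description (two fully silicided segments of length $x(t)$ growing inwards from the cold ends of a filament of length $L$ and cross-section $A$; the form and notation are our assumption, not the paper's model) would be:

\[ R_{\mathrm{fil}}(t) \approx \frac{\rho_{\mathrm{W}}\,\bigl[L - 2x(t)\bigr] + 2\,\rho_{\mathrm{WSi_2}}\, x(t)}{A}, \]

so that the measured Rfil(t) evolution tracks the front position $x(t)$ in real time.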
Abstract:
We uncover the global organization of clustering in real complex networks. To this end, we ask whether triangles in real networks organize as in maximally random graphs with given degree and clustering distributions, or as in maximally ordered graph models where triangles are forced into modules. The answer comes by way of exploring m-core landscapes, where the m-core is defined, akin to the k-core, as the maximal subgraph whose edges all participate in at least m triangles. This property defines a set of nested subgraphs that, contrary to k-cores, is able to distinguish between hierarchical and modular architectures. We find that the clustering organization in real networks is neither completely random nor ordered, although, surprisingly, it is more random than modular. This supports the idea that the structure of real networks may in fact be the outcome of self-organized processes based on local optimization rules, in contrast to global optimization principles.
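Since the m-core is defined explicitly in the abstract, its computation reduces to an edge-pruning loop; a plain-Python sketch (the function name and edge representation are ours):

def m_core(edges, m):
    # Build an adjacency map from an iterable of (u, v) edges.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Repeatedly delete edges lying in fewer than m triangles; edge (u, v)
    # lies in as many triangles as u and v have common neighbours.
    changed = True
    while changed:
        changed = False
        for u in list(adj):
            for v in list(adj[u]):
                if u < v and len(adj[u] & adj[v]) < m:
                    adj[u].discard(v)
                    adj[v].discard(u)
                    changed = True
    return {(u, v) for u in adj for v in adj[u] if u < v}

Running this for increasing m yields the nested subgraphs of the m-core landscape described above.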