Abstract:
The continuous wavelet transform is obtained as a maximum entropy solution of the corresponding inverse problem. It is well known that although a signal can be reconstructed from its wavelet transform, the expansion is not unique due to the redundancy of continuous wavelets. Hence, the inverse problem has no unique solution. If we want to recognize one solution as "optimal", then an appropriate decision criterion has to be adopted. We show here that the continuous wavelet transform is an "optimal" solution in a maximum entropy sense.
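For context, the standard definition of the continuous wavelet transform and its inversion formula (textbook material, not taken from this paper) make the redundancy explicit: a signal of one variable is mapped to a transform of two variables.

```latex
% Continuous wavelet transform of a signal f with analyzing wavelet psi:
% a one-variable signal becomes a two-variable transform, which is the
% source of the redundancy the abstract refers to.
\[
  W_f(a,b) = \frac{1}{\sqrt{a}} \int_{-\infty}^{\infty}
             f(t)\,\psi^{*}\!\Bigl(\frac{t-b}{a}\Bigr)\,dt ,
  \qquad a>0,\ b\in\mathbb{R},
\]
% Reconstruction (for an admissible wavelet with constant C_psi):
\[
  f(t) = \frac{1}{C_{\psi}} \int_{0}^{\infty}\!\int_{-\infty}^{\infty}
         W_f(a,b)\,\frac{1}{\sqrt{a}}\,
         \psi\!\Bigl(\frac{t-b}{a}\Bigr)\,\frac{db\,da}{a^{2}} .
\]
```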
Abstract:
The General Assembly Line Balancing Problem with Setups (GALBPS) was recently defined in the literature. It adds sequence-dependent setup time considerations to the classical Simple Assembly Line Balancing Problem (SALBP) as follows: whenever a task is assigned next to another at the same workstation, a setup time must be added when computing the global workstation time, so the task sequence inside each workstation matters. This paper proposes over 50 priority-rule-based heuristic procedures to solve GALBPS, many of which improve upon the heuristic procedures published to date.
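The paper itself defines the 50+ procedures; purely as a sketch of the general scheme, the station-oriented greedy assignment below uses hypothetical task data, hypothetical setup times, and a single illustrative priority rule (longest processing time first), none of which come from the paper.

```python
# Minimal sketch of a station-oriented, priority-rule heuristic for GALBPS.
# Task data, setup times and the priority rule are illustrative assumptions.

CYCLE_TIME = 10
times = {"A": 4, "B": 5, "C": 3, "D": 4}                    # processing times
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}  # precedence graph
setup_time = {("A", "B"): 1, ("A", "C"): 1, ("B", "C"): 2,  # sequence-
              ("C", "B"): 1, ("B", "D"): 1, ("C", "D"): 2}  # dependent setups

def setup(prev, nxt):
    return setup_time.get((prev, nxt), 0) if prev else 0

def balance(tasks):
    stations, done = [], set()
    while len(done) < len(tasks):
        load, seq, prev = 0, [], None
        while True:
            # candidate tasks: all predecessors assigned, and the task plus
            # its setup still fits within this station's cycle time
            ready = [t for t in tasks if t not in done
                     and all(p in done for p in preds[t])
                     and load + setup(prev, t) + times[t] <= CYCLE_TIME]
            if not ready:
                break
            t = max(ready, key=times.get)        # priority rule: LPT
            load += setup(prev, t) + times[t]
            seq.append(t); done.add(t); prev = t
        if not seq:
            raise ValueError("a task does not fit in an empty station")
        stations.append(seq)
    return stations

print(balance(list(times)))   # -> [['A', 'B'], ['C', 'D']]
```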
Abstract:
A maximum entropy statistical treatment of an inverse problem concerning frame theory is presented. The problem arises from the fact that a frame is an overcomplete set of vectors that defines a mapping with no unique inverse. Although any vector in the concomitant space can be expressed as a linear combination of frame elements, the coefficients of the expansion are not unique. Frame theory guarantees the existence of a set of coefficients which is “optimal” in a minimum norm sense. We show here that these coefficients are also “optimal” from a maximum entropy viewpoint.
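A minimal numerical sketch of the minimum-norm property discussed above, using a small overcomplete frame in R^2 (the frame choice, test vector, and null-space perturbation are ours, for illustration only):

```python
# Sketch: an overcomplete frame in R^2 (three unit vectors at 120 degrees)
# and the minimum-norm expansion coefficients given by the Moore-Penrose
# pseudoinverse, i.e. by the canonical dual frame.
import numpy as np

# Frame matrix: columns are the three frame vectors.
F = np.array([[0.0, -np.sqrt(3) / 2,  np.sqrt(3) / 2],
              [1.0, -0.5,            -0.5]])

f = np.array([1.0, 2.0])          # vector to expand

c_min = np.linalg.pinv(F) @ f     # minimum-norm coefficients
assert np.allclose(F @ c_min, f)  # they reproduce f exactly

# Any null-space vector of F can be added without changing F @ c, which is
# exactly the non-uniqueness the abstract refers to.
null = np.ones(3) / np.sqrt(3)    # the columns of F sum to zero
c_alt = c_min + 0.7 * null
assert np.allclose(F @ c_alt, f)
print(np.linalg.norm(c_min) < np.linalg.norm(c_alt))   # True
```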
Abstract:
In this paper we consider a sequential allocation problem with n individuals. The first individual can consume any amount of some endowment, leaving the remainder for the second individual, and so on. Motivated by the limitations of the cooperative and non-cooperative solutions, we propose a new approach. We establish axioms that should be satisfied (representativeness, impartiality, etc.). The result is a unique asymptotic allocation rule. It is derived for n = 2, 3, 4, and a claim is made for general n. We show that it satisfies a set of desirable properties. Key words: Sequential allocation rule, River sharing problem, Cooperative and non-cooperative games, Dictator and ultimatum games. JEL classification: C79, D63, D74.
Abstract:
Although global environmental governance has traditionally couched global warming in terms of annual CO2 emissions (a flow), global mean temperature is actually determined by cumulative CO2 emissions in the atmosphere (a stock). Thanks to advances in the scientific community, it is now possible to quantify the "global carbon budget", that is, the amount of cumulative CO2 emissions still available before crossing the 2°C threshold (Meinshausen et al., 2009). The present approach analyzes the allocation of this global carbon budget among countries as a classical conflicting claims problem (O'Neill, 1982). Based on some appealing principles, an efficient and sustainable allocation of the available carbon budget from 2000 to 2050 is proposed, taking into account different environmental risk scenarios. Keywords: Carbon budget, Conflicting claims problem, Distribution, Climate change. JEL classification: C79, D71, D74, H41, H87, Q50, Q54, Q58.
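The paper builds its rule from its own principles; as a generic illustration of the conflicting claims setup it invokes, the textbook proportional rule applied to a made-up two-country budget looks like this (all names and figures are hypothetical):

```python
# Sketch of a conflicting claims problem (O'Neill, 1982) on a carbon
# budget: the endowment E is the remaining cumulative budget and each
# country's claim is its demanded share. The proportional rule below is
# the textbook baseline, not the specific rule proposed in the paper.

def proportional_rule(endowment, claims):
    """Allocate the endowment proportionally to claims (claims exceed it)."""
    total = sum(claims.values())
    return {i: endowment * c / total for i, c in claims.items()}

budget = 1000.0                               # Gt CO2, hypothetical
claims = {"North": 700.0, "South": 600.0}     # infeasible: 1300 > 1000
print(proportional_rule(budget, claims))
# -> {'North': 538.46..., 'South': 461.53...}
```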
Abstract:
We prove that there are one-parameter families of planar differential equations for which the center problem has a trivial solution while, at the same time, the cyclicity of the weak focus is arbitrarily high. We illustrate this phenomenon with several examples for which the cyclicity is computed.
Abstract:
This empirical study investigates the effects of long-term embedded, structured and supported instruction on the development of Information Problem Solving (IPS) skills in Secondary Education. Forty secondary students in 7th and 8th grades (13–15 years old) participated: twenty received the 2-year IPS instruction designed in this study, and the remaining twenty formed the control group. All students were pre- and post-tested in their regular classrooms, and their IPS process and performance were logged by means of screen capture software to ensure ecological validity. The IPS constituent skills, the web search sub-skills and the answers given by each participant were analyzed. The main findings suggest that experimental students showed a more expert pattern than control students in the constituent skill ‘defining the problem’ and in two web search sub-skills: ‘search terms’ typed into a search engine, and ‘selected results’ from a SERP. In addition, task performance scores were statistically better for experimental students than for control students. The paper contributes to the discussion of how well-designed and well-embedded scaffolds can be built into instructional programs to guarantee the development and efficiency of students’ IPS skills, helping them use online information better and participate fully in the global knowledge society.
Abstract:
Random problem distributions have played a key role in the study and design of algorithms for constraint satisfaction and Boolean satisfiability, as well as in our understanding of problem hardness beyond standard worst-case complexity. We consider random problem distributions from a highly structured problem domain that generalizes the Quasigroup Completion Problem (QCP) and Quasigroup with Holes (QWH), a widely used domain that captures the structure underlying a range of real-world applications. Our problem domain is also a generalization of the well-known Sudoku puzzle: we consider Sudoku instances of arbitrary order, with the additional generalization that the block regions can have rectangular shape, in addition to the standard square shape. We evaluate the computational hardness of Generalized Sudoku instances for different parameter settings. Our experimental hardness results show that we can generate instances that are considerably harder than QCP/QWH instances of the same size. More interestingly, we show the impact of different balancing strategies on problem hardness. We also provide insights into backbone variables in Generalized Sudoku instances and how they correlate with problem hardness.
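To make the "rectangular blocks" generalization concrete, here is a small validity check for a Generalized Sudoku solution of order n = R*C with R-by-C blocks (R = C recovers standard Sudoku); the 6x6 example grid is ours, not from the paper:

```python
# Sketch: validity check for a Generalized Sudoku solution where the block
# regions are R-by-C rectangles instead of squares.

def is_valid(grid, R, C):
    n = R * C
    symbols = set(range(1, n + 1))
    rows_ok = all(set(row) == symbols for row in grid)
    cols_ok = all({grid[r][c] for r in range(n)} == symbols
                  for c in range(n))
    blocks_ok = all(
        {grid[br + i][bc + j] for i in range(R) for j in range(C)} == symbols
        for br in range(0, n, R) for bc in range(0, n, C))
    return rows_ok and cols_ok and blocks_ok

grid = [[1, 2, 3, 4, 5, 6],      # a 6x6 instance with 2x3 blocks
        [4, 5, 6, 1, 2, 3],
        [2, 3, 1, 5, 6, 4],
        [5, 6, 4, 2, 3, 1],
        [3, 1, 2, 6, 4, 5],
        [6, 4, 5, 3, 1, 2]]
print(is_valid(grid, R=2, C=3))   # -> True
```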
Abstract:
The formation and semiclassical evaporation of two-dimensional black holes is studied in an exactly solvable model. Above a certain threshold energy flux, collapsing matter forms a singularity inside an apparent horizon. As the black hole evaporates the apparent horizon recedes and meets the singularity in a finite proper time. The singularity emerges naked, and future evolution of the geometry requires boundary conditions to be imposed there. There is a natural choice of boundary conditions which matches the evaporated black hole solution onto the linear dilaton vacuum. Below the threshold energy flux no horizon forms and boundary conditions can be imposed where infalling matter is reflected from a timelike boundary. All information is recovered at spatial infinity in this case.
Abstract:
In the context of a two-dimensional exactly solvable model, the dynamics of quantum black holes is obtained by analytically continuing the description of the regime where no black hole is formed. The resulting spectrum of outgoing radiation departs from the one predicted by the Hawking model in the region where the outgoing modes arise from the horizon with Planck-order frequencies. This occurs early in the evaporation process, and the resulting physical picture is unconventional. The theory predicts that black holes will only radiate out an energy of the order of the Planck mass, stabilizing after a transitory period. The continuation from a regime without black hole formation (accessible in the 1+1 gravity theory considered) is implicit in an S-matrix approach and suggests in this way a possible solution to the problem of information loss.
Abstract:
A common belief is that further quantum corrections near the singularity of a large black hole should not substantially modify the semiclassical picture of black hole evaporation; in particular, the outgoing spectrum of radiation should be very close to the thermal spectrum predicted by Hawking. In this paper we explore a possible counterexample: in the context of dilaton gravity, we find that nonperturbative quantum corrections which are important in strong-coupling regions may completely alter the semiclassical picture, to the extent that the presumptive spacelike boundary becomes timelike, thereby changing the causal structure of the semiclassical geometry. As a result, only a small fraction of the total energy is radiated outside the fake event horizon; most of the energy in fact emerges at later retarded times, and there is no problem of information loss. This may constitute a general characteristic of quantum black holes: quantum gravity might be such as to prevent the formation of global event horizons.
Abstract:
Differentiating between photoallergenic and phototoxic reactions induced by low molecular weight compounds is a current problem. The use of keratinocytes as a potential tool for detecting photoallergens, as opposed to photoirritants, is considered an interesting strategy for developing in vitro methods. We have previously demonstrated the possibility of using the human keratinocyte cell line NCTC2544 and the production of interleukin-18 (IL-18) to screen low molecular weight sensitizers. The purpose of this work was to explore the possibility of using the NCTC2544 assay to identify photoallergens and discriminate them from phototoxic chemicals. First, we identified a suitable UV-irradiation condition (3.5 J/cm2) by investigating the effect of UVA irradiation on intracellular IL-18 in untreated or chlorpromazine (a representative phototoxic compound)-treated NCTC2544 cells. Then, the effect of UVA irradiation on NCTC2544 cells treated with increasing concentrations of 15 compounds was investigated, including photoallergens (benzophenone, 4-tert-butyl-4'-methoxydibenzoylmethane, 2-ethylhexyl-p-methoxycinnamate, ketoprofen, 6-methylcoumarin); compounds that are both photoirritant and photoallergen (4-aminobenzoic acid, chlorpromazine, promethazine); photoirritants (acridine, ibuprofen, 8-methoxypsoralen, retinoic acid); and negative compounds (lactic acid, SDS and p-phenylenediamine). Twenty-four hours after exposure, cytotoxicity was evaluated by the MTT assay or LDH leakage, while ELISA was used to measure the production of IL-18. At the maximal concentration assayed with non-cytotoxic effects (CV80 under irradiated conditions), all tested photoallergens induced a significant and dose-dependent increase of intracellular IL-18 following UVA irradiation, whereas photoirritants failed to do so. We suggest that this system may be useful for the in vitro evaluation of the photoallergic potential of chemicals.
Abstract:
Alteration and contamination processes modify the chemical composition of ceramic artefacts. This is not restricted to the elements directly affected: because chemical data are compositional, constrained by the unit-sum restriction, changes in some elements alter the overall concentrations as well. Since it is impossible to know prior to data treatment whether the original compositions have been changed by such processes, the methodological approach used in provenance studies must be robust enough to handle materials that might have been altered or contaminated. The ability of the logratio transformation proposed by Aitchison to handle compositional data is studied and compared with that of current data treatments. The logratio transformation appears to offer the most robust approach.
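For the reader unfamiliar with Aitchison's approach, the sketch below computes the centered logratio (clr) transform, one of his logratio transformations; the oxide labels and percentages are hypothetical:

```python
# Sketch: Aitchison's centered logratio (clr) transformation, which maps a
# composition (positive parts under a unit-sum constraint) into real space
# so that standard multivariate methods apply.
import numpy as np

def clr(x):
    """Centered logratio: log of each part over their geometric mean."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))      # geometric mean of the parts
    return np.log(x / g)

sherd = np.array([55.0, 20.0, 15.0, 10.0])  # e.g. SiO2, Al2O3, CaO, Fe2O3 (%)
z = clr(sherd / sherd.sum())                # close to unit sum first
print(z, z.sum())                           # clr coordinates sum to 0
```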
Abstract:
Network neutrality is a growing policy controversy. Traffic management techniques affect not only high-speed, high-money content, but by extension all other content too. Internet regulators and users may tolerate much more discrimination in the interests of innovation. For instance, in the absence of regulatory oversight, ISPs could use Deep Packet Inspection (DPI) to block some content altogether if they decide it is not to the benefit of ISPs, copyright holders, parents or the government. ISP blocking is currently widespread in controlling spam email and, in some countries, in blocking sexually graphic illegal images. In 1999 this led to scrutiny of the foreclosure of Instant Messaging and of a video and cable-telephony horizontal merger. Fourteen years later, in 2013, net neutrality laws had been implemented in Slovenia, the Netherlands, Chile and Finland, regulation in the United States and Canada, co-regulation in Norway, and self-regulation in Japan, the United Kingdom and many other European countries. Both Germany and France debated new net neutrality legislation in mid-2013, and the European Commission announced on 11 September 2013 that it would aim to introduce legislation in early 2014. This paper analyses these legal developments, in particular the difficulty of assessing reasonable traffic management and ‘specialized’ (i.e. unregulated) faster services in both EU and US law. It also assesses net neutrality law against international legal norms for user privacy and freedom of expression.
Abstract:
A minimum cost spanning tree (mcst) problem analyzes how to efficiently connect individuals to a source when they are located at different places. Once the efficient tree is obtained, the question of how to allocate the total cost among the involved agents defines, in a natural way, a conflicting claims situation. For instance, we may consider the endowment to be the total cost of the network, whereas each individual's claim is the maximum amount she can be allocated, that is, her connection cost to the source. We thus have a conflicting claims problem, so we can apply claims rules in order to obtain an allocation of the total cost. Nevertheless, the allocation obtained by using claims rules might not satisfy some appealing properties (in particular, it need not belong to the core of the associated cooperative game). We also define other natural claims problems that arise when analyzing the maximum and minimum amounts that an individual should pay in order to support the minimum cost tree. Keywords: Minimum cost spanning tree problem, Claims problem, Core. JEL classification: C71, D63, D71.
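A minimal sketch of the setup described above, on a made-up four-node network (graph, costs, and the choice of the proportional rule are all ours; the paper studies which rules behave well, not this particular one):

```python
# Sketch: build the minimum cost spanning tree with Prim's algorithm, then
# read off the induced claims problem: endowment = total tree cost,
# claim_i = agent i's direct connection cost to the source (node 0).

cost = {0: {1: 6, 2: 8, 3: 10},        # symmetric connection costs among
        1: {0: 6, 2: 2, 3: 5},         # the source 0 and agents 1..3
        2: {0: 8, 1: 2, 3: 3},
        3: {0: 10, 1: 5, 2: 3}}

def prim_mcst(cost, source=0):
    """Prim's algorithm; returns the edges of a minimum cost spanning tree."""
    in_tree, edges = {source}, []
    while len(in_tree) < len(cost):
        u, v = min(((i, j) for i in in_tree
                    for j in cost[i] if j not in in_tree),
                   key=lambda e: cost[e[0]][e[1]])
        in_tree.add(v); edges.append((u, v))
    return edges

tree = prim_mcst(cost)                        # [(0, 1), (1, 2), (2, 3)]
endowment = sum(cost[u][v] for u, v in tree)  # total cost to share: 11
claims = {i: cost[0][i] for i in (1, 2, 3)}   # direct connection costs

total = sum(claims.values())                  # 24 > 11: claims conflict
shares = {i: endowment * c / total for i, c in claims.items()}
print(tree, endowment, shares)                # proportional-rule allocation
```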