128 results for "yields"
Abstract:
We present supergravity solutions for 1/8-supersymmetric black supertubes with three charges and three dipoles. Their reduction to five dimensions yields supersymmetric black rings with regular horizons and two independent angular momenta. The general solution contains seven independent parameters and provides the first example of nonuniqueness of supersymmetric black holes. In ten dimensions, the solutions can be realized as D1-D5-P black supertubes. We also present a worldvolume construction of a supertube that exhibits three dipoles explicitly. This description allows an arbitrary cross section but captures only one of the angular momenta.
Abstract:
(2+1)-dimensional anti-de Sitter (AdS) gravity is quantized in the presence of an external scalar field. We find that the coupling between the scalar field and gravity is equivalently described by a perturbed conformal field theory at the boundary of AdS3. This allows us to perform a microscopic computation of the transition rates between black hole states due to absorption and induced emission of the scalar field. Detailed thermodynamic balance then yields Hawking radiation as spontaneous emission, and we find agreement with the semiclassical result, including greybody factors. This result also has application to four- and five-dimensional black holes in supergravity.
Abstract:
We recently showed that a heavy quark moving sufficiently fast through a quark-gluon plasma may lose energy by Cherenkov-radiating mesons [1]. Here we review our previous holographic calculation of the energy loss in N = 4 Super Yang-Mills and extend it to longitudinal vector mesons and scalar mesons. We also discuss phenomenological implications for heavy-ion collision experiments. Although the Cherenkov energy loss is an O(1/Nc) effect, a ballpark estimate yields a value of dE/dx for Nc = 3 that is comparable to that of other mechanisms.
Abstract:
The ac electrical response is studied in thin films composed of well-defined nanometric Co particles embedded in an insulating ZrO2 matrix which tends to coat them, preventing the formation of aggregates. In the dielectric regime, ac transport originates from the competition between interparticle capacitive Cp and tunneling Rt channels, the latter being thermally assisted. This competition yields an absorption phenomenon at a characteristic frequency 1/(RtCp), which is observed in the range 10–10 000 Hz. In this way, the effective ac properties mimic the universal response of disordered dielectric materials. Temperature and frequency determine the complexity and nature of the ac electrical paths, which have been successfully modeled by an Rt-Cp network.
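As a minimal numerical sketch of the mechanism described above, taking the characteristic frequency to be 1/(RtCp) as stated in the abstract (the component values below are illustrative assumptions, not measured data):

```python
def absorption_frequency(r_t: float, c_p: float) -> float:
    """Characteristic absorption frequency f = 1/(Rt*Cp) of a competing
    tunneling-resistance / interparticle-capacitance channel pair."""
    return 1.0 / (r_t * c_p)

# Illustrative values: Rt = 1e12 ohm (thermally assisted tunneling),
# Cp = 1e-16 F (interparticle capacitance) -> f = 10 kHz, near the top
# of the observed 10-10 000 Hz window.
f = absorption_frequency(1e12, 1e-16)
```

Because Rt is thermally assisted, this frequency shifts with temperature, which is consistent with temperature and frequency jointly setting the ac paths.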
Abstract:
A common way to model multiclass classification problems is by means of Error-Correcting Output Codes (ECOC). Given a multiclass problem, the ECOC technique designs a codeword for each class, where each position of the code identifies the membership of the class in a given binary problem. A classification decision is obtained by assigning the label of the class with the closest codeword. One of the main requirements of the ECOC design is that the base classifier be capable of splitting the subgroups of classes of each binary problem. However, we cannot guarantee that a linear classifier can model convex regions. Furthermore, nonlinear classifiers also fail to handle some types of surfaces. In this paper, we present a novel strategy to model multiclass classification problems using subclass information in the ECOC framework. Complex problems are solved by splitting the original set of classes into subclasses and embedding the binary problems in a problem-dependent ECOC design. Experimental results show that the proposed splitting procedure yields better performance when the class overlap or the distribution of the training objects conceals the decision boundaries for the base classifier. The results are even more significant when one has a sufficiently large training set.
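Decoding in the ECOC framework reduces to a nearest-codeword search; a minimal sketch with a hypothetical 3-class, 4-dichotomy binary codebook (the codewords are illustrative, not a design from the paper):

```python
# Hypothetical codebook: one +/-1 codeword per class, one column per
# binary problem. A ternary ECOC would additionally allow 0 entries
# for classes a dichotomy ignores.
CODEBOOK = {
    0: (+1, +1, -1, -1),
    1: (+1, -1, +1, -1),
    2: (-1, +1, +1, +1),
}

def ecoc_decode(outputs):
    """Assign the label of the class whose codeword has minimum Hamming
    distance to the tuple of binary classifier outputs."""
    def hamming(code):
        return sum(c != o for c, o in zip(code, outputs))
    return min(CODEBOOK, key=lambda cls: hamming(CODEBOOK[cls]))
```

With an error-correcting design, a single flipped binary output still decodes to the right class, which is the point of the redundancy.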
Abstract:
In this paper we review level models of interest rates in Chile. Beyond the traditional level models of Chan, Karolyi, Longstaff and Sanders (1992) for the US and Parisi (1998) for Chile, estimated by maximum likelihood, we allow the conditional volatility to incorporate unexpected information shocks (the GARCH model) and also to be a function of the level of the interest rate (the TVP-LEVEL model), as in Brenner, Harjes and Kroner (1996). To this end we use yields from the recognition-bond market instead of monthly average PDBC auction yields, enlarging the size and frequency of the sample to four weekly yield series with different terms to maturity: 1 year, 5 years, 10 years and 15 years. The main results of the study can be summarized as follows: the volatility of unexpected rate changes depends positively on the level of rates, especially in the TVP-LEVEL model. We find evidence of mean reversion, so that interest rate increments were not independent, contrary to what Brenner et al. obtained for the US. The LEVEL models are not able to fit volatility adequately in comparison with a GARCH(1,1) model, and finally, the TVP-LEVEL model does not outperform the GARCH(1,1) model.
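As a hedged sketch of the three conditional-variance specifications compared above (the functional forms follow the CKLS and Brenner-Harjes-Kroner literature; parameter names and values are illustrative, not the paper's estimates):

```python
def level_vol(r_prev, sigma2, gamma):
    """LEVEL (CKLS-style) conditional variance: sigma^2 * r^(2*gamma),
    i.e. volatility driven purely by the rate level."""
    return sigma2 * r_prev ** (2 * gamma)

def garch_vol(eps_prev2, h_prev, omega, a, b):
    """GARCH(1,1): h_t = omega + a*eps_{t-1}^2 + b*h_{t-1},
    i.e. volatility driven by unexpected information shocks."""
    return omega + a * eps_prev2 + b * h_prev

def tvp_level_vol(r_prev, eps_prev2, h_prev, omega, a, b, gamma):
    """TVP-LEVEL: a GARCH variance scaled by the rate level, combining
    both channels."""
    return garch_vol(eps_prev2, h_prev, omega, a, b) * r_prev ** (2 * gamma)
```

The paper's finding that volatility "depends positively on the level of rates" corresponds to gamma > 0 in the last two forms.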
Abstract:
We observe dendritic patterns in fluid flow in an anisotropic Hele-Shaw cell and measure the tip shapes and trajectories of individual dendritic branches under conditions where the pattern growth appears to be dominated by surface tension anisotropy and also under conditions where kinetic effects appear dominant. In each case, the tip position follows a power law in time, but the exponent of this power law can vary significantly among flow realizations. Averaging many growth exponents α yields α = 0.64 ± 0.09 in the surface-tension-dominated regime and α = 0.66 ± 0.09 in the kinetic regime. Restricting the analysis to realizations where α is very close to 0.6 shows great regularity across pattern regimes in the coefficient of the temporal dependence of the tip trajectory.
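A growth exponent like α above is commonly extracted by a least-squares fit in log-log space; a minimal sketch (the data in the test are synthetic, not the paper's measurements):

```python
import math

def fit_power_law(t, x):
    """Fit x = c * t**a by ordinary least squares on (log t, log x);
    returns (a, c). a plays the role of the growth exponent alpha."""
    lt = [math.log(ti) for ti in t]
    lx = [math.log(xi) for xi in x]
    n = len(t)
    mt = sum(lt) / n
    mx = sum(lx) / n
    a = (sum((u - mt) * (v - mx) for u, v in zip(lt, lx))
         / sum((u - mt) ** 2 for u in lt))
    c = math.exp(mx - a * mt)
    return a, c
```

Averaging the fitted `a` over many realizations, as the abstract describes, then gives the regime-level exponent and its spread.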
Abstract:
Interfacial hydrodynamic instabilities arise in a range of chemical systems. One mechanism for instability is the occurrence of unstable density gradients due to the accumulation of reaction products. In this paper we conduct two-dimensional nonlinear numerical simulations for a member of this class of systems: the methylene-blue–glucose reaction. The result of these reactions is the oxidation of glucose to a relatively, but marginally, dense product, gluconic acid, that accumulates at oxygen-permeable interfaces, such as the surface open to the atmosphere. The reaction is catalyzed by methylene-blue. We show that simulations help to disentangle the mechanisms responsible for the onset of instability and the evolution of patterns, and we demonstrate that some of the results are remarkably consistent with experiments. We probe the impact of the upper oxygen boundary condition, for fixed flux, fixed concentration, or mixed boundary conditions, and find significant qualitative differences in solution behavior; structures either attract or repel one another depending on the boundary condition imposed. We suggest that measurement of the form of the boundary condition is possible via observation of oxygen penetration, and improved product yields may be obtained via proper control of boundary conditions in an engineering setting. We also investigate the dependence on parameters such as the Rayleigh number and depth. Finally, we find that pseudo-steady linear and weakly nonlinear techniques described elsewhere are useful tools for predicting the behavior of instabilities beyond their formal range of validity, as good agreement is obtained with the simulations.
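The three upper-boundary conditions compared above can be sketched for a 1D discretized oxygen field; the discretization, sign conventions, and parameter names below are illustrative assumptions, not the paper's scheme:

```python
def apply_top_bc(c, kind, dz, value=1.0, flux=0.1, k=1.0):
    """Set the top node c[0] of a 1D oxygen concentration array for one of
    three boundary conditions (first-order one-sided differences):
      'dirichlet': fixed concentration   c[0] = value
      'neumann'  : fixed flux            (c[0] - c[1]) / dz = flux
      'robin'    : mixed                 -(c[0] - c[1]) / dz = k * (c[0] - value)
    """
    if kind == "dirichlet":
        c[0] = value
    elif kind == "neumann":
        c[0] = c[1] + flux * dz
    elif kind == "robin":
        # Solve -(c0 - c1)/dz = k*(c0 - value) for c0.
        c[0] = (c[1] / dz + k * value) / (1.0 / dz + k)
    else:
        raise ValueError(f"unknown boundary condition: {kind}")
    return c
```

The Robin (mixed) form interpolates between the other two: k → ∞ recovers fixed concentration, k → 0 recovers zero flux.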
Abstract:
This work deals with the elaboration of flood hazard maps. These maps reflect the areas prone to floods based on the effects of Hurricane Mitch in the Municipality of Jucuarán, El Salvador. Stream channels located in the coastal range in the SE of El Salvador flow into the Pacific Ocean and generate alluvial fans. The communities that often inhabit these fans can be affected by floods. The geomorphology of these stream basins is associated with small areas, steep slopes, well-developed regolith and extensive deforestation. These features play a key role in the generation of flash floods. This zone lacks comprehensive rainfall data and gauging stations. The most detailed topographic maps are on a scale of 1:25 000. Given that this scale was not sufficiently detailed, we used aerial photographs enlarged to a scale of 1:8000. The effects of Hurricane Mitch mapped on these photographs were regarded as the reference event. Flood maps have a dual purpose: (1) community emergency plans, and (2) regional land use planning carried out by local authorities. The geomorphological method is based on mapping the geomorphological evidence (alluvial fans, preferential stream channels, erosion and sedimentation, man-made terraces). Following the interpretation of the photographs, this information was validated in the field and complemented by eyewitness reports such as the height of water and flow typology. In addition, community workshops were organized to obtain information about the evolution and the impact of the phenomena. The superimposition of this information enabled us to obtain a comprehensive geomorphological map. Another aim of the study was the calculation of the peak discharge using the Manning and paleohydraulic methods and estimates based on geomorphological criteria. The results were compared with those obtained using the rational method. Significant differences in the order of magnitude of the calculated discharges were noted.
The rational method underestimated the results owing to short and discontinuous periods of rainfall data, with the result that probabilistic equations cannot be applied. The Manning method yields a wide range of results because of its dependence on the roughness coefficient. The paleohydraulic method yielded higher values than the rational and Manning methods; however, it is possible that bigger boulders could have been moved had they existed. These discharge values are lower than those obtained by the geomorphological estimates, i.e. much closer to reality. The flood hazard maps were derived from the comprehensive geomorphological map. Three categories of hazard were established (very high, high and moderate) using flood energy, water height and flow velocity deduced from geomorphological evidence and eyewitness reports.
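A minimal sketch of the two discharge formulas compared above, in SI units; the unit handling for the rational method follows the standard Q = CiA/3.6 convention, an assumption not spelled out in the abstract:

```python
def manning_discharge(area, hydraulic_radius, slope, n):
    """Manning formula: Q = (1/n) * A * R^(2/3) * S^(1/2), Q in m^3/s.
    The strong dependence on the roughness coefficient n is what makes
    this method yield a wide range of results."""
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

def rational_discharge(runoff_coeff, intensity_mm_per_h, area_km2):
    """Rational method: Q = C * i * A, with i in mm/h and A in km^2
    converted to SI, so Q in m^3/s."""
    return runoff_coeff * (intensity_mm_per_h / 1000.0 / 3600.0) * (area_km2 * 1e6)
```

Evaluating Manning over a plausible range of n for the same cross section makes the reported order-of-magnitude spread between methods concrete.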
Abstract:
We develop several results on hitting probabilities of random fields which highlight the role of the dimension of the parameter space. This yields upper and lower bounds in terms of Hausdorff measure and Bessel-Riesz capacity, respectively. We apply these results to a system of stochastic wave equations in spatial dimension k ≥ 1 driven by a d-dimensional spatially homogeneous additive Gaussian noise that is white in time and colored in space.
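Schematically, hitting-probability bounds of this kind typically take the following two-sided form (this is the generic pattern from the hitting-probabilities literature, not the paper's precise statement; the exponent written here as d − β is indicative, and the true exponents depend on the Hölder regularity of the field and on the parameter-space dimension k):

```latex
c \,\mathrm{Cap}_{d-\beta}(A)
\;\le\;
\mathbb{P}\{\, X(I) \cap A \neq \varnothing \,\}
\;\le\;
C \,\mathcal{H}_{d-\beta}(A),
```

where $A \subset \mathbb{R}^d$ is a Borel set, $\mathrm{Cap}$ denotes Bessel-Riesz capacity, $\mathcal{H}$ denotes Hausdorff measure, and $c, C$ are constants depending on $I$ and the law of the field $X$.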
Abstract:
The present study proposes a modification of one of the most frequently applied effect size procedures in single-case data analysis: the percent of nonoverlapping data. In contrast to other techniques, the calculation and interpretation of this procedure is straightforward, and it can easily be complemented by visual inspection of the graphed data. Although the percent of nonoverlapping data has been found to perform reasonably well on N = 1 data, the magnitude of effect estimates it yields can be distorted by trend and autocorrelation. Therefore, the data correction procedure focuses on removing the baseline trend from the data prior to estimating the change produced in the behavior by the intervention. A simulation study is carried out in order to compare the original and the modified procedures under several experimental conditions. The results suggest that the new proposal is unaffected by trend and autocorrelation and can be used in the case of unstable baselines and sequentially related measurements.
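A minimal sketch of the two-step procedure, assuming a plain OLS fit for the baseline trend (the paper's exact correction may differ):

```python
def pnd(baseline, treatment, expected_increase=True):
    """Percent of nonoverlapping data: share of treatment points beyond
    the most extreme baseline point, in the expected direction."""
    if expected_increase:
        ref = max(baseline)
        hits = sum(x > ref for x in treatment)
    else:
        ref = min(baseline)
        hits = sum(x < ref for x in treatment)
    return 100.0 * hits / len(treatment)

def detrend(baseline, treatment):
    """Fit a least-squares linear trend to the baseline phase and remove
    its extrapolation from BOTH phases before computing PND."""
    n = len(baseline)
    ts = range(n)
    mt = sum(ts) / n
    my = sum(baseline) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, baseline))
             / sum((t - mt) ** 2 for t in ts))
    intercept = my - slope * mt
    corr_base = [y - (intercept + slope * t) for t, y in enumerate(baseline)]
    corr_treat = [y - (intercept + slope * (n + t)) for t, y in enumerate(treatment)]
    return corr_base, corr_treat
```

On a baseline with a rising trend, raw PND credits the trend itself as an effect; applying `detrend` first removes exactly that distortion.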
Abstract:
The alternatives used for minimizing the usage of chlorine dioxide in bleaching sequences included a hot acid hydrolysis (Ahot) stage, the use of hot chlorine dioxide (Dhot) and ozone stages at medium and high consistency (Zmc and Zhc), in addition to stages with atmospheric hydrogen peroxide (P) and pressurized hydrogen peroxide (PO). The results were interpreted based on the cost of the chemical products, bleaching process yields, and the minimization of the environmental impact of the bleaching process. In spite of some process restrictions, high ISO brightness levels of around 90% were maintained. Additionally, the inclusion of stages such as acid hydrolysis, pressurized peroxide and ozone in the bleaching sequences provided an increase in operating flexibility aimed at reducing environmental impact (ECF Light). The Dhot(EOP)D(PO) sequence presented a lower operating cost for ISO brightness above 92%. However, this kind of sequence did not allow closing the wastewater circuit, even partially. For ISO brightness levels around 91%, the AhotZhcDP sequence presented a lower operating cost than the others.
Abstract:
Excitation-continuous music instrument control patterns are often not explicitly represented in current sound synthesis techniques when applied to automatic performance. Both physical model-based and sample-based synthesis paradigms would benefit from a flexible and accurate instrument control model, enabling the improvement of naturalness and realism. We present a framework for modeling bowing control parameters in violin performance. Nearly non-intrusive sensing techniques allow for accurate acquisition of relevant timbre-related bowing control parameter signals. We model the temporal contour of bow velocity, bow pressing force, and bow-bridge distance as sequences of short cubic Bézier curve segments. Considering different articulations, dynamics, and performance contexts, a number of note classes are defined. Contours of bowing parameters in a performance database are analyzed at note level by following a predefined grammar that dictates characteristics of curve segment sequences for each of the classes in consideration. As a result, contour analysis of bowing parameters of each note yields an optimal representation vector that is sufficient for reconstructing the original contours with significant fidelity. From the resulting representation vectors, we construct a statistical model based on Gaussian mixtures suitable for both the analysis and synthesis of bowing parameter contours. By using the estimated models, synthetic contours can be generated through a bow planning algorithm able to reproduce possible constraints caused by the finite length of the bow. Rendered contours are successfully used in two preliminary synthesis frameworks: digital waveguide-based bowed string physical modeling and sample-based spectral-domain synthesis.
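The building block of the contour representation above is the cubic Bézier segment; a minimal evaluation sketch (scalar control points p0..p3 standing in for, say, bow-velocity values, which is an illustrative choice):

```python
def bezier_cubic(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier segment at t in [0, 1], Bernstein form:
    B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1 + 3(1-t) t^2 p2 + t^3 p3.
    A bowing-parameter contour is a sequence of such short segments."""
    s = 1.0 - t
    return s ** 3 * p0 + 3 * s ** 2 * t * p1 + 3 * s * t ** 2 * p2 + t ** 3 * p3
```

The segment interpolates its endpoints (B(0) = p0, B(1) = p3), so chaining segments end-to-start yields a continuous contour, with the inner control points p1, p2 shaping the transient between them.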
Abstract:
In this paper we propose an endpoint detection system based on several features extracted from each speech frame, followed by a robust classifier (e.g., AdaBoost and bagging of decision trees, or a multilayer perceptron) and a finite state automaton (FSA). We compare the use of four different classifiers in this task and present results for each. The FSA module consists of a 4-state decision logic that filters out false alarms and false positives. The look-ahead of the proposed method was 7 frames, the number of frames that maximized the accuracy of the system. The system was tested with real signals recorded inside a car, with signal-to-noise ratios ranging from 6 dB to 30 dB. Finally, we present experimental results demonstrating that the system yields robust endpoint detection.
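A hedged sketch of a 4-state decision logic of this kind; the state names, counters, and frame thresholds below are illustrative assumptions, not the paper's exact automaton:

```python
# Four states of a typical endpoint-detection FSA.
SILENCE, MAYBE_SPEECH, SPEECH, MAYBE_SILENCE = range(4)

def fsa_step(state, count, is_speech_frame, on_frames=3, off_frames=5):
    """One transition of the endpoint FSA. Bursts shorter than on_frames
    are rejected as false alarms; gaps shorter than off_frames do not
    end a speech segment. Returns (new_state, new_count)."""
    if state == SILENCE:
        return (MAYBE_SPEECH, 1) if is_speech_frame else (SILENCE, 0)
    if state == MAYBE_SPEECH:
        if not is_speech_frame:
            return SILENCE, 0          # burst too short: false alarm
        count += 1
        return (SPEECH, 0) if count >= on_frames else (MAYBE_SPEECH, count)
    if state == SPEECH:
        return (SPEECH, 0) if is_speech_frame else (MAYBE_SILENCE, 1)
    # MAYBE_SILENCE
    if is_speech_frame:
        return SPEECH, 0               # gap too short: keep the segment
    count += 1
    return (SILENCE, 0) if count >= off_frames else (MAYBE_SILENCE, count)
```

Driving this step function with per-frame classifier decisions is what turns noisy frame labels into clean speech endpoints.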