89 results for Zigzag edges


Relevance: 10.00%

Publisher:

Abstract:

Naturally occurring boundaries between bundles of 90° stripe domains, which form in BaTiO3 lamellae on cooling through the Curie temperature, have been characterized using both piezoresponse force microscopy (PFM) and scanning transmission electron microscopy (STEM). Detailed interpretation of the dipole configurations present at these boundaries (using data taken from PFM) shows that in the vast majority of cases they are composed of simple zigzag 180° domain walls. Topological information from STEM shows that occasionally domain bundle boundaries can support chains of dipole flux closure and quadrupole nanostructures, but these kinds of boundaries are comparatively rare; when such chains do exist, it is notable that singularities at the cores of the dipole structures are avoided. The symmetry of the boundary shows that diads and centers of inversion exist at positions where core singularities should have been expected.

Measures of icon designs rely heavily on surveys of the perceptions of population samples. Thus, measuring the extent to which changes in the structure of an icon will alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using image-processing routines in Matlab (MathWorks, 2001). The six properties were icon foreground, the number of objects in an icon, the number of holes in those objects, and two calculations of icon edges and of homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r_s = .65) and edge information (r_s = .64).
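The Matlab routines used in the study are not reproduced here, but three of the six properties (foreground proportion, object count, hole count) can be sketched with standard image-processing primitives. A minimal illustration, assuming SciPy's `ndimage` module and a toy binary icon (not any icon from the study):

```python
import numpy as np
from scipy import ndimage

def icon_properties(icon):
    """Simple structural measures of a binary icon.

    icon: 2D array, 1 = foreground, 0 = background.
    Returns (foreground fraction, number of objects, number of holes).
    """
    foreground = icon.mean()                      # fraction of filled pixels
    _, n_objects = ndimage.label(icon)            # 4-connected components
    # Holes are background components that do not touch the border.
    bg_labels, n_bg = ndimage.label(icon == 0)
    border = set(bg_labels[0, :]) | set(bg_labels[-1, :]) \
           | set(bg_labels[:, 0]) | set(bg_labels[:, -1])
    border.discard(0)                             # 0 marks foreground pixels
    n_holes = n_bg - len(border)
    return foreground, n_objects, n_holes

# A 7x7 "ring" icon: one object enclosing one hole.
ring = np.zeros((7, 7), dtype=int)
ring[1:6, 1:6] = 1
ring[2:5, 2:5] = 0
print(icon_properties(ring))   # (0.3265..., 1, 1)
```

Edge information and structural variability would need additional measures (e.g. gradient counts and region statistics), which are omitted here.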

We review some recent developments in many-body perturbation theory (MBPT) calculations that have enabled the study of interfaces and defects. Starting from the theoretical basis of MBPT, Hedin's equations are presented, leading to the GW and GWΓ approximations. We introduce the perturbative approach, which is the one most commonly used for obtaining quasiparticle (QP) energies. The practical strategy presented for dealing with the frequency dependence of the self-energy operator is based on either plasmon-pole models (PPM) or the contour-deformation technique, with the latter being more accurate. We also discuss the extrapolar method for reducing the number of unoccupied states which need to be included explicitly in the calculations. The use of the PAW method in the framework of MBPT is also described. Finally, results obtained using MBPT for band offsets at interfaces and for defects are presented, with comments on the main difficulties and caveats.

Schematic representation of the QP corrections to the band edges (E-c and E-v) and a defect level (F) for a Si/SiO2 interface (Si and O atoms are represented in blue and red, respectively, in the ball-and-stick model) with an oxygen vacancy leading to a Si-Si bond (the Si atoms involved in this bond are colored light blue).

The quality of single crystal diamond obtained by microwave CVD processes has been drastically improved in the last 5 years thanks to surface pretreatment of the substrates [A. Tallaire, J. Achard, F. Silva, R.S. Sussmann, A. Gicquel, E. Rzepka, Physica Status Solidi (A) 201, 2419-2424 (2004); G. Bogdan, M. Nesladek, J. D'Haen, J. Maes, V.V. Moshchalkov, K. Haenen, M. D'Olieslaeger, Physica Status Solidi (A) 202, 2066-2072 (2005); M. Yamamoto, T. Teraji, T. Ito, Journal of Crystal Growth 285, 130-136 (2005)]. Additionally, recent results have unambiguously shown the occurrence of (110) faces on crystal edges and (113) faces on crystal corners [F. Silva, J. Achard, X. Bonnin, A. Michau, A. Tallaire, O. Brinza, A. Gicquel, Physica Status Solidi (A) 203, 3049-3055 (2006)]. We have developed a 3D geometrical growth model to account for the final crystal morphology. The basic parameters of this growth model are the relative displacement speeds of (111), (110) and (113) faces normalized to that of the (100) faces, respectively alpha, beta, and gamma. This model predicts both the final equilibrium shape of the crystal (i.e. after infinite growth time) and the crystal morphology as a function of alpha, beta, gamma, and deposition time.

An optimized operating point, deduced from the model, has been validated experimentally by measuring the growth rate in (100), (111), (110), and (113) orientations. Furthermore, the evolution of alpha, beta, gamma as a function of methane concentration in the gas discharge has been established. From these results, crystal growth strategies can be proposed in order, for example, to enlarge the deposition area. In particular, we will show, using the growth model, that the only possibility to significantly increase the deposition area is, for our growth conditions, to use a (113) oriented substrate. A comparison between the grown crystal and the model results will be discussed and characterizations of the grown film (Photoluminescence spectroscopy, EPR, SEM) will be presented. (C) 2008 Elsevier B.V. All rights reserved.
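As the text defines them, alpha, beta and gamma are simply the (111), (110) and (113) face displacement speeds normalized to that of the (100) face. A trivial sketch of that normalization, with hypothetical growth rates (not measurements from the paper):

```python
def growth_parameters(v100, v111, v110, v113):
    """Normalized face displacement speeds used by the 3D growth model:
    alpha, beta, gamma are the (111), (110) and (113) speeds relative to
    the (100) face.  Rates may be in any consistent unit, e.g. um/h."""
    alpha = v111 / v100
    beta = v110 / v100
    gamma = v113 / v100
    return alpha, beta, gamma

# Hypothetical growth rates (um/h), for illustration only.
alpha, beta, gamma = growth_parameters(v100=10.0, v111=8.0, v110=14.0, v113=12.0)
print(alpha, beta, gamma)   # 0.8 1.4 1.2
```

The model then maps a given (alpha, beta, gamma) and deposition time to a predicted crystal morphology; that geometric construction itself is not reproduced here.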

In specific solid-state materials, under the right conditions, collections of magnetic dipoles are known to spontaneously form into a variety of rather complex geometrical patterns, exemplified by vortex and skyrmion structures. While theoretically, similar patterns should be expected to form from electrical dipoles, they have not been clearly observed to date: the need for continued experimental exploration is therefore clear. In this Letter we report the discovery of a rather complex domain arrangement that has spontaneously formed along the edges of a thin single crystal ferroelectric sheet, due to surface-related depolarizing fields. Polarization patterns are such that nanoscale “flux-closure” loops are nested within a larger mesoscale flux closure object. Despite the orders of magnitude differences in size, the geometric forms of the dual-scale flux closure entities are rather similar.

Multiple Gaussian pulse interactions and scattering in nonlinear layered dielectric structures have been examined. Gaussian pulses with different centre frequencies and lengths are incident at oblique angles on a finite stack of nonlinear dielectric layers. The properties of the reflected and refracted waveforms and the effects of the structure and incident-pulse parameters on the mixing process are discussed. It is shown that the efficiency of forward emission at the combinatorial frequency can be considerably increased when the wavelengths of the interacting pulses are close to the edges of the electromagnetic bandgap. © 2012 IEEE.

We report on Suzaku observations of selected regions within the southern giant lobe of the radio galaxy Centaurus A. In our analysis we focus on distinct X-ray features detected with the X-ray Imaging Spectrometer within the range 0.5-10 keV, some of which are likely associated with fine structure of the lobe revealed by recent high-quality radio intensity and polarization maps. With the available photon statistics, we find that the spectral properties of the detected X-ray features are equally consistent with thermal emission from hot gas with temperatures kT > 1 keV, or with a power-law radiation continuum characterized by photon indices Γ ≈ 2.0 ± 0.5. However, the plasma parameters implied by these different models favor a synchrotron origin for the analyzed X-ray spots, indicating that a very efficient acceleration of electrons up to ≳ 10 TeV energies is taking place within the giant structure of Centaurus A, albeit only in isolated and compact regions associated with extended and highly polarized radio filaments. We also present a detailed analysis of the diffuse X-ray emission filling the whole field of view of the instrument, resulting in a tentative detection of a soft excess component best fitted by a thermal model with a temperature of kT ≈ 0.5 keV. The exact origin of the observed excess remains uncertain, although energetic considerations point to thermal gas filling the bulk of the volume of the lobe and mixed with the non-thermal plasma, rather than to the alternative scenario involving a condensation of the hot intergalactic medium around the edges of the expanding radio structure. If correct, this would be the first detection of the thermal content of the extended lobes of a radio galaxy in X-rays. The corresponding number density of the thermal gas in such a case is n_g ≈ 10^-4 cm^-3, while its pressure appears to be in almost exact equipartition with the volume-averaged non-thermal pressure provided by the radio-emitting electrons and the lobes' magnetic field. A prominent large-scale fluctuation of the Galactic foreground emission, resulting in excess foreground X-ray emission aligned with the lobe, cannot be ruled out. Although tentative, our findings potentially imply that the structure of the extended lobes in active galaxies is likely to be highly inhomogeneous and non-uniform, with magnetic reconnection and turbulent acceleration processes continuously converting magnetic energy to internal energy of the plasma particles, leading to possibly significant spatial and temporal variations in the plasma beta parameter around the volume-averaged equilibrium condition β ≈ 1.

This article offers a critical conceptual discussion and refinement of Chomsky’s (2000, 2001, 2007, 2008) phase system, addressing many of the problematic aspects highlighted in the critique of Boeckx & Grohmann (2007) and seeking to resolve these issues, in particular the stipulative and arbitrary properties of phases and phase edges encoded in the (various versions of the) Phase Impenetrability Condition (PIC). Chomsky’s (2000) original conception of phases as lexical subarrays is demonstrated to derive these properties straightforwardly once a single assumption about the pairwise composition of phases is made, and the PIC is reduced to its necessary core under the Strong Minimalist Thesis (SMT)—namely, the provision of an edge. Finally, a comparison is undertaken of the lexical-subarray conception of phases with the feature-inheritance system of Chomsky 2007, 2008, in which phases are simply the locus of uninterpretable features (probes). Both conceptions are argued to conform to the SMT, and both converge on a pairwise composition of phases. However, the two conceptions of phases are argued to be mutually incompatible in numerous fundamental ways, with no current prospect of unification. The lexical-subarray conception of phases is then to be preferred on grounds of greater empirical adequacy.

We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that, over a sequence of rounds, an adversary either inserts a node with arbitrary connections or deletes an arbitrary node from the network. The network responds to each such change by quick “repairs,” which consist of adding or deleting a small number of edges. These repairs essentially preserve closeness of nodes after adversarial deletions, without increasing node degrees by too much, in the following sense. At any point in the algorithm, nodes v and w whose distance would have been l in the graph formed by considering only the adversarial insertions (not the adversarial deletions), will be at distance at most l log n in the actual graph, where n is the total number of vertices seen so far. Similarly, at any point, a node v whose degree would have been d in the graph with adversarial insertions only, will have degree at most 3d in the actual graph. Our distributed data structure, which we call the Forgiving Graph, has low latency and bandwidth requirements. The Forgiving Graph improves on the Forgiving Tree distributed data structure from Hayes et al. (2008) in the following ways: 1) it ensures low stretch over all pairs of nodes, while the Forgiving Tree only ensures low diameter increase; 2) it handles both node insertions and deletions, while the Forgiving Tree only handles deletions; 3) it requires only a very simple and minimal initialization phase, while the Forgiving Tree initially requires construction of a spanning tree of the network.
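The Forgiving Graph's repair step is a carefully designed structure; as a toy illustration of the kind of guarantee involved (bounded degree increase after a deletion), the following sketch uses a naive stand-in repair — reconnecting a deleted node's former neighbors along a path — on a small graph. The repair rule here is for illustration only, not the paper's actual construction:

```python
from collections import defaultdict

def delete_and_heal(adj, v):
    """Remove node v and reconnect its former neighbors along a path.
    A naive stand-in for a self-healing repair: after one deletion,
    each surviving node gains at most 2 edges, and the neighbors of
    the deleted node stay connected to one another."""
    neighbors = sorted(adj.pop(v))
    for u in neighbors:
        adj[u].discard(v)
    for a, b in zip(neighbors, neighbors[1:]):
        adj[a].add(b)
        adj[b].add(a)
    return adj

# Star graph: center 0 joined to nodes 1..5; the adversary deletes 0.
adj = defaultdict(set)
for i in range(1, 6):
    adj[0].add(i)
    adj[i].add(0)
before = {u: len(adj[u]) for u in adj if u != 0}
delete_and_heal(adj, 0)
print({u: len(adj[u]) for u in sorted(adj)})   # {1: 1, 2: 2, 3: 2, 4: 2, 5: 1}
```

A path repair keeps degrees low but stretches distances linearly; the Forgiving Graph's structure is precisely what keeps the stretch logarithmic instead.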

The spontaneous oxidation of CO adsorbates on a Pt electrode modified by Ru under open circuit (OC) conditions in perchloric acid solution has been followed, for the first time, using in situ FTIR spectroscopy, and the dynamics of the surface processes taking place have been elucidated. The IR data show that adsorbed CO is present on both the Ru and Pt domains and can be oxidized by the oxygen-containing adlayer on the Ru, in a chemical process, to produce CO2 under OC conditions. There is free exchange of CO between the Ru and Pt sites. Oxidation of CO may take place at the edges of the Ru islands, but CO transfer, at least on the time scale of these experiments, allows the two different populations to maintain equilibrium. Oxidation is limited in this region by the rate of supply of oxygen to the surface of the catalyst. A mechanism is postulated to explain the observed behavior.

Electing a leader is a fundamental task in distributed computing. In its implicit version, only the leader must know who is the elected leader. This paper focuses on studying the message and time complexity of randomized implicit leader election in synchronous distributed networks. Surprisingly, the most "obvious" complexity bounds have not been proven for randomized algorithms. The "obvious" lower bounds of Ω(m) messages (m is the number of edges in the network) and Ω(D) time (D is the network diameter) are non-trivial to show for randomized (Monte Carlo) algorithms. (Recent results showing that even Ω(n) (n is the number of nodes in the network) is not a lower bound on the messages in complete networks make the above bounds somewhat less obvious.) To the best of our knowledge, these basic lower bounds have not been established even for deterministic algorithms (except for the limited case of comparison algorithms, where it was also required that some nodes may not wake up spontaneously and that D and n were not known).

We establish these fundamental lower bounds in this paper for the general case, even for randomized Monte Carlo algorithms. Our lower bounds are universal in the sense that they hold for all universal algorithms (such algorithms should work for all graphs), apply to every D, m, and n, and hold even if D, m, and n are known, all the nodes wake up simultaneously, and the algorithms can make any use of the nodes' identities. To show that these bounds are tight, we present an O(m)-message algorithm. An O(D)-time algorithm is known. A slight adaptation of our lower bound technique gives rise to an Ω(m) message lower bound for randomized broadcast algorithms.

An interesting fundamental problem is whether both upper bounds (messages and time) can be reached simultaneously in the randomized setting for all graphs. (The answer is known to be negative in the deterministic setting). We answer this problem partially by presenting a randomized algorithm that matches both complexities in some cases. This already separates (for some cases) randomized algorithms from deterministic ones. As first steps towards the general case, we present several universal leader election algorithms with bounds that trade-off messages versus time. We view our results as a step towards understanding the complexity of universal leader election in distributed networks.
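The "implicit" flavor of the problem — only the leader must learn it is the leader — can be illustrated with a toy synchronous simulation in which every node draws a random rank and floods the largest rank seen so far. This naive flooding uses O(m·D) messages, well above the paper's O(m) bound; it is a sketch of the problem statement, not of the paper's algorithm:

```python
import random

def implicit_leader_election(adj, seed=0):
    """Toy synchronous simulation of randomized implicit leader election.
    Each node draws a random rank; in every round each node forwards the
    largest rank it has seen to all neighbors.  After at least diameter
    many rounds, exactly the node holding the global maximum learns it
    is the leader -- no other node needs to know the leader's identity."""
    rng = random.Random(seed)
    rank = {v: rng.random() for v in adj}   # distinct w.h.p.
    best = dict(rank)
    for _ in range(len(adj)):               # n rounds >= diameter
        new_best = dict(best)
        for v in adj:
            for u in adj[v]:
                new_best[u] = max(new_best[u], best[v])
        best = new_best
    # A node elects itself iff its own rank equals the maximum it has seen.
    return [v for v in adj if rank[v] == best[v]]

# 6-cycle network.
cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
leaders = implicit_leader_election(cycle)
print(leaders)   # exactly one self-declared leader
```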

BACKGROUND: There have been few histological or ultrastructural studies of the outer retina and choriocapillaris following panretinal photocoagulation therapy. This investigation examines the long-term morphological effects of panretinal photocoagulation in two patients with type II diabetes who had received laser treatment more than 6 months prior to death.

METHODS: Regions of retina and choroid from each patient were fixed in 2.5% glutaraldehyde, dissected out and examined using light microscopy and scanning and transmission electron microscopy.

RESULTS: After removing the neural retina, scanning electron microscopy of non-photocoagulated areas of the eye cups revealed normal cobblestone-like retinal pigment epithelial (RPE) cells. Regions with laser scars showed little RPE infiltration into the scar area, although large rounded cells often appeared in isolation within these areas. Sections of the retina and choroid in burn regions showed a complete absence of the outer nuclear layer and photoreceptor cells, with the inner retinal layers lying in close apposition to Bruch's membrane. Non-photocoagulated regions of the retina and choroid appeared normal in terms of both cell number and cell distribution. The RPE layer was absent within burn scars but many RPE-like cells appeared markedly hypertrophic at the edges of these regions. Bruch's membrane always remained intact, although the underlying choriocapillaris was clearly disrupted at the point of photocoagulation burns, appearing largely fibrosed and non-perfused. Occasional choroidal capillaries occurring in this region were typically small in profile and had plump non-fenestrated endothelium.

CONCLUSIONS: This study outlines retinal and choroidal cell responses to panretinal photocoagulation in diabetic patients and demonstrates an apparent reduction in the capacity of these tissues to repair laser damage.

In distributed networks, it is often useful for the nodes to be aware of dense subgraphs, e.g., such a dense subgraph could reveal dense substructures in otherwise sparse graphs (e.g. the World Wide Web or social networks); these might reveal community clusters or dense regions for possibly maintaining good communication infrastructure. In this work, we address the problem of self-awareness of nodes in a dynamic network with regards to graph density, i.e., we give distributed algorithms for maintaining dense subgraphs that the member nodes are aware of. The only knowledge that the nodes need is that of the dynamic diameter D, i.e., the maximum number of rounds it takes for a message to traverse the dynamic network. For our work, we consider a model where the number of nodes is fixed, but a powerful adversary can add or remove a limited number of edges from the network at each time step. The communication is by broadcast only and follows the CONGEST model. Our algorithms are continuously executed on the network, and at any time (after some initialization) each node will be aware if it is part (or not) of a particular dense subgraph. We give algorithms that (2 + ε)-approximate the densest subgraph and (3 + ε)-approximate the at-least-k-densest subgraph (for a given parameter k). Our algorithms work for a wide range of parameter values and run in O(D log n) time. Further, a special case of our results also gives the first fully decentralized approximation algorithms for densest and at-least-k-densest subgraph problems for static distributed graphs. © 2012 Springer-Verlag.
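For context, the approximation target here generalizes the classical centralized greedy "peeling" 2-approximation for densest subgraph (repeatedly delete a minimum-degree node, keep the densest intermediate subgraph). A minimal centralized sketch of that baseline — not the paper's distributed CONGEST algorithm:

```python
def densest_subgraph_peel(adj):
    """Greedy peeling: repeatedly remove a minimum-degree node and
    remember the densest intermediate subgraph seen.  This classical
    centralized procedure 2-approximates density = |E| / |V|."""
    adj = {v: set(ns) for v, ns in adj.items()}   # work on a copy
    m = sum(len(ns) for ns in adj.values()) // 2
    best_density, best_nodes = 0.0, set(adj)
    while adj:
        density = m / len(adj)
        if density > best_density:
            best_density, best_nodes = density, set(adj)
        v = min(adj, key=lambda u: len(adj[u]))   # minimum-degree node
        m -= len(adj[v])
        for u in adj.pop(v):
            adj[u].discard(v)
    return best_density, best_nodes

# A 4-clique with a pendant path attached: the clique is the densest part.
g = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3},
     3: {0, 1, 2, 4}, 4: {3, 5}, 5: {4}}
density, nodes = densest_subgraph_peel(g)
print(density, sorted(nodes))   # 1.5 [0, 1, 2, 3]
```

The distributed version in the text must approximate this peeling with broadcast-only CONGEST messages and without global knowledge, which is where the extra ε and the O(D log n) round bound come from.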

Healing algorithms play a crucial part in distributed peer-to-peer networks where failures occur continuously and frequently. Whereas there are approaches for robustness that rely largely on built-in redundancy, we adopt a responsive approach that is more akin to that of biological networks, e.g., the brain. The general goal of self-healing distributed graphs is to maintain certain network properties while recovering from failure quickly and making bounded alterations locally. Several self-healing algorithms have been suggested in the recent literature [IPDPS'08, PODC'08, PODC'09, PODC'11]; they heal various network properties while fulfilling competing requirements such as having low degree increase while maintaining connectivity, expansion and low stretch of the network. In this work, we augment the previous algorithms by adding the notion of edge-preserving self-healing, which requires the healing algorithm not to delete any edges originally present or adversarially inserted. This reflects the cost of adding additional edges, but more importantly it immediately follows that edge preservation helps maintain any subgraph-induced property that is monotonic, in particular important properties such as graph and subgraph densities. Density is an important network property and in certain distributed networks, maintaining it preserves high connectivity among certain subgraphs and backbones. We introduce a general model of self-healing, and introduce xheal+, an edge-preserving version of xheal [PODC'11]. © 2012 IEEE.
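The monotonicity argument is simple to demonstrate: if a healing step only ever adds edges, the density of any fixed set of surviving nodes cannot decrease. A toy check of that invariant (illustrating the claim, not xheal+'s actual repair rule):

```python
def subgraph_density(edges, nodes):
    """Density |E(S)| / |S| of the subgraph induced by `nodes`."""
    internal = [e for e in edges if e[0] in nodes and e[1] in nodes]
    return len(internal) / len(nodes)

def edge_preserving_heal(edges, new_edges):
    """An edge-preserving healing step only *adds* edges: every original
    (or adversarially inserted) edge survives the repair."""
    return edges | new_edges

edges = {(0, 1), (1, 2), (2, 3)}
s = {0, 1, 2}                       # any fixed node subset
d_before = subgraph_density(edges, s)
healed = edge_preserving_heal(edges, {(0, 2), (3, 0)})
d_after = subgraph_density(healed, s)
print(d_before, d_after)   # 0.6666666666666666 1.0
```

Since the healed edge set is a superset of the old one, `d_after >= d_before` holds for every subset, which is exactly the monotone-property guarantee the abstract invokes.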

We consider the problem of self-healing in peer-to-peer networks that are under repeated attack by an omniscient adversary. We assume that the following process continues for up to n rounds, where n is the total number of nodes initially in the network: the adversary deletes an arbitrary node from the network, then the network responds by quickly adding a small number of new edges.

We present a distributed data structure that ensures two key properties. First, the diameter of the network is never more than O(log Delta) times its original diameter, where Delta is the maximum degree of the network initially. We note that for many peer-to-peer systems, Delta is polylogarithmic, so the diameter increase would be an O(log log n) multiplicative factor. Second, the degree of any node never increases by more than 3 over its original degree. Our data structure is fully distributed, has O(1) latency per round and requires each node to send and receive O(1) messages per round. The data structure requires an initial setup phase that has latency equal to the diameter of the original network, and requires, with high probability, each node v to send O(log n) messages along every edge incident to v. Our approach is orthogonal and complementary to traditional topology-based approaches to defending against attack.
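The combination of "degree increases by at most 3" and "diameter blows up by at most O(log Delta)" is consistent with replacing a deleted node by a balanced binary tree over its former neighbors, the idea behind the Forgiving Tree (Hayes et al. 2008) on which this line of work builds. A toy centralized sketch of that reconnection step, with the guarantees checked on one deletion:

```python
import math

def heal_with_balanced_tree(neighbors):
    """Replace a deleted node by a balanced binary tree over its former
    neighbors.  Each neighbor gains at most 3 new edges (one parent and
    up to two children), and any two former neighbors end up within
    O(log Delta) hops of each other, Delta being the deleted degree."""
    edges = []
    for i in range(1, len(neighbors)):
        parent = neighbors[(i - 1) // 2]   # heap-style balanced tree
        edges.append((parent, neighbors[i]))
    return edges

neighbors = list(range(10))        # a deleted node of degree Delta = 10
tree = heal_with_balanced_tree(neighbors)
degree = {v: 0 for v in neighbors}
for a, b in tree:
    degree[a] += 1
    degree[b] += 1
print(max(degree.values()), math.ceil(math.log2(len(neighbors))))   # 3 4
```

The real data structure must realize this reconnection distributedly, with O(1) messages per node per round and without global coordination, which is what the setup phase and the with-high-probability message bound account for.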