63 results for heterogeneous nucleation


Relevance: 20.00%

Abstract:

A nanocomposite consisting of reduced graphene oxide and zinc oxide nanoparticles (RGO/ZnO) with unique structural features was developed as an efficient, sustainable, amphiphilic, heterogeneous catalyst for the synthesis of various 3-substituted indoles in water. The catalyst was recycled six times without significant loss of catalytic activity. Its environmental compatibility and favourable sustainability metrics, such as a small E-factor and high atom economy, make the present methodology a truly green and sustainable process for the synthesis of biologically important 3-substituted indoles.
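
The two sustainability metrics cited here have standard definitions: atom economy is the molecular weight of the desired product divided by the summed molecular weights of all reactants, and the E-factor is the mass of waste generated per mass of product. A minimal sketch with purely illustrative numbers (none are from the paper):

```python
# Green-chemistry metrics with hypothetical inputs (illustrative only).
def atom_economy(product_mw, reactant_mws):
    """Atom economy (%) = MW(product) / sum of MW(reactants) * 100."""
    return 100.0 * product_mw / sum(reactant_mws)

def e_factor(total_waste_mass, product_mass):
    """E-factor = mass of waste generated per mass of desired product."""
    return total_waste_mass / product_mass

# Hypothetical molecular weights (g/mol) and masses (g), not from the paper:
ae = atom_economy(248.0, [117.0, 131.0, 18.0])
ef = e_factor(0.5, 2.0)
```

A small E-factor (approaching 0) and a high atom economy (approaching 100%) are what qualify a route as green in this sense.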

Relevance: 20.00%

Abstract:

The resolved shear stress is believed to play an important role in twin formation. The present study tests this idea for an extruded magnesium alloy by examining "tension" twinning in different grain orientations. Electron backscatter diffraction analysis is employed for alloy AZ31 tested in compression along the extrusion axis to strains between 0.008 and 0.015. For heavily twinned grains, twinning occurs on 2.3 twin systems per grain on average. The active systems are also most commonly those with, or very near to, the highest Schmid factor. The most active system in multiply twinned grains accounts on average for ∼0.6 of the twinning events. In addition, the twin habit plane is found to fall within 6° of the K1 plane. Orientations with the highest Schmid factors for twinning (0.45-0.5) display twin aspect ratios greater by ∼40% and twin number densities greater by ∼10 times than orientations with maximum Schmid factors for twinning of 0.15-0.2. Thus the Schmid factor for twinning is seen to affect nucleation more than thickening in the present material. Viscoplastic crystal plasticity simulations are employed to obtain approximations for the resolved shear stress. Both the twin aspect ratio and number density correlate quite well with this term: the dependence of the former can be taken as linear, while that of the latter follows a power law with exponent ∼13. Increased aspect ratios and number densities are also seen at low Schmid factors, which may relate to stress fluctuations, caused in the present material most probably by the stress fields at the tips of blocked twins. Overall, it is evident that the dominance of twinning on high-Schmid-factor systems is preserved at the low strains examined in the present work, despite the stress fluctuations known to be present. © 2014 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
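
The Schmid factor that organizes these results is the standard m = cos φ · cos λ, where φ is the angle between the load axis and the twin-plane normal and λ the angle between the load axis and the shear direction. A minimal sketch in Cartesian vectors; the example vectors are illustrative, not the hexagonal twin geometry of AZ31, and real tension twinning is unidirectional, so the sign of the resolved shear matters, which the absolute values here gloss over:

```python
import numpy as np

def schmid_factor(load_axis, plane_normal, shear_dir):
    """m = cos(phi) * cos(lambda); magnitudes only, sign ignored."""
    a, n, d = (np.asarray(v, float) / np.linalg.norm(v)
               for v in (load_axis, plane_normal, shear_dir))
    return abs(a @ n) * abs(a @ d)

# A 45-degree plane with an in-plane shear direction relative to the load axis:
m = schmid_factor([0, 0, 1], [1, 0, 1], [-1, 0, 1])  # = 0.5, the maximum
```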

Relevance: 20.00%

Abstract:

Haptic rendering of complex models is usually prohibitive because it requires a much higher update rate than visual rendering. Previous works have tried to solve this issue by introducing local simulation or multi-rate simulation for the two pipelines. Although these works have improved the capacity of the haptic rendering pipeline, they did not consider heterogeneous scenarios in which rigid and deformable objects coexist close to each other. In this paper, we propose a novel idea to support interactive visuo-haptic rendering of complex heterogeneous models. The idea incorporates different collision detection and response algorithms and seamlessly switches them on and off on the fly as the HIP travels through the scenario. The selection of rendered models is based on the hypothesis of "parallel universes", where the transition from rendering one group of models to another is completely transparent to users. To facilitate this idea, we propose a procedure to convert the traditional single-universe scenario into a "multiverse" scenario, in which the original models are grouped and split into parallel universes according to the scenario's rendering requirements rather than locality alone. We also add simplified visual objects as background avatars in each parallel universe to visually maintain the original scenario without overly increasing its complexity. We tested the proposed idea in a haptically enabled needle thoracostomy training environment, and the results demonstrate that our idea can substantially accelerate visuo-haptic rendering of complex heterogeneous scenarios.
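
A hypothetical sketch of the switching idea as described (the region shapes, group names, and handler labels are invented for illustration): the haptic interface point (HIP) position selects which "universe", i.e. which model group and its collision/response algorithm, is active at a given instant:

```python
from dataclasses import dataclass

@dataclass
class ModelGroup:
    name: str
    center: tuple          # region the group occupies (a sphere, for simplicity)
    radius: float
    handler: str           # e.g. "rigid-penalty" or "deformable-fem" (invented labels)

def active_group(hip_pos, groups, fallback=None):
    """Return the group whose region contains the haptic interface point."""
    for g in groups:
        if sum((h - c) ** 2 for h, c in zip(hip_pos, g.center)) <= g.radius ** 2:
            return g
    return fallback

groups = [ModelGroup("ribs", (0.0, 0.0, 0.0), 1.0, "rigid-penalty"),
          ModelGroup("lung", (3.0, 0.0, 0.0), 1.0, "deformable-fem")]
g = active_group((2.8, 0.1, 0.0), groups)
```

In the paper's terms, groups the HIP is not inside would be represented only by simplified background avatars rather than fully simulated.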

Relevance: 20.00%

Abstract:

We study an Fe-18Al (at.%) alloy after various thermal treatments at different times (24-336 h) and temperatures (250-1100 °C) to determine the nature of the so-called 'komplex' phase state (or "K-state"), which is common to other alloy systems having compositions at the boundaries of known order-disorder transitions and is characterised by heterogeneous short-range-ordering (SRO). This has been done by direct observation using atom probe tomography (APT), which reveals that nano-sized, ordered regions/particles do not exist. Also, by employing shell-based analysis of the three-dimensional atomic positions, we have determined chemically sensitive, generalised multicomponent short-range order (GM-SRO) parameters, which are compared with published pairwise SRO parameters derived from bulk, volume-averaged measurement techniques (e.g. X-ray and neutron scattering, Mössbauer spectroscopy) and combined ab-initio and Monte Carlo simulations. This analysis procedure has general relevance for other alloy systems where quantitative chemical-structure evaluation of local atomic environments is required to understand ordering and partial ordering phenomena that affect physical and mechanical properties.
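
The pairwise SRO parameters being compared against are conventionally Warren-Cowley parameters, α = 1 − p/c, where p is the probability that a neighbour (in a given shell) of an A atom is a B atom and c is the overall B concentration; α < 0 indicates an ordering tendency, α > 0 clustering. A toy shell-counting sketch (a four-atom 1D chain with an arbitrary cutoff, far simpler than APT data):

```python
import numpy as np

def warren_cowley(positions, species, center="Fe", other="Al",
                  r_min=0.0, r_max=3.0):
    """Pairwise Warren-Cowley parameter for one neighbour shell:
    alpha = 1 - p_other / c_other."""
    pos = np.asarray(positions, float)
    c_other = np.mean([s == other for s in species])
    n_other = n_total = 0
    for i, si in enumerate(species):
        if si != center:
            continue
        d = np.linalg.norm(pos - pos[i], axis=1)
        shell = (d > r_min) & (d <= r_max)
        shell[i] = False
        n_total += shell.sum()
        n_other += sum(1 for j in np.where(shell)[0] if species[j] == other)
    return 1.0 - (n_other / n_total) / c_other

# Toy 1D chain Fe-Al-Fe-Al: perfectly ordered, so nearest-neighbour alpha = -1
pos = [[i, 0, 0] for i in range(4)]
spec = ["Fe", "Al", "Fe", "Al"]
alpha = warren_cowley(pos, spec, r_max=1.1)
```

The GM-SRO parameters of the paper generalise this pairwise count to multicomponent shell occupancies; the counting structure, per-shell neighbour tallies normalised by bulk concentration, is the same.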

Relevance: 20.00%

Abstract:

The CADF test of Pesaran (J Appl Econ 22:265–312, 2007) is among the most popular univariate tests for cross-section correlated panels. Yet the existing asymptotic analysis of this test statistic is limited to a model in which the errors follow a simple AR(1) structure with homogeneous autoregressive coefficients. One reason for this is that the model involves an intricate identification issue, as both the serial and cross-section correlation structures of the errors are unobserved. The purpose of the current paper is to tackle this issue and, in so doing, extend the existing analysis to the case of AR(p) errors with possibly heterogeneous coefficients.
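
For reference, Pesaran's CADF regression in its baseline AR(1) form augments each unit's Dickey-Fuller regression with cross-section averages to soak up the common factor; the test statistic is the t-ratio on the lagged level. A numpy sketch of this baseline (the paper's contribution, the heterogeneous higher-order-AR extension, is not reproduced here):

```python
import numpy as np

def cadf_tstats(y):
    """Unit-level t-statistics on y_{i,t-1} in the CADF regression
    dy_it = a_i + b_i*y_{i,t-1} + c_i*ybar_{t-1} + d_i*dybar_t + e_it,
    for a (T, N) panel y."""
    T, N = y.shape
    dy = np.diff(y, axis=0)            # (T-1, N) first differences
    ylag = y[:-1]                      # (T-1, N) lagged levels
    ybar_lag = ylag.mean(axis=1)       # lagged cross-section average
    dybar = dy.mean(axis=1)            # differenced cross-section average
    tstats = np.empty(N)
    for i in range(N):
        X = np.column_stack([np.ones(T - 1), ylag[:, i], ybar_lag, dybar])
        beta, *_ = np.linalg.lstsq(X, dy[:, i], rcond=None)
        resid = dy[:, i] - X @ beta
        s2 = resid @ resid / (T - 1 - X.shape[1])
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
        tstats[i] = beta[1] / se
    return tstats

rng = np.random.default_rng(0)
t = cadf_tstats(rng.standard_normal((50, 5)).cumsum(axis=0))  # 5 random walks
```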

Relevance: 20.00%

Abstract:

To better understand disuse muscle atrophy, via magnetic resonance imaging, we sequentially measured muscle cross-sectional area along the entire length of all individual muscles from the hip to ankle in nine male subjects participating in 60-day head-down tilt bed rest (2nd Berlin BedRest Study; BBR2-2). We hypothesized that individual muscles would not atrophy uniformly along their length such that different regions of an individual muscle would atrophy to different extents. This hypothesis was confirmed for the adductor magnus, vasti, lateral hamstrings, medial hamstrings, rectus femoris, medial gastrocnemius, lateral gastrocnemius, tibialis posterior, flexor hallucis longus, flexor digitorum longus, peroneals, and tibialis anterior muscles (P ≤ 0.004). In contrast, the hypothesis was not confirmed in the soleus, adductor brevis, gracilis, pectineus, and extensor digitorum longus muscles (P ≥ 0.20). The extent of atrophy only weakly correlated (r = -0.30, P < 0.001) with the location of greatest cross-sectional area. The rate of atrophy during bed rest also differed between muscles (P < 0.0001) and between some synergists. Most muscles recovered to their baseline size between 14 and 90 days after bed rest, but flexor hallucis longus, flexor digitorum longus, and lateral gastrocnemius required longer than 90 days before recovery occurred. On the basis of findings of differential atrophy between muscles and evidence in the literature, we interpret our findings of intramuscular atrophy to reflect differential disuse of functionally different muscle regions. The current work represents the first lower-limb wide survey of intramuscular differences in disuse atrophy. We conclude that intramuscular differential atrophy occurs in most, but not all, of the muscles of the lower limb during prolonged bed rest.

Relevance: 20.00%

Abstract:

© 2015 Wiley Periodicals, Inc. Background: Therapeutics that target copper for the treatment of prostate cancer are being evaluated in human clinical trials. Elevated intracellular copper is considered to sensitize prostate cancer cells to certain copper-coordination compounds, especially those with ionophoric properties. While there is compelling in vitro evidence that prostate cancer cells accumulate intracellular copper, a corresponding status for copper in patient tissues has not been corroborated. We therefore established whether copper concentrations increase in cancerous prostate tissues, and in sera, as the disease progresses. Methods: Human prostate tissue samples were obtained from patient prostatectomies (n=28) and, together with patient-matched sera, were analyzed for copper content by inductively coupled plasma mass spectrometry. Results: When grouped together, cancerous prostate tissues exhibiting moderate disease severity (Gleason score 7) (n=10) had 1.6-fold more copper than age-matched normal tissues (n=10) (P<0.05). Those with more aggressive disease (Gleason score 9) (n=8) had 1.8-fold more copper (P<0.05). In both disease stages, however, the copper concentrations of individual samples were rather variable (0.55-3.02 μg/g), with many clearly within the normal range (0.52-1.28 μg/g). Additionally, we found no change in serum copper concentrations in patients with either moderate or aggressive prostate cancer (Gleason score 7 or 9) compared with reference intervals and with age-matched controls. Conclusions: The heterogeneous nature of copper concentrations in cancerous prostate tissues suggests that a small subset of patients may respond to treatments that target elevated intratumoral copper. Such approaches would therefore likely require personalized treatment strategies. Prostate 75:1510-1517, 2015.

Relevance: 20.00%

Abstract:

In group decision making (GDM) problems, it is natural for decision makers (DMs) to provide different preferences and evaluations owing to varying domain knowledge and cultural values. When the number of DMs is large, a higher degree of heterogeneity is expected, and it is difficult to translate heterogeneous information into one unified preference without loss of context. In this aspect, the current GDM models face two main challenges, i.e., handling the complexity pertaining to the unification of heterogeneous information from a large number of DMs, and providing optimal solutions based on unification methods. This paper presents a new consensus-based GDM model to manage heterogeneous information. In the new GDM model, an aggregation of individual priority (AIP)-based aggregation mechanism, which is able to employ flexible methods for deriving each DM's individual priority and to avoid information loss caused by unifying heterogeneous information, is utilized to aggregate the individual preferences. To reach a consensus more efficiently, different revision schemes are employed to reward/penalize the cooperative/non-cooperative DMs, respectively. The temporary collective opinion used to guide the revision process is derived by aggregating only those non-conflicting opinions at each round of revision. In order to measure the consensus in a robust manner, a position-based dissimilarity measure is developed. Compared with the existing GDM models, the proposed GDM model is more effective and flexible in processing heterogeneous information. It can be used to handle different types of information with different degrees of granularity. Six types of information are exemplified in this paper, i.e., ordinal, interval, fuzzy number, linguistic, intuitionistic fuzzy set, and real number. The results indicate that the position-based consensus measure is able to overcome possible distortions of the results in large-scale GDM problems.
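
A much-simplified sketch of the two ingredients named here, with invented details: AIP-style aggregation averages each DM's already-derived priority vector (so heterogeneous input formats never need unifying into one preference format), and a position-based dissimilarity compares the rank positions two priority vectors induce. The weighting and normalization below are illustrative, not the paper's definitions:

```python
import numpy as np

def aggregate_priorities(priorities, weights=None):
    """AIP-style aggregation: weighted average of the DMs' priority
    vectors, each already derived from that DM's own preference format."""
    P = np.asarray(priorities, float)
    w = (np.full(P.shape[0], 1.0 / P.shape[0]) if weights is None
         else np.asarray(weights, float))
    return w @ P

def position_dissimilarity(p, q):
    """Hypothetical position-based dissimilarity: total displacement
    between the rank positions induced by two priority vectors,
    scaled to lie in roughly [0, 1]."""
    rp = np.argsort(np.argsort(-np.asarray(p)))   # rank 0 = most preferred
    rq = np.argsort(np.argsort(-np.asarray(q)))
    n = len(rp)
    return np.abs(rp - rq).sum() / (n * n / 2)

group = aggregate_priorities([[0.5, 0.3, 0.2],
                              [0.4, 0.4, 0.2],
                              [0.1, 0.6, 0.3]])
d = position_dissimilarity([0.5, 0.3, 0.2], [0.1, 0.6, 0.3])
```

Comparing positions rather than raw priority values is what makes such a measure robust to the scale distortions that arise when many DMs use different granularities.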

Relevance: 20.00%

Abstract:

For multiple heterogeneous multicore server processors across clouds and data centers, the aggregated performance of the cloud of clouds can be optimized by load distribution and balancing. Energy efficiency is one of the most important issues for large-scale server systems in current and future data centers. The multicore processor technology provides new levels of performance and energy efficiency. The present paper aims to develop power and performance constrained load distribution methods for cloud computing in current and future large-scale data centers. In particular, we address the problem of optimal power allocation and load distribution for multiple heterogeneous multicore server processors across clouds and data centers. Our strategy is to formulate optimal power allocation and load distribution for multiple servers in a cloud of clouds as optimization problems, i.e., power constrained performance optimization and performance constrained power optimization. Our research problems in large-scale data centers are well-defined multivariable optimization problems, which explore the power-performance tradeoff by fixing one factor and minimizing the other, from the perspective of optimal load distribution. It is clear that such power and performance optimization is important for a cloud computing provider to efficiently utilize all the available resources. We model a multicore server processor as a queuing system with multiple servers. Our optimization problems are solved for two different models of core speed, where one model assumes that a core runs at zero speed when it is idle, and the other model assumes that a core runs at a constant speed. Our results in this paper provide new theoretical insights into power management and performance optimization in data centers.
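
The queueing model named here, a multicore processor treated as a queue with multiple servers, admits a standard closed form when idealized as M/M/m: mean response time is the mean service time plus the Erlang-C queueing probability divided by the spare service capacity. A sketch of that building block (the paper's core-speed and power models are not reproduced):

```python
import math

def mm_m_response_time(lam, mu, m):
    """Mean response time of an M/M/m queue: one multicore server with
    m cores of service rate mu each, Poisson arrivals at rate lam."""
    rho = lam / (m * mu)
    assert rho < 1, "queue must be stable"
    a = lam / mu
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(m))
                + a**m / (math.factorial(m) * (1 - rho)))
    erlang_c = (a**m / (math.factorial(m) * (1 - rho))) * p0  # P(wait > 0)
    return 1.0 / mu + erlang_c / (m * mu - lam)

# Illustrative: a 4-core server, 1 task/s per core, 3 tasks/s arriving
T = mm_m_response_time(3.0, 1.0, 4)
```

Load distribution in the paper's sense then amounts to choosing each server's arrival rate (and, in the constrained problems, its core speeds) to optimize such response times subject to a power budget, or vice versa.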

Relevance: 20.00%

Abstract:

Robots are ever more common in a variety of workplaces, providing benefits such as alternative solutions to traditional human labor. While fully autonomous robots are the ultimate goal of many robotic applications, in reality there remain many situations where robots require some level of teleoperation to achieve their assigned goals, especially when deployed in non-deterministic environments. For instance, teleoperation is commonly used in areas such as search and rescue, bomb disposal, and exploration of inaccessible or harsh terrain. This is due to factors such as robots' limited ability to quickly and reliably navigate unknown environments or to provide high-level decision making, especially in time-critical tasks. To provide an adequate solution for such situations, human-in-the-loop control is required. When developing human-in-the-loop control, it is important to take advantage of the complementary skill sets of humans and robots. For example, robots can perform rapid calculations, provide accurate measurements through hardware such as sensors, and store large amounts of data, while humans provide experience, intuition, risk management, and complex decision-making capabilities. Shared autonomy is the concept of building robotic systems that exploit these complementary skill sets to provide a robust and efficient robotic solution. While the requirement for human-in-the-loop control exists, Human Machine Interaction (HMI) remains an important research topic, especially the area of User Interface (UI) design. To provide operators with an effective teleoperation system, it is important that the interface is intuitive and dynamic while also achieving a high level of immersion. Recent advancements in virtual and augmented reality hardware are giving rise to innovative HMI systems.
Interactive hardware such as the Microsoft Kinect, Leap Motion, Oculus Rift, Samsung Gear VR, and even CAVE Automatic Virtual Environments [1] provides vast improvements over traditional user interface designs, for example through the experimental web browser JanusVR [2]. Combined with the introduction of standardized robot frameworks such as ROS and Webots [3], which now support a large number of different robots, this provides an opportunity to develop a universal UI for teleoperation control that improves operator efficiency while reducing teleoperation training. This research introduces the concept of a dynamic virtual workspace for teleoperation of heterogeneous robots in non-deterministic environments that require human-in-the-loop control. The system first identifies the connected robots through the use of kinematic information, then determines their network capabilities, such as latency and bandwidth. Given the robot type and network capabilities, the system can then provide the operator with the available teleoperation modes, such as pick-and-place control or waypoint navigation, while also allowing the operator to manipulate the virtual workspace layout to display information from onboard cameras or sensors.
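
A hypothetical sketch of the mode gating described: robot type plus measured latency and bandwidth determine which teleoperation modes are offered. The mode names echo the abstract; the thresholds and branching are invented for illustration:

```python
def available_modes(robot_type, latency_ms, bandwidth_mbps):
    """Offer teleoperation modes based on robot type and network quality.
    Thresholds are illustrative assumptions, not values from the research."""
    modes = []
    if robot_type == "manipulator":
        modes.append("pick-and-place")      # supervisory control tolerates latency
        if latency_ms < 50:
            modes.append("direct-teleop")   # a tight loop needs low latency
    elif robot_type == "mobile":
        modes.append("waypoint-navigation")
        if latency_ms < 100 and bandwidth_mbps > 5:
            modes.append("video-teleop")    # a live camera stream needs bandwidth
    return modes

# A mobile robot on a high-latency link is limited to waypoint navigation:
m = available_modes("mobile", latency_ms=150, bandwidth_mbps=10)
```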

Relevance: 20.00%

Abstract:

The hierarchical Dirichlet process (HDP) was originally designed for, and experimented with on, a single data channel. In this paper we enhance its ability to model heterogeneous data by using a richer structure for the base measure, namely a product space. The enhanced model, called Product Space HDP (PS-HDP), can (1) simultaneously model heterogeneous data from multiple sources in a Bayesian nonparametric framework and (2) discover multilevel latent structures from data, resulting in different types of topics/latent structures that can be explained jointly. We experimented with the MDC dataset, a large real-world dataset collected from mobile phones. Our goal was to discover identity–location–time (a.k.a. who-where-when) patterns at different levels (globally for all groups and locally for each group). We analyzed the activities and patterns learned by our model, and visualized, compared, and contrasted them with the ground truth to demonstrate the merit of the proposed framework. We further quantitatively evaluated its performance using standard metrics including F1-score, NMI, RI, and purity. We also compared the performance of the PS-HDP model with those of popular existing clustering methods (including K-Means, NNMF, GMM, DP-Means, and AP). Lastly, we demonstrated the model's ability to learn activities from data with missing values, a common problem in pervasive and ubiquitous computing applications.
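
Of the reported metrics, purity is the simplest to state: each cluster is credited with its majority ground-truth label, and purity is the fraction of points so credited. A self-contained sketch (toy labels, not the MDC data):

```python
from collections import Counter

def purity(clusters, labels):
    """Cluster purity: fraction of points carrying the majority
    ground-truth label of their assigned cluster."""
    by_cluster = {}
    for c, y in zip(clusters, labels):
        by_cluster.setdefault(c, []).append(y)
    correct = sum(max(Counter(ys).values()) for ys in by_cluster.values())
    return correct / len(labels)

# Two clusters of three points each; each has a 2-of-3 majority label:
p = purity([0, 0, 0, 1, 1, 1], ["a", "a", "b", "b", "b", "a"])
```

Purity alone rewards over-fragmentation (many tiny clusters score perfectly), which is why it is typically reported alongside NMI and RI, as here.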

Relevance: 20.00%

Abstract:

Peptides have demonstrated unique capabilities to fabricate inorganic nanomaterials of numerous compositions through noncovalent binding of the growing surface in solution. In this contribution, we demonstrate that these biomolecules can control all facets of Au nanoparticle fabrication, including Au3+ reduction, without the use of secondary reagents. In this regard, using the AuBP1 peptide, the N-terminal tryptophan residue is responsible for driving Au3+ reduction to generate Au nanoparticles passivated by the oxidized peptide in solution, where localized residue context effects control the reducing strength of the biomolecule. The process was followed by both time-resolved monitoring of the growth of the localized surface plasmon resonance and transmission electron microscopy. Nanoparticle growth occurs by a unique disaggregation of nanoparticle aggregates in solution. Computational modeling demonstrated that the oxidized residue of the peptide sequence does not impair the biomolecule's ability to bind the inorganic surface, as compared with the parent peptide, confirming that the biomolecule can be exploited for all steps of the nanoparticle fabrication process. Overall, these results expand the utility of peptides for the fabrication of inorganic nanomaterials, more closely mimicking their use in nature in biomineralization processes. Furthermore, these capabilities enhance the simplicity of nanoparticle production and could find rapid use in the generation of complex multicomponent materials or nanoparticle assemblies.