982 results for LARGE EXTRA DIMENSIONS


Relevance: 30.00%

Abstract:

This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. Literature-based discovery has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of the indirect associations underpinning literature-based discovery has been demonstrated extensively in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks including predicting future disruptive innovations. In this paper we perform a computational complexity analysis of four successful corpus-based distributional models to evaluate their fitness for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
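
The efficiency argument can be illustrated with a toy fixed-dimension distributional model in the style of random indexing: each term's vector is the sum of fixed-width random index vectors of its co-occurring terms, so a similarity query costs O(DIM) regardless of corpus size. The terms and co-occurrence lists below are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 64  # fixed dimension: query cost is independent of corpus size

# Invented co-occurrence data echoing Swanson's A-B-C pattern: "fish_oil" and
# "raynaud" never co-occur directly but share bridge terms.
cooccur = {
    "fish_oil": ["blood_viscosity", "platelet_aggregation"],
    "raynaud":  ["blood_viscosity", "platelet_aggregation", "vasoconstriction"],
    "diabetes": ["insulin", "glucose"],
}

index = {}
def idx(term):
    """Fixed-dimension random index vector, created on first use."""
    if term not in index:
        index[term] = rng.standard_normal(DIM)
    return index[term]

# A term's representation is the sum of its contexts' index vectors.
vec = {t: sum(idx(c) for c in ctx) for t, ctx in cooccur.items()}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Shared bridge terms surface the indirect association: the related pair should
# score well above the unrelated pair.
print(cosine(vec["fish_oil"], vec["raynaud"]) > cosine(vec["fish_oil"], vec["diabetes"]))
```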

Relevance: 30.00%

Abstract:

As the all-atom molecular dynamics method is limited by its enormous computational cost, various coarse-grained strategies have been developed to extend the accessible length scales in the modeling of the mechanical behavior of soft matter. However, the classical thermostat algorithm in highly coarse-grained molecular dynamics would underestimate the thermodynamic behavior of soft matter (e.g., microfilaments in cells), which can weaken the ability of materials to overcome local energy traps in granular modeling. Based on all-atom molecular dynamics modeling of microfilament fragments (G-actin clusters), a new stochastic thermostat algorithm is developed to retain the representation of the thermodynamic properties of microfilaments at an extra-coarse-grained level. The accuracy of this stochastic thermostat algorithm is validated against all-atom MD simulation. This new algorithm provides an efficient way to investigate the thermomechanical properties of large-scale soft matter.
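
For context, a minimal sketch of a generic stochastic (Langevin) thermostat step for a bead model is shown below; this is the standard scheme such algorithms build on, not the authors' specific algorithm, and all parameter values are invented.

```python
import numpy as np

def langevin_step(x, v, force, mass, gamma, kT, dt, rng):
    """One Euler-Maruyama update with friction and random kicks."""
    # Random-force amplitude fixed by the fluctuation-dissipation theorem.
    sigma = np.sqrt(2.0 * gamma * kT / (mass * dt))
    noise = sigma * rng.standard_normal(v.shape)
    v = v + dt * (force / mass - gamma * v + noise)
    x = x + dt * v
    return x, v

rng = np.random.default_rng(1)
n = 1000                                  # independent beads, reduced units
kT, mass, gamma, dt = 1.0, 1.0, 1.0, 0.01
x = np.zeros(n)
v = np.zeros(n)
for _ in range(5000):
    x, v = langevin_step(x, v, np.zeros(n), mass, gamma, kT, dt, rng)

# Equipartition check: the kinetic temperature m<v^2> should relax to near kT.
print(float(mass * np.mean(v ** 2)))
```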

Relevance: 30.00%

Abstract:

Accurate three-dimensional representations of cultural heritage sites are highly valuable for scientific study, conservation, and educational purposes. In addition to their use for archival purposes, 3D models enable efficient and precise measurement of relevant natural and architectural features. Many cultural heritage sites are large and complex, consisting of multiple structures spatially distributed over tens of thousands of square metres. The process of effectively digitising such geometrically complex locations requires measurements to be acquired from a variety of viewpoints. While several technologies exist for capturing the 3D structure of objects and environments, none are ideally suited to complex, large-scale sites, mainly due to their limited coverage or acquisition efficiency. We explore the use of a recently developed handheld mobile mapping system called Zebedee in cultural heritage applications. The Zebedee system is capable of efficiently mapping an environment in three dimensions by continually acquiring data as an operator holding the device traverses the site. The system was deployed at the former Peel Island Lazaret, a culturally significant site in Queensland, Australia, consisting of dozens of buildings of various sizes spread across an area of approximately 400 × 250 m. With the Zebedee system, the site was scanned in half a day, and a detailed 3D point cloud model (with over 520 million points) was generated from the 3.6 hours of acquired data in 2.6 hours. We present results demonstrating that Zebedee was able to capture both site context and building detail with accuracy comparable to manual measurement techniques, and with greatly increased efficiency and scope. The scan allowed us to record derelict buildings that previously could not be measured because of the scale and complexity of the site. The resulting 3D model captures both interior and exterior features of buildings, including structure, materials, and the contents of rooms.

Relevance: 30.00%

Abstract:

Numerous efforts have been dedicated to the synthesis of large-volume methacrylate monoliths for large-scale biomolecule purification, but most were hampered by the enormous exotherm released during preparation, which introduces structural heterogeneity into the monolith pore system. A significant radial temperature gradient develops across the monolith thickness, reaching a terminal temperature that exceeds the maximum temperature permissible for the preparation of structurally homogeneous monoliths. The heat build-up is understood to comprise the heat of initiator decomposition and the heat released by free radical-monomer and monomer-monomer interactions. In the present technique, the heat from initiator decomposition was expelled along with some gaseous fumes before polymerization was commenced in a gradual-addition fashion. The characteristics of an 80 mL monolith prepared using this technique were compared with those of a similar monolith synthesized in bulk polymerization mode. A close similarity in the radial temperature profiles was observed for the monolith synthesized via the heat expulsion technique. A maximum radial temperature gradient of only 4.3°C was recorded at the center and 2.1°C at the monolith periphery for the combined heat expulsion and gradual addition technique. The comparable radial temperature distributions yielded identical pore size distributions at different radial points across the monolith thickness.

Relevance: 30.00%

Abstract:

The purpose of this study was to extend understanding of how large firms pursuing sustained and profitable growth manage organisational renewal. A multiple-case study was conducted in 27 North American and European wood-industry companies, of which 11 were chosen for closer study. The study combined the organisational-capabilities approach to strategic management with corporate-entrepreneurship thinking. It charted the further development of an identification and classification system for capabilities comprising three dimensions: (i) the dynamism between firm-specific and industry-significant capabilities, (ii) hierarchies of capabilities and capability portfolios, and (iii) their internal structure. Capability building was analysed in the context of the organisational design, the technological systems and the type of resource-bundling process (creating new vs. entrenching existing capabilities). The thesis describes the current capability portfolios and the organisational changes in the case companies. It also clarifies the mechanisms through which companies can influence the balance between knowledge search and the efficiency of knowledge transfer and integration in their daily business activities, and consequently the diversity of their capability portfolio and the breadth and novelty of their product/service range. The largest wood-industry companies of today must develop a seemingly dual strategic focus: they have to combine leading-edge, innovative solutions with cost-efficient, large-scale production. The use of modern technology in production was no longer a primary source of competitiveness in the case companies, but rather belonged to the portfolio of basic capabilities. Knowledge and information management had become an industry imperative, on a par with cost effectiveness. Yet, during the period of this research, the case companies were better at supporting volume growth in existing activities than growth through new economic activities. Customer-driven, incremental innovation was preferred over firm-driven innovation through experimentation. The three main constraints on organisational renewal were the lack of slack resources, the aim for lean, centralised designs, and the inward-bound communication climate.

Relevance: 30.00%

Abstract:

Measurement of individual emission sources (e.g., animals or pen manure) within intensive livestock enterprises is necessary to test emission calculation protocols and to identify targets for decreased emissions. In this study, a vented, fabric-covered large chamber (4.5 × 4.5 m, 1.5 m high; encompassing greater spatial variability than a smaller chamber) in combination with on-line analysis (nitrous oxide [N2O] and methane [CH4] via Fourier Transform Infrared Spectroscopy; 1 analysis min⁻¹) was tested as a means to isolate and measure emissions from beef feedlot pen manure sources. An exponential model relating chamber concentrations to ambient gas concentrations, air exchange (e.g., due to poor sealing with the surface; model linear when ≈ 0 m³ s⁻¹), and chamber dimensions allowed data to be fitted with high confidence. Alternating manure source emission measurements using the large chamber and the backward Lagrangian stochastic (bLS) technique (5-mo period; bLS validated via tracer gas release, recovery 94-104%) produced comparable N2O and CH4 emission values (no significant difference at P < 0.05). Greater precision of individual measurements was achieved via the large chamber than for the bLS (mean ± standard error of variance components: bLS half-hour measurements, 99.5 ± 325 mg CH4 s⁻¹ and 9.26 ± 20.6 mg N2O s⁻¹; large-chamber measurements, 99.6 ± 64.2 mg CH4 s⁻¹ and 8.18 ± 0.3 mg N2O s⁻¹). The large-chamber design is suitable for measurement of emissions from manure on pen surfaces, isolating these emissions from surrounding emission sources, including enteric emissions. © American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America.
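
The kind of exponential chamber model described can be illustrated with a generic well-mixed mass balance: V dC/dt = F + Q(C_amb − C), so C(t) = C_amb + (F/Q)(1 − exp(−Qt/V)). This is a sketch, not the authors' fitted model; only the chamber dimensions come from the abstract, and the air exchange, ambient concentration, and emission rate below are invented.

```python
import numpy as np

V = 4.5 * 4.5 * 1.5          # chamber volume, m^3 (from the stated dimensions)
Q = 0.02                     # total air exchange, m^3/s (assumed)
C_amb = 1.8                  # ambient CH4 concentration, mg/m^3 (assumed)
F = 5.0                      # surface emission rate, mg/s (assumed "true" value)

t = np.arange(0.0, 1800.0, 60.0)               # one reading per minute
C = C_amb + (F / Q) * (1.0 - np.exp(-Q * t / V))  # simulated chamber readings

# Recover the emission rate by least squares on the known exponential shape.
basis = (1.0 - np.exp(-Q * t / V)) / Q
F_hat = float(np.linalg.lstsq(basis[:, None], C - C_amb, rcond=None)[0][0])
print(F_hat)
```

With noiseless synthetic data the fit recovers the assumed emission rate exactly; with real readings the same linear fit yields F and its confidence interval.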

Relevance: 30.00%

Abstract:

In this paper, we present a low-complexity algorithm for detection in high-rate, non-orthogonal space-time block coded (STBC) large-multiple-input multiple-output (MIMO) systems that achieve high spectral efficiencies of the order of tens of bps/Hz. We also present a training-based iterative detection/channel estimation scheme for such large STBC MIMO systems. Our simulation results show that excellent bit error rate and nearness-to-capacity performance are achieved by the proposed multistage likelihood ascent search (M-LAS) detector in conjunction with the proposed iterative detection/channel estimation scheme at low complexities. The fact that we could show such good results for large STBCs like 16 × 16 and 32 × 32 STBCs from Cyclic Division Algebras (CDA) operating at spectral efficiencies in excess of 20 bps/Hz (even after accounting for the overheads meant for pilot-based training for channel estimation and turbo coding) establishes the effectiveness of the proposed detector and channel estimator. We decode perfect codes of large dimensions using the proposed detector. With the feasibility of such a low-complexity detection/channel estimation scheme, large-MIMO systems with tens of antennas operating at several tens of bps/Hz spectral efficiencies can become practical, enabling interesting high data rate wireless applications.
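
The core search principle behind likelihood ascent can be sketched in a few lines: starting from a cheap initial estimate, greedily flip the ±1 symbol that most reduces the ML cost ||y − Hx||² until no flip helps. This is a minimal one-bit sketch for BPSK, not the paper's full multistage M-LAS with STBC structure, and the channel, noise level, and sizes are invented.

```python
import numpy as np

def las_detect(H, y, x0):
    """Greedy one-bit likelihood ascent over +/-1 symbol vectors."""
    x = x0.copy()
    col_norm2 = np.sum(H * H, axis=0)          # ||h_k||^2 per column
    while True:
        g = H.T @ (y - H @ x)                  # gradient of the ML cost
        # Cost reduction from flipping symbol k: -4*x_k*g_k - 4*||h_k||^2.
        gain = -4.0 * x * g - 4.0 * col_norm2
        k = int(np.argmax(gain))
        if gain[k] <= 0.0:                     # local optimum: no flip helps
            return x
        x[k] = -x[k]

rng = np.random.default_rng(2)
n = 8
H = rng.standard_normal((n, n))
x_true = rng.choice([-1.0, 1.0], size=n)
y = H @ x_true + 0.1 * rng.standard_normal(n)

x0 = np.sign(H.T @ y)                          # matched-filter initial vector
x_hat = las_detect(H, y, x0)
cost = lambda s: float(np.sum((y - H @ s) ** 2))
print(cost(x_hat) <= cost(x0))                 # the search never increases cost
```

Each iteration costs one matrix-vector product, which is what keeps detectors of this family tractable at hundreds of dimensions.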

Relevance: 30.00%

Abstract:

A Batch Processing Machine (BPM) is one that processes a number of jobs simultaneously as a batch with common beginning and ending times. Also, a BPM, once started, cannot be interrupted (pre-emption is not allowed). This research is motivated by a BPM in the steel casting industry. There are three main stages in any steel casting operation, viz., the pre-casting stage, the casting stage and the post-casting stage. A quick overview of the entire process is shown in Figure 1. There are two BPMs: (1) the melting furnace in the pre-casting stage and (2) the Heat Treatment Furnace (HTF) in the post-casting stage of the steel casting manufacturing process. This study focuses on scheduling the latter, namely the HTF. The heat-treatment operation is one of the most important stages in steel casting. It determines the final properties that enable components to perform under demanding service conditions such as large mechanical loads, high temperature and anti-corrosive processing. In general, different types of castings have to undergo more than one type of heat-treatment operation, and the total heat-treatment processing times therefore vary. For better control, castings are primarily classified into a number of job families based on alloy type, such as low-alloy and high-alloy castings. For technical reasons such as alloy type, temperature level and the expected combination of heat-treatment operations, castings from different families cannot be processed together in the same batch.
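
The family constraint can be made concrete with a simple batching heuristic: group jobs by alloy family, then fill capacity-limited batches within each family, a batch running as long as its longest member. This is an invented minimal sketch of the constraint structure, not the scheduling model developed in the study, and the job data and capacity are made up.

```python
def make_batches(jobs, capacity):
    """jobs: list of (family, size, hours) tuples.
    Returns batches as (family, member-jobs) pairs; families never mix."""
    by_family = {}
    for job in jobs:
        by_family.setdefault(job[0], []).append(job)
    batches = []
    for family, members in by_family.items():
        members.sort(key=lambda j: -j[2])      # longest processing time first
        current, load = [], 0
        for job in members:
            if current and load + job[1] > capacity:
                batches.append((family, current))
                current, load = [], 0
            current.append(job)
            load += job[1]
        if current:
            batches.append((family, current))
    return batches

jobs = [("low_alloy", 4, 8), ("low_alloy", 3, 6), ("high_alloy", 5, 12),
        ("high_alloy", 2, 10), ("low_alloy", 5, 7)]
batches = make_batches(jobs, capacity=8)
# Each batch occupies the furnace as long as its longest member job.
total = sum(max(j[2] for j in members) for _, members in batches)
print(len(batches), total)  # → 3 27
```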

Relevance: 30.00%

Abstract:

Non-orthogonal space-time block codes (STBC) from cyclic division algebras (CDA) are attractive because they can simultaneously achieve both high spectral efficiencies (the same spectral efficiency as in V-BLAST for a given number of transmit antennas) and full transmit diversity. Decoding of non-orthogonal STBCs with hundreds of dimensions has been a challenge. In this paper, we present a probabilistic data association (PDA) based algorithm for decoding non-orthogonal STBCs with large dimensions. Our simulation results show that the proposed PDA-based algorithm achieves near-SISO AWGN uncoded BER as well as near-capacity coded BER (within 5 dB of the theoretical capacity) for large non-orthogonal STBCs from CDA. We study the effect of spatial correlation on the BER, and show that the performance loss due to spatial correlation can be alleviated by providing more receive spatial dimensions. We report good BER performance when a training-based iterative decoding/channel estimation scheme is used (instead of assuming perfect channel knowledge) in channels with large coherence times. A comparison of the performances of the PDA algorithm and the likelihood ascent search (LAS) algorithm (reported in our recent work) is also presented.

Relevance: 30.00%

Abstract:

In this paper, we present a belief propagation (BP) based algorithm for decoding non-orthogonal space-time block codes (STBC) from cyclic division algebras (CDA) having large dimensions. The proposed approach involves message passing on a Markov random field (MRF) representation of the STBC MIMO system. Adoption of the BP approach to decode non-orthogonal STBCs of large dimensions has not been reported so far. Our simulation results show that the proposed BP-based decoding achieves performance increasingly close to SISO AWGN performance as the number of dimensions increases. In addition, it also achieves near-capacity turbo coded BER performance; e.g., with BP decoding of a 24 × 24 STBC from CDA using BPSK (i.e., 576 real dimensions) and a rate-1/2 turbo code (i.e., 12 bps/Hz spectral efficiency), coded BER performance within about 2.5 dB of the theoretical MIMO capacity is achieved.

Relevance: 30.00%

Abstract:

It is shown that the euclideanized Yukawa theory, with the Dirac fermion belonging to an irreducible representation of the Lorentz group, is not bounded from below. A one parameter family of supersymmetric actions is presented which continuously interpolates between the N = 2 SSYM and the N = 2 supersymmetric topological theory. In order to obtain a theory which is bounded from below and satisfies Osterwalder-Schrader positivity, the Dirac fermion should belong to a reducible representation of the Lorentz group and the scalar fields have to be reinterpreted as the extra components of a higher dimensional vector field.

Relevance: 30.00%

Abstract:

Nucleation at large metastability is still largely an unsolved problem, even though it is a problem of tremendous current interest, with wide-ranging practical value, from atmospheric research to materials science. It is now well accepted that the classical nucleation theory (CNT) fails to provide a qualitative picture and gives incorrect quantitative values for such quantities as activation-free energy barrier and supersaturation dependence of nucleation rate, especially at large metastability. In this paper, we present an alternative formalism to treat nucleation at large supersaturation by introducing an extended set of order parameters in terms of the kth largest liquid-like clusters, where k = 1 is the largest cluster in the system, k = 2 is the second largest cluster and so on. At low supersaturation, the size of the largest liquid-like cluster acts as a suitable order parameter. At large supersaturation, the free energy barrier for the largest liquid-like cluster disappears. We identify this supersaturation as the one at the onset of kinetic spinodal. The kinetic spinodal is system-size-dependent. Beyond kinetic spinodal many clusters grow simultaneously and competitively and hence the nucleation and growth become collective. In order to describe collective growth, we need to consider the full set of order parameters. We derive an analytic expression for the free energy of formation of the kth largest cluster. The expression predicts that, at large metastability (beyond kinetic spinodal), the barrier of growth for several largest liquid-like clusters disappears, and all these clusters grow simultaneously. The approach to the critical size occurs by barrierless diffusion in the cluster size space. The expression for the rate of barrier crossing predicts weaker supersaturation dependence than what is predicted by CNT at large metastability. Such a crossover behavior has indeed been observed in recent experiments (but eluded an explanation till now). 
In order to understand the large numerical discrepancy between simulation predictions and experimental results, we carried out a study of the dependence on the range of intermolecular interactions of both the surface tension of an equilibrium planar gas-liquid interface and the free energy barrier of nucleation. Both are found to depend significantly on the range of interaction for the Lennard-Jones potential, both in two and three dimensions. The value of surface tension and also the free energy difference between the gas and the liquid phase increase significantly and converge only when the range of interaction is extended beyond 6-7 molecular diameters. We find, with the full range of interaction potential, that the surface tension shows only a weak dependence on supersaturation, so the reason for the breakdown of CNT (with simulated values of surface tension and free energy gap) cannot be attributed to the supersaturation dependence of surface tension. This remains an unsettled issue at present because of the use of the value of surface tension obtained at coexistence.
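
The classical baseline that the abstract argues breaks down can be stated compactly. In CNT the free energy (in kT units) of an n-molecule cluster is ΔG(n) = −nΔμ + a·n^(2/3), where Δμ is the supersaturation driving force and a is a surface-term prefactor; the barrier is the maximum of ΔG. The sketch below uses an invented value of a purely to show the barrier collapsing as supersaturation grows, consistent with the kinetic-spinodal picture.

```python
import numpy as np

a = 5.0  # assumed surface-term prefactor (kT units)

def barrier(dmu, nmax=100000):
    """Maximum of dG(n) = -n*dmu + a*n**(2/3) over cluster sizes n."""
    n = np.arange(1, nmax, dtype=float)
    dG = -n * dmu + a * n ** (2.0 / 3.0)
    return float(dG.max())

# The nucleation barrier falls steeply with supersaturation; beyond the kinetic
# spinodal the largest-cluster barrier effectively disappears and growth
# becomes collective, as discussed in the abstract.
print(barrier(0.1), barrier(0.5), barrier(2.0))
```

Analytically the maximum sits at n*^(1/3) = 2a/(3Δμ) with ΔG* = (a/3)·n*^(2/3), so the barrier scales as 1/Δμ², which is the steep dependence the discrete evaluation reproduces.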

Relevance: 30.00%

Abstract:

Although the sunspots migrate towards the equator, the large-scale weak diffuse magnetic fields of the Sun migrate poleward with the solar cycle, the polar field reversing at the time of the sunspot maxima. We apply the vector model of Dikpati and Choudhuri (1994, Paper I) to fit these observations. The dynamo layer at the base of the convection zone is taken to be the source of the diffuse field, which is then evolved in the convection zone subject to meridional circulation and turbulent diffusion. We find that the longitudinally averaged observational data can be fitted reasonably well both for positive and negative values of the alpha-effect by adjusting the subsurface meridional flow suitably. The model will be extended in a future paper to include the decay of active regions as an extra source of the diffuse field, which may be necessary to explain the probable phase lag between B-tau and B-phi at lower latitudes.

Relevance: 30.00%

Abstract:

We discuss a many-body Hamiltonian with two- and three-body interactions in two dimensions introduced recently by Murthy, Bhaduri and Sen. Apart from an analysis of some exact solutions in the many-body system, we analyse in detail the two-body problem which is completely solvable. We show that the solution of the two-body problem reduces to solving a known differential equation due to Heun. We show that the two-body spectrum becomes remarkably simple for large interaction strengths and the level structure resembles that of the Landau levels. We also clarify the 'ultraviolet' regularization which is needed to define an inverse-square potential properly and discuss its implications for our model.

Relevance: 30.00%

Abstract:

In this paper, we consider the application of belief propagation (BP) to achieve near-optimal signal detection in large multiple-input multiple-output (MIMO) systems at low complexities. Large-MIMO architectures based on spatial multiplexing (V-BLAST) as well as non-orthogonal space-time block codes (STBC) from cyclic division algebras (CDA) are considered. We adopt graphical models based on Markov random fields (MRF) and factor graphs (FG). In the MRF-based approach, we use pairwise compatibility functions even though the graphical models of MIMO systems are fully/densely connected. In the FG approach, we employ a Gaussian approximation (GA) of the multi-antenna interference, which significantly reduces the complexity while achieving very good performance for large dimensions. We show that (i) both the MRF- and FG-based BP approaches exhibit large-system behavior, where performance approaches the optimum as the number of dimensions increases, and (ii) damping of messages/beliefs significantly improves the bit error performance.
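
The Gaussian-approximation idea can be illustrated in one shot: for each BPSK symbol, lump the interference from all other transmit antennas into a single zero-mean Gaussian term and read an LLR off the matched-filter statistic, giving O(n) work per symbol instead of exponential enumeration. This sketch omits the iterative message passing and damping of the full FG-based detector, and the dimensions and SNR are invented.

```python
import numpy as np

def ga_llrs(H, y, sigma2):
    """Per-symbol LLRs for +/-1 symbols under a Gaussian interference model."""
    z = H.T @ y                          # matched-filter outputs
    R = H.T @ H
    n = H.shape[1]
    llrs = np.empty(n)
    for k in range(n):
        # z_k = R_kk x_k + sum_{j!=k} R_kj x_j + noise; for equiprobable +/-1
        # symbols the interference sum has mean 0 and variance sum_{j!=k} R_kj^2.
        var = sigma2 * R[k, k] + np.sum(R[k, :] ** 2) - R[k, k] ** 2
        llrs[k] = 2.0 * R[k, k] * z[k] / var
    return llrs

rng = np.random.default_rng(3)
m, n = 128, 32                           # extra receive dimensions help the GA
H = rng.standard_normal((m, n)) / np.sqrt(m)
x = rng.choice([-1.0, 1.0], size=n)
sigma2 = 0.01
y = H @ x + np.sqrt(sigma2) * rng.standard_normal(m)
x_hat = np.sign(ga_llrs(H, y, sigma2))
print(float(np.mean(x_hat == x)))        # fraction of bits recovered
```

Iterating this step with soft symbol estimates and damped updates is what turns the one-shot approximation into the message-passing detector the abstract describes.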