969 results for Massive Parallelization


Relevance: 20.00%

Abstract:

The progenitors of many Type II core-collapse supernovae (SNe) have now been identified directly on pre-discovery imaging. Here, we present an extensive search for the progenitors of Type Ibc SNe in all available pre-discovery imaging since 1998. There are 12 Type Ibc SNe with no detections of progenitors in either deep ground-based or Hubble Space Telescope archival imaging. The deepest absolute BVR magnitude limits are between -4 and -5 mag. We compare these limits with the observed Wolf-Rayet population in the Large Magellanic Cloud and estimate a 16 per cent probability that we have failed to detect such a progenitor by chance. Alternatively, the progenitors evolve significantly before core-collapse or we have underestimated the extinction towards the progenitors. Reviewing the relative rates and ejecta mass estimates from light-curve modelling of Ibc SNe, we find both incompatible with Wolf-Rayet stars with initial masses >25 M⊙ being the only progenitors. We present binary evolution models that fit these observational constraints. Stars in binaries with initial masses ≲ 20 M⊙ lose their hydrogen envelopes in binary interactions to become low-mass helium stars. They retain a low-mass hydrogen envelope until ≈10⁴ yr before core-collapse; hence, it is not surprising that Galactic analogues have been difficult to identify.

Relevance: 20.00%

Abstract:

In an early-type, massive star binary system, X-ray bright shocks result from the powerful collision of stellar winds driven by radiation pressure on spectral line transitions. We examine the influence of the X-rays from the wind-wind collision shocks on the radiative driving of the stellar winds using steady-state models that include a parameterized line force with X-ray ionization dependence. Our primary result is that X-ray radiation from the shocks inhibits wind acceleration and can lead to a lower pre-shock velocity, and a correspondingly lower shocked plasma temperature, yet the intrinsic X-ray luminosity of the shocks, L_X, remains largely unaltered, with the exception of a modest increase at small binary separations. Due to the feedback loop between the ionizing X-rays from the shocks and the wind driving, we term this scenario self-regulated shocks. This effect is found to greatly increase the range of binary separations at which a wind-photosphere collision is likely to occur in systems where the momenta of the two winds are significantly different. Furthermore, the excessive levels of X-ray ionization close to the shocks completely suppress the line force, and we suggest that this may render radiative braking less effective. Comparisons of model results against observations reveal reasonable agreement in terms of log(L_X/L_bol). The inclusion of self-regulated shocks improves the match for kT values in roughly equal wind momenta systems, but there is a systematic offset for systems with unequal wind momenta (if considered to be a wind-photosphere collision).

Relevance: 20.00%

Abstract:

This paper investigates the achievable sum-rate of massive multiple-input multiple-output (MIMO) systems in the presence of channel aging. For the uplink, by assuming that the base station (BS) deploys maximum ratio combining (MRC) or zero-forcing (ZF) receivers, we present tight closed-form lower bounds on the achievable sum-rate for both receivers with aged channel state information (CSI). In addition, the benefit of implementing channel prediction methods on the sum-rate is examined, and closed-form sum-rate lower bounds are derived. Moreover, the impact of channel aging and channel prediction on the power scaling law is characterized. Extensions to the downlink and multi-cell scenarios are also considered. It is found that, for a system with or without channel prediction, the transmit power of each user can be scaled down at most by 1/√M (where M is the number of BS antennas), which indicates that aged CSI does not degrade the power scaling law and channel prediction does not enhance it; instead, these effects degrade or enhance the achievable sum-rate through the effective signal-to-interference-and-noise ratio, respectively.
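
A rough sense of this power-scaling behaviour can be conveyed with a short Monte Carlo sketch. The single-cell MRC setup, the Gauss-Markov aging factor rho, and the pilot length tau below are illustrative assumptions rather than the paper's exact system model; the point is only that the sum-rate does not collapse when every user's transmit power is cut as Eu/√M.

```python
import numpy as np

rng = np.random.default_rng(1)
cn = lambda *s: (rng.standard_normal(s) + 1j * rng.standard_normal(s)) / np.sqrt(2)

def mrc_sum_rate(M, K=5, Eu=1.0, rho=0.9, tau=5, trials=300):
    """Uplink sum rate with MRC built from a noisy, aged channel estimate,
    while each user's transmit power is cut down as Eu / sqrt(M)."""
    p = Eu / np.sqrt(M)                                      # 1/sqrt(M) power scaling
    total = 0.0
    for _ in range(trials):
        H_prev = cn(M, K)                                    # channel at pilot time
        H_now = rho * H_prev + np.sqrt(1 - rho**2) * cn(M, K)  # aged (current) channel
        H_hat = H_prev + cn(M, K) / np.sqrt(tau * p)         # pilot-based estimate
        for k in range(K):
            a = H_hat[:, k]                                  # MRC combining vector
            sig = p * np.abs(np.vdot(a, H_now[:, k]))**2
            intf = p * sum(np.abs(np.vdot(a, H_now[:, j]))**2 for j in range(K) if j != k)
            total += np.log2(1 + sig / (intf + np.linalg.norm(a)**2))  # unit noise power
    return total / trials

for M in (50, 200, 800, 3200):
    print(f"M = {M:4d}: sum rate ≈ {mrc_sum_rate(M):.2f} bit/s/Hz")
# The printed sum rate does not vanish as M grows even though per-user power shrinks,
# consistent with the 1/sqrt(M) scaling law under imperfect (aged) CSI.
```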

Relevance: 20.00%

Abstract:

This letter investigates the uplink spectral efficiency (SE) of a two-tier cellular network, where massive multiple-input multiple-output macro base stations are overlaid with dense small cells. Macro user equipments (MUEs) and small cells, each serving a single user equipment, are uniformly scattered and modeled as two independent homogeneous Poisson point processes. By applying stochastic geometry, we analyze the SE of the multiuser uplink at a macro base station that employs a zero-forcing detector, and we obtain a novel lower bound as well as its approximation. From this simple and near-exact analytical expression, we observe that the best way to improve the SE is to increase the MUE density and the number of base station antennas together, rather than increasing either one individually. Furthermore, a large path loss exponent has a positive effect on the SE because of the reduced aggregate interference.
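
As an illustration of this setup (not a reproduction of the letter's analytical bound), the sketch below runs a Monte Carlo estimate of the per-MUE uplink SE at one macro BS with a ZF detector, with served MUEs uniform in the cell and small-cell interferers drawn from a PPP in the surrounding region; the densities, powers, minimum distance, and path-loss exponent are placeholder values.

```python
import numpy as np

rng = np.random.default_rng(2)
cn = lambda *s: (rng.standard_normal(s) + 1j * rng.standard_normal(s)) / np.sqrt(2)

def zf_uplink_se(M=64, K=8, cell_radius=500.0, lam_sc=1e-5, alpha=3.7,
                 p_mue=1.0, p_sc=1.0, noise=1e-10, trials=200):
    """Monte Carlo per-MUE uplink spectral efficiency at a macro BS using a
    zero-forcing detector; small-cell interferers form a PPP around the cell."""
    se = []
    for _ in range(trials):
        # Served MUEs: uniform over the cell disc (35 m minimum distance)
        r = np.sqrt(rng.uniform(35.0**2, cell_radius**2, K))
        H = cn(M, K) * np.sqrt(r**-alpha)              # columns: served MUE channels
        W = H @ np.linalg.inv(H.conj().T @ H)          # ZF combining matrix
        # Interfering small-cell UEs: PPP over an annulus outside the cell
        area = np.pi * ((5 * cell_radius)**2 - cell_radius**2)
        n_i = rng.poisson(lam_sc * area)
        d = np.sqrt(rng.uniform(cell_radius**2, (5 * cell_radius)**2, n_i))
        F = cn(M, n_i) * np.sqrt(d**-alpha)            # interferer channels
        for k in range(K):
            w = W[:, k]                                # w^H h_j = delta_jk by construction
            interf = p_sc * np.sum(np.abs(w.conj() @ F)**2)
            sinr = p_mue / (interf + noise * np.linalg.norm(w)**2)
            se.append(np.log2(1 + sinr))
    return np.mean(se)

print(f"average per-MUE SE ≈ {zf_uplink_se():.2f} bit/s/Hz")
```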

Relevance: 20.00%

Abstract:

We investigate the cell coverage optimization problem for the massive multiple-input multiple-output (MIMO) uplink. By deploying tilt-adjustable antenna arrays at the base stations, cell coverage optimization becomes a promising technique that strikes a compromise between covering cell-edge users and suppressing pilot contamination. We formulate the optimization problem as maximizing the cell throughput, which is shown to be determined mainly by the user distribution within several key geometrical regions. The formulated problem is then applied to different example scenarios: for a network with hexagonal cells and uniformly distributed users, we derive an analytical lower bound on the ergodic throughput in the cell of interest, based on which it is shown that the optimal choice of cell coverage ensures that the coverage of different cells does not overlap; for a more generic network with sectoral cells and non-uniformly distributed users, we propose an analytical approximation of the ergodic throughput. A practical coverage optimization algorithm is then proposed, in which the optimal solution is obtained through a simple one-dimensional line search within a confined search region. Our numerical results show that the proposed coverage optimization method greatly increases the system throughput in macrocells for massive MIMO uplink transmission, compared with traditional schemes in which the cell coverage is fixed.
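
Since the final step is a one-dimensional line search within a confined region, a golden-section search is one standard way to carry it out. The throughput function below is a made-up unimodal stand-in, not the paper's analytical bound; only the search routine itself is meant to illustrate the procedure.

```python
import numpy as np

def golden_section_max(f, lo, hi, tol=1e-3):
    """One-dimensional line search (golden-section) for the coverage value
    that maximizes a throughput function f over the confined region [lo, hi]."""
    gr = (np.sqrt(5) - 1) / 2
    a, b = lo, hi
    c, d = b - gr * (b - a), a + gr * (b - a)
    while b - a > tol:
        if f(c) > f(d):
            b, d = d, c
            c = b - gr * (b - a)
        else:
            a, c = c, d
            d = a + gr * (b - a)
    return (a + b) / 2

# Illustrative stand-in for the per-cell ergodic throughput as a function of the
# normalized coverage radius r: covering more edge users vs. extra pilot
# contamination; the true expression would come from the paper's analysis.
throughput = lambda r: r**1.5 * np.exp(-2.0 * r)
r_opt = golden_section_max(throughput, 0.1, 1.0)
print(f"optimal normalized coverage radius ≈ {r_opt:.3f}")
```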

Relevance: 20.00%

Abstract:

We consider a multi-pair two-way amplify-and-forward relaying system with a massive antenna array at the relay and estimated channel state information, assuming maximum-ratio combining/transmission processing. Closed-form approximations of the sum spectral efficiency are developed and simple analytical power scaling laws are presented, which reveal a fundamental trade-off between the transmit powers of each user/the relay and of each pilot symbol. Finally, the optimal power allocation problem is studied.
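
One common way to write MRC/MRT relay processing for multi-pair two-way relaying, assuming reciprocal channels and a pair-swapping permutation, is sketched below; the normalization, BPSK signalling and explicit self-interference cancellation are illustrative assumptions rather than the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(3)
cn = lambda *s: (rng.standard_normal(s) + 1j * rng.standard_normal(s)) / np.sqrt(2)

N, K = 128, 4                                   # relay antennas, user pairs (2K users)
H = cn(N, 2 * K)                                # user-to-relay channels (assumed reciprocal)
pair = np.arange(2 * K) ^ 1                     # user 2k exchanges data with user 2k+1
P = np.eye(2 * K)[pair]                         # pair-swapping permutation matrix

x = (rng.integers(0, 2, 2 * K) * 2 - 1).astype(complex)   # BPSK symbol from each user
y_r = H @ x + cn(N)                             # multiple-access phase at the relay

F = H.conj() @ P @ H.conj().T                   # MRC (H^H), swap pairs, then MRT (H*)
F /= np.linalg.norm(F @ y_r)                    # normalize relay transmit power to 1
y_u = H.T @ (F @ y_r) + cn(2 * K)               # broadcast phase over reciprocal channels

G = H.T @ F @ H                                 # effective end-to-end gain matrix
clean = y_u - np.diag(G) * x                    # each user subtracts its known self-interference
x_hat = np.sign((clean * G[np.arange(2 * K), pair].conj()).real)
print("partner symbols recovered:", np.array_equal(x_hat, x[pair].real))
```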

Relevance: 20.00%

Abstract:

We study multicarrier multiuser multiple-input multiple-output (MU-MIMO) systems in which the base station employs an asymptotically large number of antennas. We analyze a fully correlated channel matrix and provide a beam domain channel model, in which the channel gains are independent of the sub-carriers. For this model, we first derive a closed-form upper bound on the achievable ergodic sum-rate, based on which we develop asymptotically necessary and sufficient conditions for optimal downlink transmission that require only statistical channel state information at the transmitter. Furthermore, we propose a beam division multiple access (BDMA) transmission scheme that simultaneously serves multiple users via different beams. By selecting users within non-overlapping beams, the MU-MIMO channels can be equivalently decomposed into multiple single-user MIMO channels; this scheme significantly reduces the overhead of channel estimation as well as the processing complexity at the transceivers. For BDMA transmission, we work out an optimal pilot design criterion to minimize the mean square error (MSE) and provide optimal pilot sequences by utilizing Zadoff-Chu sequences. Simulations demonstrate the near-optimal performance of BDMA transmission and the advantages of the proposed pilot sequences.
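
The Zadoff-Chu construction itself is standard and easy to sketch; the paper's contribution lies in the pilot design criterion and the allocation across beams, which are not reproduced here. For an odd length N and a root u coprime with N, the sequence has constant amplitude and ideal periodic autocorrelation, which is what makes (shifted) ZC sequences attractive as pilots.

```python
import numpy as np

def zadoff_chu(u, N):
    """Root-u Zadoff-Chu sequence of odd length N (u and N must be coprime)."""
    n = np.arange(N)
    return np.exp(-1j * np.pi * u * n * (n + 1) / N)

N = 63                                   # odd, illustrative pilot length
z = zadoff_chu(5, N)

# Constant amplitude and zero cyclic autocorrelation at all nonzero lags:
acorr = np.fft.ifft(np.fft.fft(z) * np.conj(np.fft.fft(z)))
print(np.allclose(np.abs(z), 1.0))       # True: constant-envelope sequence
print(np.max(np.abs(acorr[1:])) < 1e-9)  # True: ideal periodic autocorrelation
```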

Relevance: 20.00%

Abstract:

Seafloor massive sulfides (SMS) contain commercially viable quantities of high-grade ores, making them attractive prospect sites for marine mining. SMS deposits may also contain hydrothermal vent ecosystems populated by high conservation value vent-endemic species. Responsible environmental management of these resources is best achieved by the adoption of a precautionary approach. Part of this precautionary approach involves the Environmental Impact Assessment (EIA) of exploration and exploitative activities at SMS deposits. The VentBase 2012 workshop provided a forum for stakeholders and scientists to discuss issues surrounding SMS exploration and exploitation. This forum recognised the need for a primer setting out the concepts underpinning EIA at SMS deposits. The purpose of this primer is to inform policy makers about EIA at SMS deposits in order to aid management decisions. The primer offers a basic introduction to SMS deposits and their associated ecology, and to the basic requirements for EIA at SMS deposits, including initial data and information scoping, environmental survey, and ecological risk assessment. © 2013 Elsevier Ltd.

Relevance: 20.00%

Abstract:

Strategies for mitigation of seafloor massive sulphide (SMS) extraction in the deep sea include establishment of suitable reference sites that allow for studies of natural environmental variability and that can serve as sources of larvae for re-colonisation of extracted hydrothermal fields. In this study, we characterize deep-sea vent communities in Manus Basin (Bismarck Sea, Papua New Guinea) and use macrofaunal data sets from a proposed reference site (South Su) and a proposed mine site (Solwara 1) to test the hypothesis that there was no difference in macrofaunal community structure between the sites. We used dispersion weighting to adjust taxa-abundance matrices to down-weight the contribution of contagious distributions of numerically abundant taxa. Faunal assemblages of 3 habitat types defined by biogenic taxa (2 provannid snails, Alviniconcha spp. and Ifremeria nautilei; and a sessile barnacle, Eochionelasmus ohtai) were distinct from one another and from the vent peripheral assemblage, but were not differentiable from mound-to-mound within a site or between sites. Mussel and tubeworm populations at South Su but not at Solwara 1 enhance the taxonomic and habitat diversity of the proposed reference site. © Inter-Research 2012.
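
For readers unfamiliar with dispersion weighting, the sketch below illustrates the basic idea under simplifying assumptions: each taxon's counts are divided by its within-group variance-to-mean ratio (index of dispersion), so clumped, numerically dominant taxa are down-weighted while near-Poisson taxa are left essentially unchanged. The published procedure additionally pools dispersion across groups and applies a permutation test, which this sketch omits.

```python
import numpy as np

def dispersion_weight(counts, groups):
    """Down-weight clumped (over-dispersed) taxa: divide each taxon's counts by
    its average within-group variance-to-mean ratio, floored at 1.
    `counts` is samples x taxa; `groups` labels replicate samples by site/habitat."""
    counts = np.asarray(counts, dtype=float)
    labels = np.asarray(groups)
    weighted = counts.copy()
    for t in range(counts.shape[1]):
        col = counts[:, t]
        disps = []
        for g in np.unique(labels):
            x = col[labels == g]
            if x.mean() > 0:
                disps.append(x.var(ddof=1) / x.mean())
        D = max(np.mean(disps), 1.0) if disps else 1.0
        weighted[:, t] = col / D
    return weighted

# Tiny illustration: taxon 0 is highly clumped and gets strongly down-weighted,
# taxon 1 is roughly Poisson-distributed and is left unchanged.
counts = [[120, 3], [0, 4], [2, 2], [95, 5]]
print(dispersion_weight(counts, ["A", "A", "B", "B"]).round(1))
```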

Relevance: 20.00%

Abstract:

Seafloor massive sulfide (SMS) mining will likely occur at hydrothermal systems in the near future. Alongside their mineral wealth, SMS deposits also have considerable biological value. Active SMS deposits host endemic hydrothermal vent communities, whilst inactive deposits support communities of deep water corals and other suspension feeders. Mining activities are expected to remove all large organisms and suitable habitat in the immediate area, making vent endemic organisms particularly at risk from habitat loss and localised extinction. As part of environmental management strategies designed to mitigate the effects of mining, areas of seabed need to be protected to preserve biodiversity that is lost at the mine site and to preserve communities that support connectivity among populations of vent animals in the surrounding region. These "set-aside" areas need to be biologically similar to the mine site and be suitably connected, mostly by transport of larvae, to neighbouring sites to ensure exchange of genetic material among remaining populations. Establishing suitable set-asides can be a formidable task for environmental managers; however, the application of genetic approaches can aid set-aside identification, suitability assessment and monitoring. There are many genetic tools available, including analysis of mitochondrial DNA (mtDNA) sequences (e.g. COI or other suitable mtDNA genes) and appropriate nuclear DNA markers (e.g. microsatellites, single nucleotide polymorphisms), environmental DNA (eDNA) techniques and microbial metagenomics. When used in concert with traditional biological survey techniques, these tools can help to identify species, assess the genetic connectivity among populations and assess the diversity of communities. How these techniques can be applied to set-aside decision making is discussed and recommendations are made for the genetic characteristics of set-aside sites. A checklist for environmental regulators forms a guide to aid decision making on the suitability of set-aside design and assessment using genetic tools. This non-technical primer document represents the views of participants in the VentBase 2014 workshop.

Relevance: 20.00%

Abstract:

The introduction of parallel processing architectures allowed the real-time implementation of more sophisticated control algorithms with tighter specifications in terms of sampling time. However, to take advantage of the processing power of these architectures, the control engineer, due to the lack of appropriate tools, must spend a considerable amount of time on the parallelization of the control algorithm.

Relevance: 20.00%

Abstract:

Dissertation presented to obtain the PhD degree in Biology/Molecular Biology by Universidade Nova de Lisboa, Instituto de Tecnologia Química e Biológica

Relevance: 20.00%

Abstract:

Teaching and learning computer programming is as challenging as it is difficult. Assessing students' work and providing individualised feedback to all of them is time-consuming and error-prone for teachers, and frequently involves a delay. Existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are emerging, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions about monitoring student progress and providing timely feedback. This paper provides a conceptual design model for a computer programming learning environment. The environment uses the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. The model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
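
As one concrete, entirely hypothetical piece of such an environment, the sketch below shows a minimal program-evaluator service that runs a submission against stored test cases and returns structured, immediate feedback; the class and method names are illustrative and do not come from the paper's design model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Feedback:
    """Structured result returned by a program-evaluator service."""
    passed: int
    total: int
    messages: List[str] = field(default_factory=list)

class ProgramEvaluator:
    """Minimal evaluator: runs a submission against stored test cases and
    produces immediate, individualised feedback for the learning portal."""
    def __init__(self, tests):
        self.tests = tests            # list of (input_args, expected_output)

    def evaluate(self, func) -> Feedback:
        fb = Feedback(passed=0, total=len(self.tests))
        for args, expected in self.tests:
            try:
                got = func(*args)
            except Exception as exc:  # report runtime errors as feedback, not crashes
                fb.messages.append(f"{args}: raised {exc!r}")
                continue
            if got == expected:
                fb.passed += 1
            else:
                fb.messages.append(f"{args}: expected {expected}, got {got}")
        return fb

# Usage: a student submission for a toy exercise, evaluated with instant feedback.
evaluator = ProgramEvaluator(tests=[((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)])
print(evaluator.evaluate(lambda a, b: a + b))
```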

Relevance: 20.00%

Abstract:

In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT)/Inverse Fast Fourier Transform (IFFT), and proper techniques for maximizing the FFT/IFFT execution speed, such as pipelining or parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUTs) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures that best apply to the various FFT/IFFT algorithms, along with their ability to exploit parallel processing with minimal data dependencies in the FFT/IFFT calculations. An interesting approach also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the ability of the FFT/IFFT hardware to support the parallelism provided by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also provide high performance, by utilizing a specialized FFT/IFFT hardware architecture that can exploit the parallelism of the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in a polar coordinate system, the use of sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
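
The LUT-based DFT idea can be illustrated with a short sketch: the N twiddle factors are precomputed once and indexed by (n·k) mod N, and the N products for each output bin are then summed, which in hardware would map onto the parallel adder tree mentioned above. The polar-coordinate multiplications and IRAM-specific details are not modeled.

```python
import numpy as np

def dft_with_lut(x):
    """Direct DFT that replaces on-the-fly twiddle computation with a
    precomputed cosine/sine lookup table indexed by (n*k) mod N."""
    N = len(x)
    lut = np.exp(-2j * np.pi * np.arange(N) / N)     # N twiddle factors W_N^0..W_N^{N-1}
    X = np.empty(N, dtype=complex)
    for k in range(N):
        idx = (np.arange(N) * k) % N                 # LUT index for each input symbol
        # The N products can be summed with a parallel (tree) reduction in hardware;
        # numpy's sum stands in for that adder tree here.
        X[k] = np.sum(x * lut[idx])
    return X

x = np.random.default_rng(4).standard_normal(16) + 0j
print(np.allclose(dft_with_lut(x), np.fft.fft(x)))   # True
```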