971 results for space-based lasers


Relevance: 30.00%

Abstract:

Vector space models (VSMs) represent word meanings as points in a high-dimensional space. VSMs are typically created from large text corpora, and so represent word semantics as observed in text. We present a new algorithm (JNNSE) that can incorporate a measure of semantics not previously used to create VSMs: brain activation data recorded while people read words. The resulting model takes advantage of the complementary strengths and weaknesses of corpus and brain activation data to give a more complete representation of semantics. Evaluations show that the model 1) matches a behavioral measure of semantics more closely, 2) can be used to predict corpus data for unseen words, and 3) has predictive power that generalizes across brain imaging technologies and across subjects. We believe that the model is thus a more faithful representation of mental vocabularies.
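
The core idea of fusing two views of word meaning through one shared representation can be sketched as a joint matrix factorization: corpus vectors X and brain-activation vectors Y are reconstructed from a common latent matrix A with view-specific bases Dx and Dy. This is a minimal, illustrative toy with plain gradient descent on synthetic data; the names, dimensions, and loss are assumptions, not the actual JNNSE formulation.

```python
import random

random.seed(0)
N_WORDS, K, DIM_X, DIM_Y = 6, 2, 4, 3  # words, latent dims, view dims

def randm(r, c):
    return [[random.random() for _ in range(c)] for _ in range(r)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def sub_scaled(A, G, lr):
    return [[a - lr * g for a, g in zip(ra, rg)] for ra, rg in zip(A, G)]

def residual(P, D, T):
    # Reconstruction error P @ D - T for one view.
    PD = matmul(P, D)
    return [[PD[i][j] - T[i][j] for j in range(len(T[0]))]
            for i in range(len(T))]

def sq_sum(R):
    return sum(v * v for row in R for v in row)

# Synthetic data: both views generated from one shared latent truth.
A_true = randm(N_WORDS, K)
X = matmul(A_true, randm(K, DIM_X))   # "corpus" view
Y = matmul(A_true, randm(K, DIM_Y))   # "brain" view

A, Dx, Dy = randm(N_WORDS, K), randm(K, DIM_X), randm(K, DIM_Y)
lr = 0.01
loss_start = sq_sum(residual(A, Dx, X)) + sq_sum(residual(A, Dy, Y))
for _ in range(500):
    Rx, Ry = residual(A, Dx, X), residual(A, Dy, Y)
    # Gradient of ||A Dx - X||^2 + ||A Dy - Y||^2 w.r.t. each factor.
    gA = [[a + b for a, b in zip(ra, rb)]
          for ra, rb in zip(matmul(Rx, transpose(Dx)),
                            matmul(Ry, transpose(Dy)))]
    gDx, gDy = matmul(transpose(A), Rx), matmul(transpose(A), Ry)
    A = sub_scaled(A, gA, lr)
    Dx = sub_scaled(Dx, gDx, lr)
    Dy = sub_scaled(Dy, gDy, lr)
loss_end = sq_sum(residual(A, Dx, X)) + sq_sum(residual(A, Dy, Y))
```

After training, each row of A is a latent word vector informed by both views, which is the sense in which the two data sources complement each other.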

Relevance: 30.00%

Abstract:

The recent development of the massive multiple-input multiple-output (MIMO) paradigm has been extensively based on the pursuit of favorable propagation: in the asymptotic limit, the channel vectors become nearly orthogonal and inter-user interference tends to zero [1]. In this context, previous studies have considered a fixed inter-antenna distance, which implies an increasing array aperture as the number of elements increases. Here, we focus on a practical, space-constrained topology, where an increase in the number of antenna elements in a fixed total space imposes an inversely proportional decrease in the inter-antenna distance. Our analysis shows that, contrary to existing studies, inter-user interference does not vanish in the massive MIMO regime, thereby creating a saturation effect on the achievable rate.
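
The saturation effect can be reproduced numerically with uniform linear array steering vectors: as N grows with a fixed aperture, the normalized inter-user correlation settles at a nonzero value, whereas with fixed half-wavelength spacing it decays toward zero. The aperture (5 wavelengths) and user angles (30° and 20°) below are illustrative assumptions, not values from the paper.

```python
import cmath
import math

def correlation(n_elems, spacing_wl, th1_deg, th2_deg):
    # Normalized steering-vector correlation |a(th1)^H a(th2)| / N for a
    # uniform linear array with element spacing given in wavelengths.
    delta = math.sin(math.radians(th1_deg)) - math.sin(math.radians(th2_deg))
    acc = sum(cmath.exp(2j * math.pi * n * spacing_wl * delta)
              for n in range(n_elems))
    return abs(acc) / n_elems

APERTURE_WL = 5.0  # fixed total array length, in wavelengths (assumed)

# Space-constrained array: spacing shrinks as 1/(N-1), aperture fixed.
rho_constrained = {n: correlation(n, APERTURE_WL / (n - 1), 30, 20)
                   for n in (16, 128, 1024)}
# Conventional assumption: fixed half-wavelength spacing, growing aperture.
rho_halfwave = {n: correlation(n, 0.5, 30, 20) for n in (16, 128, 1024)}
```

For the fixed-aperture case the correlation converges to a sinc-like constant set by the aperture and angle separation, so interference (and hence the rate) saturates rather than vanishing.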

Relevance: 30.00%

Abstract:

Ultra-intense lasers can now routinely accelerate kiloampere ion beams. These unique particle-beam sources could impact many societal (e.g., proton therapy or fuel recycling) and fundamental (e.g., neutron probing) domains. However, this requires overcoming the beam's angular divergence at the source. This has been attempted either with large-scale conventional setups or with compact plasma techniques, which, however, are restricted to short (<1 mm) focusing distances or exhibit chromatic behavior. Here, we show that by exploiting laser-triggered, long-lasting (>50 ps), thermoelectric, multi-megagauss surface magnetic (B) fields, compact capturing and focusing of a diverging laser-driven multi-MeV ion beam can be achieved over a wide range of ion energies within a 5° acceptance angle.

Relevance: 30.00%

Abstract:

The design cycle for complex special-purpose computing systems is extremely costly and time-consuming. It involves a multiparametric design space exploration for optimization, followed by design verification. Designers of special-purpose VLSI implementations often need to explore parameters, such as optimal bitwidth and data representation, through time-consuming Monte Carlo simulations. A prominent example of this simulation-based exploration process is the design of decoders for error correcting systems, such as the Low-Density Parity-Check (LDPC) codes adopted by modern communication standards, which involves thousands of Monte Carlo runs for each design point. Currently, high-performance computing offers a wide set of acceleration options that range from multicore CPUs to Graphics Processing Units (GPUs) and Field Programmable Gate Arrays (FPGAs). Exploiting diverse target architectures typically means developing multiple code versions, often using distinct programming paradigms. In this context, we evaluate the concept of retargeting a single OpenCL program to multiple platforms, thereby significantly reducing design time. A single OpenCL-based parallel kernel is used without modifications or code tuning on multicore CPUs, GPUs, and FPGAs. We use SOpenCL (Silicon to OpenCL), a tool that automatically converts OpenCL kernels to RTL, to introduce FPGAs as a potential platform for efficiently executing simulations coded in OpenCL. We use LDPC decoding simulations as a case study. Experimental results were obtained by testing a variety of regular and irregular LDPC codes that range from short/medium-length (e.g., 8,000-bit) to long-length (e.g., 64,800-bit) DVB-S2 codes. We observe that, depending on the design parameters to be simulated and on the dimension and phase of the design, either the GPU or the FPGA may be the more convenient platform, providing different acceleration factors over conventional multicore CPUs.
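
The sample-decode-count loop at the heart of such simulations can be shown with a deliberately tiny stand-in: estimating the bit error rate of a rate-1/3 repetition code with majority-vote decoding over a binary symmetric channel. A real LDPC simulation sweeps the same kind of loop over code parameters and channel conditions, just with a far heavier decoder kernel; all numbers here are illustrative.

```python
import random

def ber_repetition3(p_flip, n_bits, seed=1):
    # Monte Carlo estimate of the decoded bit error rate: transmit each
    # bit three times over a BSC with crossover probability p_flip and
    # decode by majority vote.
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        flips = sum(rng.random() < p_flip for _ in range(3))
        if flips >= 2:          # majority vote decodes incorrectly
            errors += 1
    return errors / n_bits

# Two design points of the "sweep"; analytically BER = 3p^2(1-p) + p^3,
# i.e. roughly 0.028 at p = 0.10.
ber_high = ber_repetition3(0.10, 20000)
ber_low = ber_repetition3(0.01, 20000)
```

Each design point is an independent batch of trials, which is why the workload maps so naturally onto a data-parallel OpenCL kernel.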

Relevance: 30.00%

Abstract:

In this paper, a multiloop robust control strategy is proposed based on H∞ control and a partial least squares (PLS) model (H∞_PLS) for multivariable chemical processes. It is developed especially for ill-conditioned and non-square multivariable plants. The advantage of PLS is that it extracts the strongest relationship between the input and the output variables in the reduced space of the latent-variable model rather than in the original high-dimensional variable space. Without conventional decouplers, the dynamic PLS framework automatically decomposes the MIMO process into multiple single-loop systems in the PLS subspace, so that the controller design can be simplified. Since plant/model mismatch is almost inevitable in practical applications, to enhance the robustness of the control system the controllers are designed in the PLS latent subspace based on the H∞ mixed sensitivity problem. The feasibility and effectiveness of the proposed approach are illustrated by simulation results for a distillation column and a mixing tank process. Comparisons between H∞_PLS control and conventional individual control (either H∞ control or PLS control alone) are also made.

Relevance: 30.00%

Abstract:

This paper proposes a continuous-time Markov chain (CTMC) based sequential analytical approach for composite generation and transmission system reliability assessment. The basic idea is to construct a CTMC model for the composite system and, based on this model, perform sequential analyses. Various kinds of reliability indices can be obtained, including expectation, variance, frequency, duration, and probability distribution. In order to reduce the dimension of the state space, the traditional CTMC modeling approach is modified by merging all high-order contingencies into a single state, which can be calculated by Monte Carlo simulation (MCS). A state-mergence technique is then developed to integrate all normal states, further reducing the dimension of the CTMC model. Moreover, a time-discretization method is presented for the CTMC model calculation. Case studies are performed on the RBTS and a modified IEEE 300-bus test system. The results indicate that sequential reliability assessment can be performed by the proposed approach. Compared with the traditional sequential Monte Carlo simulation method, the proposed method is more efficient, especially for small-scale or highly reliable power systems.
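
The smallest possible instance of this machinery is a single repairable component: a two-state CTMC with failure rate lam and repair rate mu, whose generator is time-discretized as P = I + Q·dt and iterated to the stationary distribution, then checked against the closed-form availability mu/(lam+mu). The rates and step size are illustrative assumptions, not values from the paper.

```python
lam, mu = 0.01, 0.1          # failure / repair rates per hour (assumed)
Q = [[-lam, lam],            # generator matrix; states: 0 = up, 1 = down
     [mu, -mu]]

dt = 0.1                     # discretization step: P = I + Q*dt
P = [[(1.0 if i == j else 0.0) + Q[i][j] * dt for j in range(2)]
     for i in range(2)]

pi = [1.0, 0.0]              # start in the up state
for _ in range(5000):        # iterate pi <- pi P toward stationarity
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

availability_exact = mu / (lam + mu)   # closed-form steady-state result
```

The same construction scales to composite systems, except that the state count explodes, which is exactly what the paper's contingency-merging and state-mergence techniques are designed to contain.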

Relevance: 30.00%

Abstract:

It is important to be able to assess the contribution of donor cells to the graft following bone marrow transplantation (BMT), as complete engraftment of marrow progenitors that can give rise to long-term donor-derived hemopoiesis may be important in long-term disease-free survival. The contribution of the donor marrow, both in terms of filling the marrow "space" created by the intense conditioning regimen and in its ability to mediate a graft-versus-leukemia effect, may be assessed by studying the kinetics of the engraftment process. As BMT involves repopulation of the host hemopoietic system with donor cells, recipients of allogeneic marrow are referred to as hemopoietic chimeras. A donor chimera is an individual who exhibits complete donor hemopoiesis, and one would expect donor chimerism to carry the best long-term prognosis. A patient in whom donor and recipient cells coexist in a stable fashion post-BMT, without hematological evidence of relapse or graft rejection, is referred to as a mixed chimera. Mixed chimerism may be a prelude to graft rejection or leukemic relapse; therefore, it is important to be able to monitor the presence of these cells in a precise manner.

Relevance: 30.00%

Abstract:

Environmental problems, especially climate change, have become a serious global issue awaiting solutions. In the construction industry, the concept of sustainable building is being developed to reduce greenhouse gas emissions. In this study, a building information modeling (BIM) based building design optimization method is proposed to help designers optimize their designs and improve buildings' sustainability. A revised particle swarm optimization (PSO) algorithm is applied to search for the trade-off between the life cycle costs (LCC) and life cycle carbon emissions (LCCE) of building designs. To validate the effectiveness and efficiency of this method, a case study of an office building is conducted in Hong Kong. The result of the case study shows that this method can enlarge the search space for optimal design solutions and shorten the processing time for optimal design results, helping designers deliver an economical and environmentally friendly design scheme.
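
The swarm search itself can be sketched in a few lines: each particle is a candidate design vector and the cost plays the role of a weighted LCC/LCCE objective. Here the cost is a toy quadratic with a known optimum, and the coefficients are standard textbook PSO values, not the paper's revised variant.

```python
import random

random.seed(3)

def cost(x):
    # Stand-in design objective with its optimum at (3, -1).
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

DIM, SWARM, ITERS = 2, 30, 200
W, C1, C2 = 0.7, 1.5, 1.5    # inertia, cognitive, social weights

pos = [[random.uniform(-10, 10) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]                 # per-particle best positions
pbest_cost = [cost(p) for p in pos]
g = min(range(SWARM), key=lambda i: pbest_cost[i])
gbest, gbest_cost = pbest[g][:], pbest_cost[g]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            # Velocity update: inertia + pull toward personal and global bests.
            vel[i][d] = (W * vel[i][d]
                         + C1 * random.random() * (pbest[i][d] - pos[i][d])
                         + C2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        c = cost(pos[i])
        if c < pbest_cost[i]:
            pbest[i], pbest_cost[i] = pos[i][:], c
            if c < gbest_cost:
                gbest, gbest_cost = pos[i][:], c
```

In the paper's setting the position vector would encode BIM design parameters and the cost would be evaluated from the model's LCC and LCCE estimates; the search loop is unchanged.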

Relevance: 30.00%

Abstract:

Surface plasmon polaritons usually exist on a few suitable plasmonic materials; however, nanostructured plasmonic metamaterials allow a much broader range of optical properties to be designed. Here, bottom-up and top-down nanostructuring are combined, creating hyperbolic metamaterial-based photonic crystals termed hyperbolic polaritonic crystals, allowing free-space access to the high spatial frequency modes supported by these metamaterials.

Relevance: 30.00%

Abstract:

Most traditional data mining algorithms struggle to cope efficiently with the sheer scale of data. In this paper, we propose a general framework to accelerate existing clustering algorithms on large-scale datasets that contain large numbers of attributes, items, and clusters. Our framework uses locality-sensitive hashing (LSH) to significantly reduce the cluster search space. We also theoretically prove that our framework has a guaranteed error bound in terms of clustering quality. The framework can be applied to any centroid-based clustering algorithm that assigns an object to the most similar cluster, and we adopt the popular K-Modes categorical clustering algorithm to show how the framework can be applied. We validated our framework with five synthetic datasets and a real-world Yahoo! Answers dataset. The experimental results demonstrate that our framework is able to speed up the existing clustering algorithm by factors of 2 to 6, while maintaining comparable cluster purity.
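
The search-space reduction idea can be illustrated with attribute sampling, which is a locality-sensitive hash family for Hamming similarity over categorical data: objects are bucketed by the values of a few randomly chosen attributes, and only objects sharing a bucket are ever compared. The data generator, attribute counts, and table counts below are assumptions for illustration, not the paper's exact scheme.

```python
import random
from collections import defaultdict
from itertools import combinations

random.seed(4)
N_ATTRS, N_PER_CLUSTER, NOISE = 8, 50, 0.1

def make_object(base, alphabet):
    # Copy a cluster's base pattern, mutating each attribute with prob NOISE.
    return tuple(random.choice(alphabet) if random.random() < NOISE else v
                 for v in base)

# Two synthetic clusters over disjoint categorical alphabets.
base_a = tuple(random.choice("abcd") for _ in range(N_ATTRS))
base_b = tuple(random.choice("wxyz") for _ in range(N_ATTRS))
objects = ([make_object(base_a, "abcd") for _ in range(N_PER_CLUSTER)]
           + [make_object(base_b, "wxyz") for _ in range(N_PER_CLUSTER)])

K_SAMPLED, N_TABLES = 3, 4
candidate_pairs = set()
for _ in range(N_TABLES):
    # One hash table: bucket by the values of K_SAMPLED random attributes.
    attrs = random.sample(range(N_ATTRS), K_SAMPLED)
    buckets = defaultdict(list)
    for idx, obj in enumerate(objects):
        buckets[tuple(obj[a] for a in attrs)].append(idx)
    for members in buckets.values():
        candidate_pairs.update(combinations(members, 2))

total_pairs = len(objects) * (len(objects) - 1) // 2
same_cluster = {(i, j) for i, j in combinations(range(len(objects)), 2)
                if (i < N_PER_CLUSTER) == (j < N_PER_CLUSTER)}
recall = len(candidate_pairs & same_cluster) / len(same_cluster)
reduction = 1 - len(candidate_pairs) / total_pairs
```

Multiple tables boost the chance that similar objects collide at least once, so recall stays high even though most dissimilar pairs are never examined.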

Relevance: 30.00%

Abstract:

This paper reports on the solubility and diffusivity of dissolved oxygen in a series of ionic liquids (ILs) based on the bis{(trifluoromethyl)sulfonyl}imide anion with a range of related alkyl- and ether-functionalised cyclic alkylammonium cations. Cyclic voltammetry has been used to observe the reduction of oxygen in the ILs at a microdisk electrode, and chronoamperometric measurements have then been applied to determine simultaneously both the concentration and the diffusion coefficient of oxygen in the different ILs. The viscosity of the ILs and the calculated molar volume and free volume are also reported. It is found that, within this class of ILs, the oxygen diffusivity generally increases with decreasing viscosity of the neat IL. An inverse relationship between oxygen solubility and IL free volume is reported for the two IL families, implying that oxygen is not simply occupying the available empty space. In addition, it is reported that introducing an ether group into the IL cation structure promotes the diffusivity of dissolved oxygen but reduces the solubility of the gas.

Relevance: 30.00%

Abstract:

Real-space grids are a powerful alternative for the simulation of electronic systems. One of the main advantages of the approach is the flexibility and simplicity of working directly in real space, where the different fields are discretized on a grid, combined with competitive numerical performance and great potential for parallelization. These properties constitute a great advantage when implementing and testing new physical models. Based on our experience with the Octopus code, in this article we discuss how the real-space approach has allowed for the recent development of new ideas for the simulation of electronic systems. Among these applications are approaches to calculate response properties, the modeling of photoemission, optimal control of quantum systems, the simulation of plasmonic systems, and the exact solution of the Schrödinger equation for low-dimensionality systems.
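
The flavor of the real-space-grid approach can be conveyed by the simplest possible example: the 1D ground state of a particle in a box (ħ = m = 1, box length 1), obtained by discretizing the kinetic operator with finite differences on a uniform grid and relaxing via imaginary-time propagation. The grid size and time step are assumptions chosen for a quick demonstration, not Octopus internals.

```python
import math

N = 60                     # interior grid points
h = 1.0 / (N + 1)          # grid spacing; psi = 0 on the box walls
dt = 1e-4                  # imaginary-time step (stability needs dt < h^2)

psi = [1.0] * N            # any start state overlapping the ground state

def apply_h(p):
    # Kinetic energy -1/2 d^2/dx^2 via a 3-point stencil, zero boundaries.
    out = []
    for i in range(N):
        left = p[i - 1] if i > 0 else 0.0
        right = p[i + 1] if i < N - 1 else 0.0
        out.append(-(left - 2.0 * p[i] + right) / (2.0 * h * h))
    return out

for _ in range(10000):
    hp = apply_h(psi)
    # Euler step of d psi / d tau = -H psi, then renormalize: excited
    # states decay fastest, leaving the ground state.
    psi = [p - dt * v for p, v in zip(psi, hp)]
    norm = math.sqrt(sum(p * p for p in psi) * h)
    psi = [p / norm for p in psi]

hp = apply_h(psi)
energy = sum(p * v for p, v in zip(psi, hp)) * h   # <psi|H|psi>
exact = math.pi ** 2 / 2                           # analytic ground energy
```

Everything here, stencils, propagation, and observables, acts locally on grid values, which is the structural reason the approach parallelizes well via domain decomposition.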

Relevance: 30.00%

Abstract:

High-power lasers have proven capable of producing high-energy γ-rays, charged particles, and neutrons, and of inducing all kinds of nuclear reactions. At ELI, studies with high-power lasers will for the first time enter new domains of power and intensity: 10 PW and 10^23 W/cm^2. While the development of laser-based radiation sources is the main focus at the ELI-Beamlines pillar of ELI, at ELI-NP the studies that will benefit from High Power Laser System pulses will focus on Laser Driven Nuclear Physics (this TDR, acronym LDNP, associated with the E1 experimental area), High Field Physics and QED (associated with the E6 area), and the fundamental research opened up by the unique combination of the two 10 PW laser pulses with a gamma beam provided by the Gamma Beam System (associated with the E7 area). The scientific case of the LDNP TDR encompasses studies of laser-induced nuclear reactions, aiming at a better understanding of nuclear properties and of nuclear reaction rates in laser-plasmas, as well as at the development of radiation-source characterization methods based on nuclear techniques. As an example of the proposed studies: the promise of achieving solid-state-density bunches of (very) heavy ions accelerated to about 10 MeV/nucleon through the RPA mechanism will be exploited to produce astrophysically highly relevant neutron-rich nuclei around the N~126 waiting point, using the sequential fission-fusion scheme, complementary to any other existing or planned method of producing radioactive nuclei.

The studies will be implemented predominantly in the E1 area of ELI-NP. However, many of them can, in a first stage, be performed in the E5 and/or E4 areas, where higher-repetition-rate laser pulses are available and the harsh X-ray and electromagnetic pulse (EMP) environments are less damaging than in E1.

A number of options are discussed throughout the document that have an important impact on the budget and the resources needed. Depending on the TDR review and subsequent project decisions, they may be taken into account for space reservation, while their detailed design and implementation will be postponed.

The present TDR is the result of contributions from several institutions engaged in nuclear physics and high-power laser research. A significant part of the proposed equipment can be designed, and afterwards built, only in close collaboration with (or by subcontracting to) some of these institutions. A Memorandum of Understanding (MOU) is currently under preparation with each of these key partners, as well as with others interested in participating in the design or in the future experimental program.

Relevance: 30.00%

Abstract:

Urothelial cancer (UC) is highly recurrent and can progress from a non-invasive (NMIUC) to a more aggressive muscle-invasive (MIUC) subtype that invades the muscle tissue layer of the bladder. We present a proof-of-principle study showing that network-based features of gene pairs can be used to improve classifier performance and the functional analysis of urothelial cancer gene expression data. In the first step of our procedure, each individual sample of a UC gene expression dataset is inflated with gene-pair expression ratios that are defined based on a given network structure. In the second step, an elastic net feature selection procedure for network-based signatures is applied to discriminate between NMIUC and MIUC samples. We performed a repeated random subsampling cross-validation in three independent datasets. The network signatures were characterized by a functional enrichment analysis and studied for the enrichment of known cancer genes. We observed that network-based gene signatures derived from meta-collections of protein-protein interaction (PPI) databases such as CPDB, and from the PPI databases HPRD and BioGrid, improved the classification performance compared to single-gene signatures. The network-based signatures derived from PPI databases showed a prominent enrichment of cancer genes (e.g., TP53, TRIM27 and HNRNPA2B1). We provide a novel integrative approach to large-scale gene expression analysis for the identification and development of novel diagnostic targets in bladder cancer. Further, our method allows cancer gene associations to be linked to network-based expression signatures that are not observed in gene-based expression signatures.
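
The first step, inflating a sample with one ratio feature per network edge, can be sketched directly. The toy expression values, the log2 form of the ratio, and the specific edges below are assumptions for illustration (TP53, TRIM27, and HNRNPA2B1 appear in the abstract; the pairings themselves are invented here); in the study the edges come from PPI databases such as CPDB, HPRD, and BioGrid.

```python
import math

# Hypothetical network edges used only to illustrate the construction.
ppi_edges = [("TP53", "MDM2"), ("TP53", "ATM"), ("TRIM27", "HNRNPA2B1")]

# Toy expression levels for one sample.
sample = {"TP53": 8.0, "MDM2": 4.0, "ATM": 2.0,
          "TRIM27": 1.0, "HNRNPA2B1": 16.0}

def inflate(expr, edges):
    # Keep the single-gene features and append a log2 ratio per edge.
    features = dict(expr)
    for a, b in edges:
        features[f"{a}/{b}"] = math.log2(expr[a] / expr[b])
    return features

feats = inflate(sample, ppi_edges)
```

The inflated feature vector (genes plus edge ratios) is what the elastic net then selects from, so network structure enters the classifier through the feature space rather than the learning algorithm.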

Relevance: 30.00%

Abstract:

The development of high-performance concretes in the early 1980s revealed that this particular class of cement-based materials is susceptible to curing problems. The effects of autogenous phenomena in high-performance cement-based systems, namely early-age cracking, are well known and are regarded as the main limitation to the development of new materials with superior durability. Recent developments in internal curing methods have proven to be a good strategy for mitigating the effects of self-desiccation in these systems, and it is here that the present thesis finds its place. This study focuses essentially on high-performance cement-based systems internally cured by means of superabsorbent particles, with particular emphasis on early-age volume change. A closer analysis of this method reveals some limitations to its applicability, especially in systems modified with silica fume. It is concluded that the physical and chemical nature of superabsorbent polymers can significantly affect the efficiency of internal curing. In addition, the internal curing mechanisms are discussed in greater depth: beyond the mechanisms based on physical and chemical phenomena, significant mechanical effects appear to exist. Several techniques were used in the course of this investigation, with the aim not only of characterizing certain material properties but also of pursuing the questions left open by the international community regarding the mechanisms underlying autogenous phenomena. As an example, hydration studies of the systems are presented in order to assess the problem at a microscopic, rather than macroscopic, scale. A new internal curing technique emerges from this investigation, based on the use of fine aggregates as a vehicle to partially mitigate autogenous shrinkage.
Until now, this technique has had no parallel in previous research, but the extent of internal curing, and the mitigation efficacy, achievable with this concept has some limitations. The significance of this technique in preventing microcracking remains an open question, but it can be concluded that fine aggregates may be beneficial in reducing the effects of localized restraint in the system, thereby reducing the risk of microcracking. The combined use of fine aggregate particles and superabsorbent polymers may yield concrete free of microcracking, or at least exhibiting only nanocracking.