993 results for Direct search


Relevance:

30.00%

Publisher:

Abstract:

There is abundant evidence for large amounts of unseen matter in the universe. This dark matter, by its very nature, couples feebly to ordinary matter and is correspondingly difficult to detect. Nonetheless, several experiments are now underway with the sensitivity required to directly detect galactic halo dark matter through its interactions with matter and radiation. These experiments divide into two broad classes: searches for weakly interacting massive particles (WIMPs) and searches for axions. There exists a very strong theoretical bias for supposing that supersymmetry (SUSY) is a correct description of nature. WIMPs are predicted by SUSY and have the required properties to be dark matter. These WIMPs are detected via the byproducts of their occasional recoils against nucleons. There are efforts around the world to detect these rare recoils. The WIMP part of this overview focuses on the Cryogenic Dark Matter Search (CDMS) underway in California. Axions, another favored dark matter candidate, are predicted to arise from a minimal extension of the standard model that explains the absence of the expected large CP-violating effects in strong interactions. Axions can, in the presence of a large magnetic field, turn into microwave photons. It is the slight excess of photons above noise that signals the axion. Axion searches are underway in California and Japan. The axion part of this overview focuses on the California effort. Brevity does not allow me to discuss other WIMP and axion searches, likewise for accelerator- and satellite-based searches; I apologize for their omission.
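As a rough worked example of why the conversion signal lands in the microwave band (an illustrative back-of-the-envelope figure, not taken from the abstract, assuming an axion mass of a few μeV, the range such haloscope experiments typically probe): the converted photon carries essentially the axion's rest energy, so f = m_a c^2 / h ≈ (4 × 10^-6 eV) / (4.14 × 10^-15 eV·s) ≈ 1 GHz, which is why the experiments look for a faint excess of microwave photons above the cavity noise.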

Relevance:

30.00%

Publisher:

Abstract:

Like other regions of the world, the EU is developing biofuels in the transport sector to reduce oil consumption and mitigate climate change. To promote them, it has adopted favourable legislation since the 2000s. In 2009 it even decided to oblige each Member State to ensure that by 2020 the share of energy from renewable sources reaches at least 10% of its final energy consumption in the transport sector. Biofuels are considered the main instrument to reach that percentage, since the development of other alternatives (such as hydrogen and electricity) will take much longer than expected. Meanwhile, these various legislative initiatives have driven the production and consumption of biofuels in the EU. Biofuels accounted for 4.7% of EU transport fuel consumption in 2011. They have also led to trade and investment in biofuels on a global scale. This large-scale expansion of biofuels has, however, revealed numerous negative impacts. These stem from the fact that first-generation biofuels (i.e., those produced from food crops), of which the most important types are biodiesel and bioethanol, are used almost exclusively to meet the EU’s renewable 10% target in transport. Their negative impacts are socioeconomic (food price rises), legal (land-grabbing), environmental (for instance, water stress and water pollution, soil erosion and reduction of biodiversity), climatic (direct and indirect land-use effects resulting in more greenhouse gas emissions) and fiscal (subsidies and tax relief). The extent of such negative impacts depends on how biofuel feedstocks are produced and processed, the scale of production and, in particular, how they influence direct land-use change (DLUC), indirect land-use change (ILUC) and international trade. These negative impacts have thus provoked mounting debates in recent years, with a particular focus on ILUC. They have forced the EU to re-examine how it deals with biofuels and to submit amendments to update its legislation. So far, EU legislation foresees that only sustainable biofuels (produced in the EU or imported) can be used to meet the 10% target and receive public support; to that end, mandatory sustainability criteria have been defined. Yet these criteria have a major flaw: their measurement of greenhouse gas savings from biofuels does not take into account the emissions resulting from ILUC, which represent a major problem. The Energy Council of June 2014 agreed to set a limit on the extent to which first-generation biofuels can count towards the 10% target, but this limit appears to be less stringent than those previously proposed by the European Commission and the European Parliament. It also agreed to introduce incentives for the use of advanced (second- and third-generation) biofuels, which would be allowed to count double towards the 10% target, but this again appears extremely modest by comparison with what was previously proposed. Finally, the approach chosen to take into account the greenhouse gas emissions due to ILUC appears more than cautious: the Energy Council agreed that the European Commission will report on ILUC emissions using provisional estimated factors, and a review clause will permit the later adjustment of these ILUC factors. Given these legislative orientations by the Energy Council, one cannot yet speak of a major shift in EU biofuels policy.
Bolder changes would probably have meant risking the collapse of the high-emission conventional biodiesel industry, which currently makes up the majority of Europe’s biofuel production. The interests of EU farmers would also have been affected. There is nevertheless a tension between these legislative orientations and the new Commission’s proposals beyond 2020. In any case, many uncertainties remain on this issue. As long as solutions have not been found to minimize the significant collateral damage caused by first-generation biofuels, more scientific studies and caution are needed. Meanwhile, it would be wise to improve alternative paths towards a sustainable transport sector, such as stringent emission and energy standards for all vehicles, better public transport systems, automobiles that run on renewable energy other than biofuels, or other alternatives beyond the present imagination.

Relevance:

30.00%

Publisher:

Abstract:

Included are 88 references on thermionic conversion of heat energy and the use of radioisotopes as power sources. References on thermoelectric conversion are included if the primary energy source is a radioisotope.

Relevance:

30.00%

Publisher:

Abstract:

U.S. Atomic Energy Commission Report No. TID-3561(REV. 4).

Relevance:

30.00%

Publisher:

Abstract:

Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, it was found that both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes which are closely associated with long-term processes, and which fit conceptions of short-term memory as part of language processing rather better than either the encoding-based or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information, whereas speech rate had no direct influence, suggesting that it is ease of lexical access which is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified, and which can take account of the versatility of the human brain.

Relevance:

30.00%

Publisher:

Abstract:

Market mechanisms are a means by which resources in contention can be allocated between contending parties, both in human economies and those populated by software agents. Designing such mechanisms has traditionally been carried out by hand, and more recently by automation. Assessing these mechanisms typically involves evaluating them with respect to multiple conflicting objectives, which can often be nonlinear, noisy and expensive to compute. For typical performance objectives, it is known that designed mechanisms often fall short of being optimal across all objectives simultaneously. However, in all previous automated approaches, either only a single objective is considered, or else the multiple performance objectives are combined into a single objective. In this paper we do not aggregate objectives, instead considering a direct, novel application of multi-objective evolutionary algorithms (MOEAs) to the problem of automated mechanism design. This allows the automatic discovery of trade-offs that such objectives impose on mechanisms. We pose the problem of mechanism design, specifically for the class of linear redistribution mechanisms, as a naturally existing multi-objective optimisation problem. We apply a modified version of NSGA-II in order to design mechanisms within this class, given economically relevant objectives such as welfare and fairness. This application of NSGA-II exposes trade-offs between objectives, revealing relationships between them that were otherwise unknown for this mechanism class. The understanding of the trade-offs gained from applying MOEAs can thus help practitioners make an insightful application of the discovered mechanisms in their respective real or artificial markets.
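As a rough illustration of the kind of multi-objective exploration described above (this is not the authors' modified NSGA-II or their economic model; the payment rule, the two objectives and every name below are assumptions made for the sketch), the following Python snippet scores randomly sampled linear redistribution weight vectors on a toy welfare-loss and unfairness objective and keeps the non-dominated (Pareto) set:

import numpy as np

rng = np.random.default_rng(0)
N_AGENTS = 4

def evaluate(weights, n_samples=2000):
    # Toy model: agents draw i.i.d. values in [0, 1]; the winner pays the
    # second-highest value, and that payment is rebated to the agents in
    # proportion to `weights`. Two objectives to minimise are returned:
    # the mean undistributed surplus (a welfare loss) and the mean spread
    # of rebates across agents (an unfairness proxy).
    values = rng.uniform(size=(n_samples, N_AGENTS))
    payment = np.sort(values, axis=1)[:, -2]           # second-highest bid
    rebates = payment[:, None] * weights[None, :]      # per-agent rebate
    surplus = payment - rebates.sum(axis=1)            # money left undistributed
    return surplus.mean(), rebates.std(axis=1).mean()

def pareto_front(objs):
    # Indices of non-dominated rows of `objs` (both objectives minimised).
    return [i for i, p in enumerate(objs)
            if not any(np.all(q <= p) and np.any(q < p)
                       for j, q in enumerate(objs) if j != i)]

# Random candidate rebate-weight vectors, scaled so they never exceed the budget.
weights = rng.dirichlet(np.ones(N_AGENTS), size=200) * rng.uniform(0.5, 1.0, (200, 1))
objs = np.array([evaluate(w) for w in weights])
for i in pareto_front(objs):
    print(f"surplus={objs[i, 0]:.3f}  unfairness={objs[i, 1]:.3f}")

A full MOEA such as NSGA-II would evolve the weight vectors rather than sample them blindly, but the Pareto filter above is the piece that exposes the trade-off between the two objectives.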

Relevance:

30.00%

Publisher:

Abstract:

The evaluation, from experimental data, of physical quantities that enter the electromagnetic Maxwell equations is known as an inverse optical problem. The functional relations between the dependent and independent variables are of transcendental character, and numerical procedures for evaluating the unknowns are widely used. Here we discuss a direct approach to the solution, illustrated by a specific example: the determination of thin-film optical constants from spectrophotometric data. A new algorithm is proposed for evaluating the parameters, which needs no initial guess of the unknowns and uses no iterative procedures. We thus overcome the intrinsic deficiency of minimization techniques such as gradient search methods, Simplex methods, etc. The price is a need for more computing power, but our algorithm is easily implemented on structures such as grid clusters. We show the advantages of this approach and its potential for generalization to other inverse optical problems.
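Purely as an illustration of a direct, iteration-free parameter search of this kind (the abstract describes its algorithm only qualitatively, so the snippet below is an assumption-laden sketch rather than the authors' method, and its forward model is a deliberately oversimplified transmittance formula with no interference or substrate terms), one can evaluate every node of an (n, k, d) grid against a measured spectrum and keep the best fit, with no initial guess and with each grid chunk farmed out to a separate cluster node:

import numpy as np

def transmittance(n, k, d_nm, wavelength_nm):
    # Oversimplified forward model: two Fresnel interface losses plus
    # Beer-Lambert absorption through the film (no interference, no substrate).
    reflectance = ((n - 1.0) / (n + 1.0)) ** 2
    attenuation = 4.0 * np.pi * k / wavelength_nm
    return (1.0 - reflectance) ** 2 * np.exp(-attenuation * d_nm)

# Stand-in "measured" spectrum, synthesised here from assumed true parameters.
wavelengths = np.linspace(400.0, 800.0, 81)              # nm
T_measured = transmittance(2.1, 0.01, 350.0, wavelengths)

# Direct search: evaluate every node of a coarse (n, k, d) grid; no initial
# guess and no iterative refinement.
n_grid = np.linspace(1.5, 2.5, 51)
k_grid = np.linspace(0.0, 0.05, 26)
d_grid = np.linspace(100.0, 600.0, 51)                   # nm

best, best_err = None, np.inf
for n in n_grid:
    for k in k_grid:
        for d in d_grid:
            err = np.sum((transmittance(n, k, d, wavelengths) - T_measured) ** 2)
            if err < best_err:
                best, best_err = (n, k, d), err

# Note: in this toy model k and d enter only through their product, so the
# grid search pins down n and k*d rather than k and d individually.
print("best (n, k, d):", best, "residual:", best_err)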

Relevance:

30.00%

Publisher:

Abstract:

Searches for the supersymmetric partner of the top quark (stop) are motivated by natural supersymmetry, where the stop has to be light to cancel the large radiative corrections to the Higgs boson mass. This thesis presents three different searches for the stop at √s = 8 TeV and √s = 13 TeV using data from the ATLAS experiment at CERN’s Large Hadron Collider. The thesis also includes a study of the primary vertex reconstruction performance in data and simulation at √s = 7 TeV using tt̄ and Z events. All stop searches presented are carried out in final states with a single lepton, four or more jets and large missing transverse energy. A search for direct stop pair production is conducted with 20.3 fb⁻¹ of data at a center-of-mass energy of √s = 8 TeV. Several stop decay scenarios are considered, including those to a top quark and the lightest neutralino and to a bottom quark and the lightest chargino. The sensitivity of the analysis is also studied in the context of various phenomenological MSSM models in which more complex decay scenarios can be present. Two different analyses are carried out at √s = 13 TeV. The first one is a search for both gluino-mediated and direct stop pair production with 3.2 fb⁻¹ of data, while the second one is a search for direct stop pair production with 13.2 fb⁻¹ of data in the decay scenario to a bottom quark and the lightest chargino. The results of the analyses show no significant excess over the Standard Model predictions in the observed data. Consequently, exclusion limits are set at 95% CL on the masses of the stop and the lightest neutralino.

Relevance:

30.00%

Publisher:

Abstract:

Individuals living in highly networked societies publish a large amount of personal, and potentially sensitive, information online. Web investigators can exploit such information for a variety of purposes, such as background vetting and fraud detection. However, such investigations require many expensive man-hours of human effort. This paper describes InfoScout, a search tool which is intended to reduce the time it takes to identify and gather subject-centric information on the Web. InfoScout collects relevance feedback from the investigator in order to re-rank search results, allowing the intended information to be discovered more quickly. Users may still direct their search as they see fit, issuing ad-hoc queries and filtering existing results by keywords. Design choices are informed by prior work and industry collaboration.
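As a sketch of how relevance-feedback re-ranking of this kind can work (a generic Rocchio-style example in Python with scikit-learn, not InfoScout's actual code; the sample results, query and weights are all assumptions), the query vector is pulled towards the results the investigator marks as relevant and every result is then re-scored against the adjusted query:

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rerank(results, query, relevant_idx, alpha=1.0, beta=0.75):
    # Rocchio-style feedback: move the query vector towards the documents the
    # investigator marked as relevant, then re-score all results against it.
    mats = TfidfVectorizer().fit_transform(results + [query]).toarray()
    docs, q = mats[:-1], mats[-1]
    if relevant_idx:
        q = alpha * q + beta * docs[relevant_idx].mean(axis=0)
    scores = cosine_similarity(q.reshape(1, -1), docs).ravel()
    return np.argsort(-scores)               # indices, best match first

results = [
    "John Smith profile page, software engineer based in Leeds",
    "John Smith obituary, 1932, Boston",
    "J. Smith, engineer, conference talk on fraud detection",
]
order = rerank(results, "john smith engineer", relevant_idx=[0])
print([results[i] for i in order])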

Relevance:

20.00%

Publisher:

Abstract:

The raft hypothesis proposes that microdomains enriched in sphingolipids, cholesterol, and specific proteins are transiently formed to accomplish important cellular tasks. Equivocally, detergent-resistant membranes were initially assumed to be identical to membrane rafts, because of similarities between their compositions. In fact, the impact of detergents on membrane organization is still controversial. Here, we use phase contrast and fluorescence microscopy to observe giant unilamellar vesicles (GUVs) made of erythrocyte membrane lipids (erythro-GUVs) when exposed to the detergent Triton X-100 (TX-100). We clearly show that TX-100 has a restructuring action on biomembranes. Contact with TX-100 readily induces domain formation on the previously homogeneous membrane of erythro-GUVs at physiological and room temperatures. The shape and dynamics of the formed domains point to liquid-ordered/liquid-disordered (Lo/Ld) phase separation, typically found in raft-like ternary lipid mixtures. The Ld domains are then separated from the original vesicle and completely solubilized by TX-100. The remaining insoluble vesicle, in the Lo phase, represents around 2/3 of the original vesicle surface at room temperature and decreases to almost 1/2 at physiological temperature. This chain of events could be entirely reproduced with biomimetic GUVs of a simple ternary lipid mixture, 2:1:2 POPC/SM/chol (phosphatidylcholine/sphingomyelin/cholesterol), showing that this behavior arises from fundamental physicochemical properties of simple lipid mixtures. This work provides direct visualization of TX-100-induced domain formation followed by selective (Ld phase) solubilization in a model system with a complex biological lipid composition.

Relevance:

20.00%

Publisher:

Abstract:

We report on the shape resonance spectra of phenol-water clusters, as obtained from elastic electron scattering calculations. Our results, along with virtual orbital analysis, indicate that the well-known indirect mechanism for hydrogen elimination in the gas phase is significantly affected by microsolvation, due to the competition between vibronic couplings on the solute and solvent molecules. This fact suggests how relevant solvation effects could be for the electron-driven damage of biomolecules and for biomass delignification [E. M. de Oliveira et al., Phys. Rev. A 86, 020701(R) (2012)]. We also discuss microsolvation signatures in the differential cross sections that could help to identify the solvated complexes and assess the composition of gaseous admixtures of these species.

Relevance:

20.00%

Publisher:

Abstract:

Excessive occlusal surface wear can result in occlusal disharmony and in functional and esthetic impairment. As a therapeutic approach, conventional single crowns have been proposed, but this kind of treatment is complex, highly invasive and expensive. This case report describes the clinical outcomes of an alternative, minimally invasive treatment based on direct adhesive pin-retained restorations. A 64-year-old woman with severely worn dentition, eating problems related to missing teeth and generalized tooth hypersensitivity was referred for treatment. Treatment planning based on a diagnostic wax-up simulation was used to guide the reconstruction of the maxillary anterior teeth with direct composite resin over self-threading dentin pins. As the remaining mandibular teeth were extremely worn, a tooth-supported overdenture was installed. A stabilization splint was also used to protect the restorations. This treatment was a less expensive alternative to full-mouth rehabilitation, with positive esthetic and functional outcomes after 1.5 years of follow-up.

Relevance:

20.00%

Publisher:

Abstract:

Using a desorption/ionization technique, easy ambient sonic-spray ionization coupled to mass spectrometry (EASI-MS), documents related to the 2nd generation of Brazilian Real currency (R$) were screened in the positive ion mode for authenticity, based on chemical profiles obtained directly from the banknote surface. Characteristic profiles were observed for authentic banknotes, seized suspect counterfeit banknotes and homemade counterfeits from inkjet and laserjet printers. The chemicals on the surface of the authentic banknotes were detected via a few minor sets of ions, namely from the plasticizers bis(2-ethylhexyl)phthalate (DEHP) and dibutyl phthalate (DBP), most likely related to the official offset printing process, and other common quaternary ammonium cations, presenting a chemical profile similar to that of 1st-generation R$. The seized suspect counterfeit banknotes, however, displayed abundant diagnostic ions in the m/z 400-800 range due to the presence of oligomers. High-accuracy FT-ICR MS analysis enabled molecular formula assignment for each ion. The ions were separated by 44 m/z, which enabled their characterization as Surfynol® 4XX (S4XX, XX = 40, 65, and 85), wherein increasing XX values indicate increasing amounts of ethoxylation on a backbone of 2,4,7,9-tetramethyl-5-decyne-4,7-diol (Surfynol® 104). Sodiated triethylene glycol monobutyl ether (TBG) of m/z 229 (C10H22O4Na) was also identified in the seized counterfeit banknotes via EASI(+) FT-ICR MS. Surfynol® and TBG are constituents of inks used for inkjet printing.

Relevance:

20.00%

Publisher:

Abstract:

X-ray fluorescence (XRF) is a fast, low-cost, nondestructive, and truly multielement analytical technique. The objectives of this study are to quantify the amount of Na⁺ and K⁺ in samples of table salt (refined, marine, and light) and to compare three different methodologies of quantification using XRF. A fundamental parameter method revealed difficulties in accurately quantifying lighter elements (Z < 22). A univariate methodology based on peak-area calibration is an attractive alternative, even though additional steps of data manipulation might consume some time. Quantifications were performed with good correlations for both Na (r = 0.974) and K (r = 0.992). A partial least-squares (PLS) regression method with five latent variables was very fast. Na⁺ quantifications provided calibration errors lower than 16% and a correlation of 0.995. Of great concern was the observation of high Na⁺ levels in low-sodium salts. The presented application may be performed in a fast and multielement fashion, in accordance with Green Chemistry specifications.
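For context on what a five-latent-variable PLS calibration looks like in practice, the Python/scikit-learn sketch below fits PLS to synthetic stand-in spectra; the channel count, peak position, noise level and concentration range are arbitrary assumptions, not the study's data or instrument settings:

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic stand-in calibration set: each row is an XRF spectrum (counts per
# channel) of a salt standard with known Na content in mg/g.
n_standards, n_channels = 20, 512
na_mg_per_g = rng.uniform(50, 400, size=n_standards)
channel = np.arange(n_channels)
na_peak = np.exp(-0.5 * ((channel - 120) / 6.0) ** 2)   # assumed Na K-alpha region
spectra = na_mg_per_g[:, None] * na_peak[None, :] + rng.normal(0, 2.0, (n_standards, n_channels))

# PLS calibration with five latent variables, then prediction for an "unknown".
pls = PLSRegression(n_components=5)
pls.fit(spectra, na_mg_per_g)

unknown = 250.0 * na_peak + rng.normal(0, 2.0, n_channels)
predicted = float(pls.predict(unknown.reshape(1, -1)).ravel()[0])
print(f"predicted Na content: {predicted:.1f} mg/g")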

Relevance:

20.00%

Publisher:

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física