982 results for Direct search


Relevance: 100.00%

Abstract:

In this paper, we search for the regions of the phenomenological minimal supersymmetric standard model (pMSSM) parameter space where one can expect a moderate Higgs mixing angle (α) with relatively light (up to 600 GeV) additional Higgses after satisfying the current LHC data. We perform a global fit analysis using the most up-to-date data (till December 2014) from the LHC and Tevatron experiments. The constraints coming from the precision measurements of the rare b-decays B_s → μ⁺μ⁻ and b → sγ are also considered. We find that the low M_A (≲ 350 GeV) and high tan β (≳ 25) regions are disfavored by the combined effect of the global analysis and flavor data. However, regions with Higgs mixing angle α ≈ 0.1-0.8 are still allowed by the current data. We then study the existing direct search bounds on the heavy scalar/pseudoscalar (H/A) and charged Higgs boson (H±) masses and branchings at the LHC. It has been found that regions with low to moderate values of tan β and light additional Higgses (mass ≤ 600 GeV) are unconstrained by the data, while the regions with tan β > 20 are excluded by the direct search bounds from the LHC-8 data. The possibility to probe the region with tan β ≤ 20 at the high-luminosity run of the LHC is also discussed, giving special attention to the H → hh, H/A → tt̄ and H/A → τ⁺τ⁻ decay modes.

Relevance: 100.00%

Abstract:

Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative for solving constrained nonlinear optimization problems is the filters method. The filters method, introduced by Fletcher and Leyffer in 2002, has been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective one, attempting to minimize the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filters method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work we have developed a new direct search method, based on simplex methods, for general constrained optimization, that combines the features of the simplex method and the filters method. This work presents a new variant of these methods which combines the filters method with other direct search methods, and some alternatives for aggregating the constraint violation functions are proposed.
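As a point of reference for the bi-objective view described in this abstract, the sketch below shows the basic filter acceptance rule of Fletcher and Leyffer: a trial point is kept only if no stored (violation, objective) pair dominates it. This is a generic illustration, not the authors' implementation; the max-based violation aggregate, function names and test problem are assumptions.

```python
# Minimal sketch of the filter idea (Fletcher and Leyffer, 2002).
# The max-based violation aggregate and the API are illustrative assumptions.

def constraint_violation(x, inequality_constraints):
    """Aggregate violation h(x) = max over constraints of max(g_i(x), 0)."""
    return max((max(g(x), 0.0) for g in inequality_constraints), default=0.0)

def dominates(entry_a, entry_b):
    """Pair a = (h_a, f_a) dominates b if it is no worse in both measures."""
    return entry_a[0] <= entry_b[0] and entry_a[1] <= entry_b[1]

def filter_accepts(filter_set, h_new, f_new):
    """Accept a trial point if no filter entry dominates its (h, f) pair."""
    return not any(dominates(entry, (h_new, f_new)) for entry in filter_set)

def add_to_filter(filter_set, h_new, f_new):
    """Insert the new pair and drop entries it dominates."""
    kept = [e for e in filter_set if not dominates((h_new, f_new), e)]
    kept.append((h_new, f_new))
    return kept

# Example: objective f and one inequality constraint g(x) <= 0.
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
gs = [lambda x: x[0] + x[1] - 2.0]

filter_set = []
for trial in ([0.0, 0.0], [1.0, 1.5], [0.9, 1.1]):
    h_val, f_val = constraint_violation(trial, gs), f(trial)
    if filter_accepts(filter_set, h_val, f_val):
        filter_set = add_to_filter(filter_set, h_val, f_val)
print(filter_set)
```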

Relevance: 100.00%

Abstract:

In nonlinear optimization, penalty and barrier methods are normally used to solve constrained problems. There are several penalty/barrier methods and they are used in many areas, from engineering to economics, through biology, chemistry and physics, among others. In these areas, optimization problems often arise in which the involved functions (objective and constraints) are non-smooth and/or their derivatives are unknown. In this work some penalty/barrier functions are tested and compared, using derivative-free methods, namely direct search methods, in the internal process. This work is part of a bigger project involving the development of an Application Programming Interface that implements several optimization methods, to be used in applications that need to solve constrained and/or unconstrained nonlinear optimization problems. Besides its use in applied mathematics research, it is also intended for use in engineering software packages.
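A minimal sketch of the combination this abstract describes: a quadratic penalty for the constraints solved with a derivative-free compass (coordinate) search as the inner method. The penalty form, step-halving rule and test problem are assumptions for illustration, not the interface the abstract refers to.

```python
# Sketch: quadratic penalty for g(x) <= 0 solved with a derivative-free
# compass (coordinate) search. Penalty form and parameters are assumptions.

def compass_search(func, x0, step=0.5, tol=1e-6, max_iter=2000):
    """Simple direct search: poll +/- each coordinate, halve the step on failure."""
    x, n = list(x0), len(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for s in (step, -step):
                trial = x[:]
                trial[i] += s
                if func(trial) < func(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5
            if step < tol:
                break
    return x

def penalized(f, gs, mu):
    """Quadratic penalty: f(x) + mu * sum(max(g_i(x), 0)^2)."""
    return lambda x: f(x) + mu * sum(max(g(x), 0.0) ** 2 for g in gs)

# Example: minimize (x-2)^2 + (y-2)^2 subject to x + y <= 2.
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2
gs = [lambda x: x[0] + x[1] - 2.0]

x = [0.0, 0.0]
for mu in (1.0, 10.0, 100.0, 1000.0):   # increase the penalty weight gradually
    x = compass_search(penalized(f, gs, mu), x)
print(x)   # approaches the constrained minimizer (1, 1)
```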

Relevance: 70.00%

Abstract:

1. Strategic searching for invasive pests presents a formidable challenge for conservation managers. Limited funding can necessitate choosing between surveying many sites cursorily, or focussing intensively on fewer sites. While existing knowledge may help to target more likely sites, e.g. with species distribution models (maps), this knowledge is not flawless and improving it also requires management investment. 2. In a rare example of trading-off action against knowledge gain, we combine search coverage and accuracy, and its future improvement, within a single optimisation framework. More specifically we examine under which circumstances managers should adopt one of two search-and-control strategies (cursory or focussed), and when they should divert funding to improving knowledge, making better predictive maps that benefit future searches. 3. We use a family of Receiver Operating Characteristic curves to reflect the quality of maps that direct search efforts. We demonstrate our framework by linking these to a logistic model of invasive spread, such as that for the red imported fire ant Solenopsis invicta in south-east Queensland, Australia. 4. Cursory widespread searching is only optimal if the pest is already widespread or knowledge is poor; otherwise focussed searching exploiting the map is preferable. For longer management timeframes, eradication is more likely if funds are initially devoted to improving knowledge, even if this results in a short-term explosion of the pest population. 5. Synthesis and applications. By combining trade-offs between knowledge acquisition and utilization, managers can better focus - and justify - their spending to achieve optimal results in invasive control efforts. This framework can improve the efficiency of any ecological management that relies on predicting occurrence. © 2010 The Authors. Journal of Applied Ecology © 2010 British Ecological Society.
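The coupling of a logistic spread model to map-guided searching can be pictured with a toy recursion for the infested fraction of sites, where the map's true-positive rate stands in for a point on an ROC curve. This is an illustrative sketch only; the growth rate, search effort and detection model are assumptions, not the authors' parameterisation.

```python
# Toy illustration: logistic growth of the infested fraction p, reduced by
# control at sites flagged by an imperfect map. The true-positive rate (tpr)
# plays the role of a point on the ROC curve. All parameter values are assumptions.

def simulate(p0=0.01, r=0.8, tpr=0.7, search_effort=0.5, years=15):
    """Return the infested fraction over time under map-guided control."""
    p, history = p0, [p0]
    for _ in range(years):
        p = p + r * p * (1.0 - p)            # logistic spread
        detected = search_effort * tpr * p   # fraction found via the map
        p = max(p - detected, 0.0)           # remove what was found
        history.append(p)
    return history

# Better maps (higher tpr) leave a smaller infestation after the horizon.
for tpr in (0.5, 0.7, 0.9):
    print(tpr, round(simulate(tpr=tpr)[-1], 3))
```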

Relevance: 70.00%

Abstract:

Time-series discord is widely used in data mining applications to characterize anomalous subsequences in time series. Compared to some other discord search algorithms, the direct search algorithm based on the recurrence plot has the advantage of being fast and parameter-free. The direct search algorithm, however, relies on quasi-periodicity in the input time series, an assumption that limits the algorithm's applicability. In this paper, we eliminate the periodicity assumption from the direct search algorithm by proposing a reference function for subsequences and a new sampling strategy based on the reference function. These measures result in a new algorithm with improved efficiency and robustness, as evidenced by our empirical evaluation.
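For context, the standard definition of a top-1 time-series discord - the subsequence whose distance to its nearest non-overlapping neighbour is largest - can be computed by the brute-force baseline sketched below. This is the reference point that faster direct search algorithms improve upon, not the algorithm proposed in the paper; the window length and test signal are assumptions.

```python
import numpy as np

def brute_force_discord(series, m):
    """Return (index, score) of the length-m subsequence whose distance to its
    nearest non-overlapping neighbour is largest (the top-1 discord)."""
    n = len(series) - m + 1
    subs = np.array([series[i:i + m] for i in range(n)])
    best_idx, best_score = -1, -np.inf
    for i in range(n):
        # exclude trivial (overlapping) self-matches
        mask = np.abs(np.arange(n) - i) >= m
        if not mask.any():
            continue
        nearest = np.linalg.norm(subs[mask] - subs[i], axis=1).min()
        if nearest > best_score:
            best_idx, best_score = i, nearest
    return best_idx, best_score

# Example: a sine wave with an injected anomaly around index 300.
t = np.linspace(0, 20 * np.pi, 1000)
x = np.sin(t)
x[300:310] += 1.5
print(brute_force_discord(x, m=32))
```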

Relevance: 70.00%

Abstract:

We report the first direct search for the Kaluza-Klein (KK) modes of Randall-Sundrum gravitons using dielectron, dimuon, and diphoton events observed with the D0 detector operating at the Fermilab Tevatron pp̄ Collider at √s = 1.96 TeV. No evidence for resonant production of gravitons has been found in the data corresponding to an integrated luminosity of approximately 260 pb⁻¹. Lower limits on the mass of the first KK mode at the 95% C.L. have been set between 250 and 785 GeV, depending on its coupling to standard model particles.

Relevance: 70.00%

Abstract:

Long-lived, heavy particles are predicted in a number of models beyond the standard model of particle physics. We present the first direct search for such particles' decays, occurring up to 100 h after their production and not synchronized with an accelerator bunch crossing. We apply the analysis to the gluino (g̃), predicted in split supersymmetry, which after hadronization can become charged and lose enough momentum through ionization to come to rest in dense particle detectors. Approximately 410 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV collected with the D0 detector during Run II of the Fermilab Tevatron collider are analyzed in search of such stopped gluinos decaying into a gluon and a neutralino (χ̃₁⁰). Limits are placed on the (gluino cross section) × (probability to stop) × [BR(g̃ → gχ̃₁⁰)] as a function of the gluino and χ̃₁⁰ masses, for gluino lifetimes from 30 μs to 100 h.

Relevance: 70.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 70.00%

Abstract:

In 2011, a data set of 4.7 inverse femtobarns was recorded at a centre-of-mass energy of 7 TeV with the ATLAS experiment at the Large Hadron Collider. Part of the extensive physics programme of the ATLAS experiment is the search for physics beyond the Standard Model. Supersymmetry - a new symmetry between bosons and fermions - is regarded as the most promising candidate for new physics, and numerous direct and indirect searches for supersymmetry have already been carried out in recent decades. In the following work, a direct search for supersymmetry is performed in final states with jets, missing transverse energy and exactly one electron or muon. The analysed data set of 4.7 inverse femtobarns comprises the full amount of data recorded with the ATLAS experiment at a centre-of-mass energy of 7 TeV. The results of the analysis are combined with several other leptonic search channels in order to maximise the sensitivity to various supersymmetric production and decay modes. The measured data are compatible with the Standard Model expectation, and new exclusion limits in various supersymmetric models are derived.

Relevance: 60.00%

Abstract:

Damage localization induced by strain softening can be predicted by the direct minimization of a global energy function. This article concerns the computational strategy for implementing this principle for softening materials such as concrete. Instead of using heuristic global optimization techniques, our strategies are a hybrid of local optimization methods with a path-finding approach to ensure a global optimum. With admissible nodal displacements being independent variables, it is easy to deal with the geometric (mesh) constraint conditions. The direct search optimization methods recover the localized solutions for a range of softening lattice models which are representative of quasi-brittle structures.
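The principle of localization emerging from direct energy minimization can be illustrated on the smallest possible "lattice": two softening springs in series, with the imposed elongation ramped step by step and each step solved by a derivative-free local minimizer warm-started from the previous solution (a crude stand-in for the path-finding approach). The softening law, the 5% stiffness contrast and the use of Nelder-Mead are assumptions for illustration, not the article's formulation.

```python
import numpy as np
from scipy.optimize import minimize

def spring_energy(strain, scale):
    """Bounded (softening) spring energy: stiff at first, saturating for large strain."""
    return scale * (1.0 - np.exp(-strain ** 2))

def total_energy(x, u):
    """Two softening springs in series: spring 1 stretches by x, spring 2 by u - x."""
    return spring_energy(x, 1.05) + spring_energy(u - x, 1.0)   # spring 2 slightly weaker

x = 0.0                                        # internal degree of freedom (spring-1 strain)
for i, u in enumerate(np.linspace(0.0, 3.0, 31)):   # path-following: ramp the imposed elongation
    res = minimize(lambda z: total_energy(z[0], u), x0=[x], method="Nelder-Mead")
    x = res.x[0]                               # warm-start the next load step
    if i % 15 == 0:                            # report start, middle, end of the ramp
        print(f"u={u:.1f}  spring1={x:.2f}  spring2={u - x:.2f}")
# For large u the strain concentrates in one spring: energy-minimizing localization.
```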

Relevance: 60.00%

Abstract:

In this paper, the load profile and operational goals are used to find the optimal sizing of combined PV-energy storage for a future grid-connected residential building. As part of this approach, five operational goals are introduced and the annual cost for each operational goal has been assessed. Finally, the optimal sizing for combined PV-energy storage has been determined using a direct search method. In addition, the sensitivity of the annual cost to different parameters has been analyzed.
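The sizing step can be pictured as a direct search over candidate PV and storage capacities, evaluating an annual-cost function for each pair. The sketch below uses a coarse grid as the direct search and placeholder hourly profiles, prices and a single representative day; none of these numbers come from the paper.

```python
import numpy as np

# Placeholder hourly profiles for one representative day (kW); assumptions only.
load = np.array([0.4]*7 + [0.8]*2 + [0.5]*7 + [1.2]*5 + [0.6]*3)
pv_per_kw = np.array([0]*6 + [0.1, 0.3, 0.5, 0.7, 0.8, 0.9,
                      0.9, 0.8, 0.6, 0.4, 0.2, 0.1] + [0]*6)

def annual_cost(pv_kw, batt_kwh, price=0.25, pv_capex=120.0, batt_capex=90.0):
    """Annualized capex plus grid-import cost for the representative day x 365 (toy model)."""
    soc, grid = 0.0, 0.0
    for h in range(24):
        net = load[h] - pv_kw * pv_per_kw[h]          # demand left after PV
        if net < 0:                                   # surplus: charge the battery
            soc = min(soc - net, batt_kwh)
        else:                                         # deficit: discharge, then import
            use = min(net, soc)
            soc -= use
            grid += net - use
    return pv_kw * pv_capex + batt_kwh * batt_capex + 365 * grid * price

# Direct search over a coarse grid of candidate sizes.
candidates = [(pv, b) for pv in np.arange(0, 5.5, 0.5) for b in np.arange(0, 11, 1.0)]
best = min(candidates, key=lambda c: annual_cost(*c))
print(best, round(annual_cost(*best), 1))
```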

Relevance: 60.00%

Abstract:

This paper presents an efficient algorithm for optimizing the operation of battery storage in a low voltage distribution network with a high penetration of PV generation. A predictive control solution is presented that uses wavelet neural networks to predict the load and PV generation at hourly intervals for twelve hours into the future. The load and generation forecast, and the previous twelve hours of load and generation history, are used to assemble a load profile. A diurnal charging profile can be compactly represented by a vector of Fourier coefficients, allowing a direct search optimization algorithm to be applied. The optimal profile is updated hourly, allowing the state-of-charge profile to respond to changing load forecasts.
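The compact Fourier parameterisation lends itself to a derivative-free optimizer over a handful of coefficients. In the sketch below, scipy's Nelder-Mead stands in for the direct search, the objective is an assumed toy cost (peak grid import with an energy-neutrality penalty), and the load and PV curves are placeholders, so it illustrates the representation rather than the paper's algorithm.

```python
import numpy as np
from scipy.optimize import minimize

hours = np.arange(24)
load = 1.0 + 0.6 * np.sin((hours - 18) * np.pi / 12)     # placeholder evening-peaked load (kW)
pv = np.clip(np.sin((hours - 6) * np.pi / 12), 0, 0.8)   # placeholder PV output (kW)

def charge_profile(coeffs):
    """Battery power (+charge/-discharge) from a short Fourier series over the day."""
    a0, a1, b1, a2, b2 = coeffs
    w = 2 * np.pi * hours / 24
    return a0 + a1*np.cos(w) + b1*np.sin(w) + a2*np.cos(2*w) + b2*np.sin(2*w)

def cost(coeffs):
    """Toy objective: peak grid import, with a penalty keeping the profile energy-neutral."""
    p_batt = charge_profile(coeffs)
    grid = load - pv + p_batt                 # battery charging adds to grid demand
    return np.max(grid) + 10.0 * abs(p_batt.sum())

result = minimize(cost, x0=np.zeros(5), method="Nelder-Mead",
                  options={"maxiter": 5000, "xatol": 1e-4, "fatol": 1e-6})
print(np.round(result.x, 3), round(result.fun, 3))
```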

Relevance: 60.00%

Abstract:

E. coli does chemotaxis by performing a biased random walk composed of alternating periods of swimming (runs) and reorientations (tumbles). Tumbles are typically modelled as complete directional randomisations but it is known that in wild type E. coli, successive run directions are actually weakly correlated, with a mean directional difference of ∼63°. We recently presented a model of the evolution of chemotactic swimming strategies in bacteria which is able to quantitatively reproduce the emergence of this correlation. The agreement between model and experiments suggests that directional persistence may serve some function, a hypothesis supported by the results of an earlier model. Here we investigate the effect of persistence on chemotactic efficiency, using a spatial Monte Carlo model of bacterial swimming in a gradient, combined with simulations of natural selection based on chemotactic efficiency. A direct search of the parameter space reveals two attractant gradient regimes, (a) a low-gradient regime, in which efficiency is unaffected by directional persistence and (b) a high-gradient regime, in which persistence can improve chemotactic efficiency. The value of the persistence parameter that maximises this effect corresponds very closely with the value observed experimentally. This result is matched by independent simulations of the evolution of directional memory in a population of model bacteria, which also predict the emergence of persistence in high-gradient conditions. The relationship between optimality and persistence in different environments may reflect a universal property of random-walk foraging algorithms, which must strike a compromise between two competing aims: exploration and exploitation. We also present a new graphical way to generally illustrate the evolution of a particular trait in a population, in terms of variations in an evolvable parameter.
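A minimal 2-D run-and-tumble Monte Carlo along the lines described above: runs pointing up the gradient tumble less often, and each tumble draws a turning angle with a tunable mean directional difference (about 63° for wild-type persistence versus a full randomisation). The gradient bias, rates and angle distribution below are illustrative assumptions, not the authors' model, and the sketch makes no claim about which angle wins.

```python
import numpy as np

def chemotaxis_drift(mean_turn_deg=63.0, steps=20000, dt=0.1, seed=0):
    """Return the net displacement up a linear attractant gradient (along +x)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(2)
    theta = rng.uniform(0, 2 * np.pi)
    speed, base_tumble_rate = 20.0, 1.0              # um/s and 1/s (assumed)
    for _ in range(steps):
        # runs pointing up-gradient (+x) tumble less often: a crude chemotactic bias
        up_gradient = np.cos(theta)
        tumble_rate = base_tumble_rate * (1.0 - 0.5 * up_gradient)
        if rng.random() < tumble_rate * dt:
            # turning angle with the chosen mean magnitude and random sign
            turn = np.deg2rad(rng.normal(mean_turn_deg, 20.0)) * rng.choice([-1, 1])
            theta += turn
        x += speed * dt * np.array([np.cos(theta), np.sin(theta)])
    return x[0]

# Compare mean drift for persistent (63 degrees) and more random reorientations.
for angle in (63.0, 90.0, 180.0):
    print(angle, round(np.mean([chemotaxis_drift(angle, seed=s) for s in range(5)]), 1))
```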

Relevance: 60.00%

Abstract:

The role of renewable energy in power systems is becoming more significant due to the increasing cost of fossil fuels and climate change concerns. However, the inclusion of Renewable Energy Generators (REG), such as wind power, has created additional problems for power system operators due to the variability and lower predictability of output of most REGs, with the Economic Dispatch (ED) problem being particularly difficult to resolve. In previous papers we reported on the inclusion of wind power in the ED calculations. The simulation was performed using a system model with wind power as an intermittent source, and the results of the simulation were compared to those of the Direct Search Method (DSM) for similar cases. In this paper we report on our continuing investigations into using Genetic Algorithms (GA) for ED for an independent power system with a significant amount of wind energy in its generator portfolio. The results demonstrate, in line with previous reports in the literature, the effectiveness of GA when measured against a benchmark technique such as DSM.
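A compact genetic-algorithm sketch for the economic dispatch of a few thermal units, with the forecast wind output treated as a reduction of the demand to be met. The quadratic cost coefficients, GA settings and power-balance penalty are assumptions chosen for illustration, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Quadratic cost a + b*P + c*P^2 per unit; limits in MW (placeholder data).
a = np.array([100.0, 120.0, 80.0]); b = np.array([20.0, 18.0, 25.0]); c = np.array([0.05, 0.06, 0.04])
p_min = np.array([20.0, 20.0, 10.0]); p_max = np.array([200.0, 180.0, 150.0])
demand, wind = 400.0, 60.0
residual = demand - wind                           # wind treated as negative load

def fitness(p):
    cost = np.sum(a + b * p + c * p ** 2)
    return cost + 1e3 * abs(np.sum(p) - residual)  # penalize power-balance violation

pop = np.array([rng.uniform(p_min, p_max) for _ in range(60)])
for _ in range(300):
    scores = np.array([fitness(ind) for ind in pop])
    elite = pop[np.argsort(scores)[:20]]           # truncation selection
    children = []
    while len(children) < len(pop) - len(elite):
        mom, dad = elite[rng.integers(20)], elite[rng.integers(20)]
        alpha = rng.random(3)
        child = alpha * mom + (1 - alpha) * dad    # blend crossover
        child += rng.normal(0, 2.0, 3)             # Gaussian mutation
        children.append(np.clip(child, p_min, p_max))
    pop = np.vstack([elite, np.array(children)])

best = pop[np.argmin([fitness(ind) for ind in pop])]
print(np.round(best, 1), round(fitness(best), 1))
```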

Relevance: 60.00%

Abstract:

A new approach to spectroscopy of laser-induced proton beams using radiochromic film (RCF) is presented. This approach allows primary standards of absorbed dose-to-water, as used in radiotherapy, to be transferred to the calibration of GafChromic HD-810 and EBT in a 29 MeV proton beam from the Birmingham cyclotron. These films were then irradiated in a common stack configuration using the TARANIS Nd:Glass multi-terawatt laser at Queen's University Belfast, which can accelerate protons to 10-12 MeV, and a depth-dose curve was measured from a collimated beam. Previous work characterizing the relative effectiveness (RE) of GafChromic film as a function of energy was implemented into Monte Carlo depth-dose curves using FLUKA. A Bragg peak (BP) "library" for proton energies 0-15 MeV was generated, both with and without the RE function. These depth-response curves were iteratively summed in a FORTRAN routine to solve for the measured RCF depth-dose using a simple direct search algorithm. By comparing the resultant spectra with both BP libraries, it was found that including the RE function accounted for an increase in the total number of protons by about 50%. To account for the energy loss due to a 20 μm aluminum filter in front of the film stack, FLUKA was used to create a matrix containing the energy loss transformations for each individual energy bin. Multiplication by the pseudo-inverse of this matrix resulted in "up-shifting" protons to higher energies. Applying this correction to two laser shots gave further increases in the total number of protons, N, of 31% and 56%. Failure to consider the relative response of RCF to lower proton energies, and neglecting energy losses in a stack filter foil, can potentially lead to significant underestimates of the total number of protons in RCF spectroscopy of the low-energy protons produced by laser ablation of thin targets.
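The unfolding step - summing a library of mono-energetic Bragg-peak depth-dose curves with non-negative weights until the stack measurement is reproduced - can be sketched as a coordinate-wise direct search over those weights. The Gaussian toy "Bragg peaks", synthetic measurement and step schedule below are placeholders for the FLUKA-generated library and the real RCF data.

```python
import numpy as np

depths = np.linspace(0, 1.0, 60)                        # arbitrary depth units

def toy_bragg_peak(energy_bin):
    """Placeholder for a FLUKA-generated mono-energetic depth-dose curve."""
    peak_depth = 0.05 + 0.06 * energy_bin               # deeper peak for higher energy
    return np.exp(-((depths - peak_depth) / 0.03) ** 2)

library = np.array([toy_bragg_peak(e) for e in range(15)])   # 15 energy bins

# A synthetic "measured" depth-dose from a known, roughly exponential spectrum.
true_weights = np.exp(-np.arange(15) / 4.0)
measured = true_weights @ library

def residual(weights):
    return np.sum((weights @ library - measured) ** 2)

# Coordinate-wise direct search over non-negative weights.
weights, step = np.zeros(15), 1.0
while step > 1e-4:
    improved = False
    for i in range(15):
        for s in (step, -step):
            trial = weights.copy()
            trial[i] = max(trial[i] + s, 0.0)
            if residual(trial) < residual(weights):
                weights, improved = trial, True
    if not improved:
        step *= 0.5

print(np.round(weights, 2))   # should roughly recover the assumed spectrum
```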