917 results for MODEL SEARCH
Abstract:
This Letter presents the first search for supersymmetry in final states containing one isolated electron or muon, jets, and missing transverse momentum from √s = 7 TeV proton-proton collisions at the LHC. The data were recorded by the ATLAS experiment during 2010 and correspond to a total integrated luminosity of 35 pb⁻¹. No excess above the standard model background expectation is observed. Limits are set on the parameters of the minimal supergravity framework, extending previous limits. Within this framework, for A₀ = 0 GeV, tan β = 3, and μ > 0, and for equal squark and gluino masses, gluino masses below 700 GeV are excluded at 95% confidence level.
Abstract:
A search for diphoton events with large missing transverse energy is presented. The data were collected with the ATLAS detector in proton-proton collisions at √s = 7 TeV at the CERN Large Hadron Collider and correspond to an integrated luminosity of 3.1 pb⁻¹. No excess of such events is observed above the standard model background prediction. In the context of a specific model with one universal extra dimension with compactification radius R and gravity-induced decays, values of 1/R < 729 GeV are excluded at 95% C.L., providing the most sensitive limit on this model to date.
Abstract:
A search for the Higgs boson has been performed in the H → WW(*) → ℓ⁺νℓ⁻ν̄ channel (ℓ = e, μ) with an integrated luminosity of 2.05 fb⁻¹ of pp collisions at √s = 7 TeV collected with the ATLAS detector at the Large Hadron Collider. No significant excess of events over the expected background is observed, and limits on the Higgs boson production cross section are derived for Higgs boson masses starting at 110 GeV.
Abstract:
This Letter presents the results of a search for a heavy particle decaying into an e±μ∓, e±τ∓, or μ±τ∓ final state in pp collisions at √s = 7 TeV. The data were recorded with the ATLAS detector at the LHC during 2011 and correspond to an integrated luminosity of 4.6 fb⁻¹. No significant excess above the Standard Model expectation is observed, and exclusions at 95% confidence level are placed on the cross section times branching ratio for the production of an R-parity-violating supersymmetric tau sneutrino. For a sneutrino mass of 500 (2000) GeV, the observed limits on the production cross section times branching ratio are 3.2 (1.4) fb, 42 (17) fb, and 40 (18) fb for the eμ, eτ, and μτ modes, respectively. These results considerably extend constraints from the Tevatron experiments.
Abstract:
Allelic variants of the human P-glycoprotein-encoding gene MDR1 (ABCB1) have been discussed as being associated with different clinical conditions, including pharmacoresistance of epilepsy. However, conflicting data have been reported regarding the functional relevance of MDR1 allelic variants for the response to antiepileptic drugs. To our knowledge, it is not known whether functionally relevant genetic polymorphisms also occur in the two genes (Mdr1a/Abcb1a, Mdr1b/Abcb1b) coding for P-glycoprotein in the rodent brain. Therefore, we have started to search for polymorphisms in the Mdr1a gene, which governs the expression of P-glycoprotein in brain capillary endothelial cells in rats. In the kindling model of temporal lobe epilepsy, subgroups of phenytoin-sensitive and phenytoin-resistant rats were selected in repeated drug trials. Sequencing of the Mdr1a coding sequence in these subgroups revealed no general differences between drug-resistant and drug-sensitive rats of the Wistar outbred strain. A comparison between different inbred and outbred rat strains also gave no evidence for polymorphisms in the Mdr1a coding sequence. However, four genetic variants were identified in exon-flanking intron sequences by comparison between these rat strains. In conclusion, the finding that Wistar rats vary in their response to phenytoin while having the same genetic background argues against a major impact of Mdr1a genetics on pharmacosensitivity to antiepileptic drugs in the amygdala kindling model.
Abstract:
This paper describes the Model for Outcome Classification in Health Promotion and Prevention adopted by Health Promotion Switzerland (SMOC, Swiss Model for Outcome Classification) and the process of its development. The context and method of model development, and the aim and objectives of the model are outlined. Preliminary experience with application of the model in evaluation planning and situation analysis is reported. On the basis of an extensive literature search, the model is situated within the wider international context of similar efforts to meet the challenge of developing tools to assess systematically the activities of health promotion and prevention.
Abstract:
Multiplication of bacteria within the central nervous system compartment triggers a host response with an overshooting inflammatory reaction which leads to brain parenchyma damage. Some of the inflammatory and neurotoxic mediators involved in the processes leading to neuronal injury during bacterial meningitis have been identified in recent years. As a result, the therapeutic approach to the disease has widened from eradication of the bacterial pathogen with antibiotics to attenuation of the detrimental effects of host defences. Corticosteroids represent an example of the adjuvant therapeutic strategies aimed at downmodulating excessive inflammation in the infected central nervous system. Pathophysiological concepts derived from an experimental rat model of bacterial meningitis revealed possible therapeutic strategies for prevention of brain damage. The insights gained led to the evaluation of new therapeutic modalities such as anticytokine agents, matrix metalloproteinase inhibitors, antioxidants, and antagonists of endothelin and glutamate. Bacterial meningitis is still associated with persistent neurological sequelae in approximately one third of surviving patients. Future research in the model will evaluate whether the neuroprotective agents identified so far have the potential to attenuate learning disabilities as a long-term consequence of bacterial meningitis.
Abstract:
Ventral mesencephalic (VM) precursor cells are of interest in the search for transplantable dopaminergic neurons for cell therapy in Parkinson's disease (PD). In the present study we investigated the survival and functional capacity of in vitro expanded, primary VM precursor cells after intrastriatal grafting to a rat model of PD. Embryonic day 12 rat VM tissue was mechanically dissociated and cultured for 4 or 8 days in vitro (DIV) in the presence of FGF2 (20 ng/ml), FGF8 (20 ng/ml) or without mitogens (control). Cells were thereafter differentiated for 6 DIV by mitogen withdrawal and addition of serum. After differentiation, significantly more tyrosine hydroxylase-immunoreactive (TH-ir), dopamine-producing neurons were found in FGF2- and FGF8-expanded cultures compared to controls. Moreover, expansion for 4 DIV resulted in significantly more TH-ir cells than expansion for 8 DIV for both FGF2- (2.4-fold; P<0.001) and FGF8-treated (3.8-fold; P<0.001) cultures. The functional potential of the expanded cells (4 DIV) was examined after grafting into the striatum of aged 6-hydroxydopamine-lesioned rats. Amphetamine-induced rotations performed 3, 6 and 9 weeks post-grafting revealed that grafts of FGF2-expanded cells induced significantly faster and better functional recovery than grafts of FGF8-expanded cells or control cells (P<0.05 for both). Grafts of FGF2-expanded cells also contained significantly more TH-ir cells than grafts of FGF8-expanded cells (P<0.05) or control cells (P<0.01). In conclusion, FGF2-mediated pre-grafting expansion of primary VM precursor cells considerably improves dopaminergic cell survival and functional restoration in a rat model of PD.
Abstract:
Wind energy has been one of the fastest-growing sectors of the nation's renewable energy portfolio for the past decade, and the same tendency is projected for the upcoming years given aggressive governmental policies for the reduction of fossil fuel dependency. So-called Horizontal Axis Wind Turbine (HAWT) technologies have shown great technological promise and outstanding commercial penetration. Given this broad acceptance, wind turbine sizes have grown exponentially over time. However, safety and economic concerns have emerged as a result of new design tendencies for massive-scale wind turbine structures presenting high slenderness ratios and complex shapes, typically located in remote areas (e.g. offshore wind farms). In this regard, safe operation requires not only first-hand information on actual structural dynamic conditions under aerodynamic action, but also a deep understanding of the environmental factors in which these multibody rotating structures operate. Given the cyclo-stochastic patterns of the wind loading exerting pressure on a HAWT, a probabilistic framework is appropriate to characterize the risk of failure in terms of resistance and serviceability conditions at any given time. Furthermore, sources of uncertainty such as material imperfections, buffeting and flutter, aeroelastic damping, gyroscopic effects, and turbulence, among others, call for a more sophisticated mathematical framework that can properly handle all these sources of indetermination. The modeling complexity that arises from these characterizations demands a data-driven experimental validation methodology to calibrate and corroborate the model. To this end, System Identification (SI) techniques offer a spectrum of well-established numerical methods appropriate for stationary, deterministic, data-driven numerical schemes, capable of predicting actual dynamic states (eigenrealizations) of traditional time-invariant dynamic systems. Consequently, a modified data-driven SI metric based on the so-called Subspace Realization Theory is proposed, adapted for stochastic, non-stationary, and time-varying systems, as is the case for a HAWT's complex aerodynamics. Simultaneously, this investigation explores the characterization of the turbine loading and response envelopes for critical failure modes of the structural components of the wind turbine. In the long run, both the aerodynamic framework (theoretical model) and the system identification scheme (experimental model) will be merged in a numerical engine formulated as a search algorithm for model updating, known as the Adaptive Simulated Annealing (ASA) process. This iterative engine is based on a set of function minimizations computed by a metric called the Modal Assurance Criterion (MAC). In summary, the Thesis is composed of four major parts: (1) development of an analytical aerodynamic framework that predicts interacted wind-structure stochastic loads on wind turbine components; (2) development of a novel tapered-swept-curved Spinning Finite Element (SFE) that includes damped-gyroscopic effects and axial-flexural-torsional coupling; (3) a novel data-driven structural health monitoring (SHM) algorithm via stochastic subspace identification methods; and (4) a numerical search (optimization) engine based on ASA and MAC capable of updating the SFE aerodynamic model.
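The Modal Assurance Criterion driving the ASA model-updating loop is a standard correlation measure between paired mode shapes. A minimal sketch of how such a metric could be computed, using hypothetical mode-shape vectors rather than anything from the thesis itself:

    import numpy as np

    def mac(phi_a, phi_e):
        # Modal Assurance Criterion: normalized squared projection of an
        # analytical mode shape onto an experimental one; 1 = perfect match.
        num = abs(np.vdot(phi_a, phi_e)) ** 2          # np.vdot conjugates phi_a
        den = np.vdot(phi_a, phi_a).real * np.vdot(phi_e, phi_e).real
        return float(num / den)

    # Hypothetical 3-DOF mode shapes: one from the (assumed) SFE model,
    # one from (assumed) stochastic subspace identification of measurements.
    phi_model = np.array([1.0, 0.62, 0.31])
    phi_test = np.array([1.0, 0.58, 0.35])
    print(f"MAC = {mac(phi_model, phi_test):.3f}")  # near 1 -> strong correlation

In a model-updating loop of the kind described, a cost such as 1 - MAC summed over mode pairs would be what the simulated-annealing search minimizes.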
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national lawmakers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test's open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, the web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
With a steady increase of regulatory requirements for business processes, automation support for compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we review and discuss compliance-checking approaches in order to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises the question of their practical applicability. Based on the search results, we propose a roadmap for further research in model-based business process compliance checking.
Abstract:
PURPOSE Segmentation of the proximal femur in digital antero-posterior (AP) pelvic radiographs is required to create a three-dimensional model of the hip joint for use in planning and treatment. However, manually extracting the femoral contour is tedious and prone to subjective bias, while automatic segmentation must accommodate poor image quality, anatomical structure overlap, and femur deformity. A new method was developed for femur segmentation in AP pelvic radiographs. METHODS Using manual annotations on 100 AP pelvic radiographs, a statistical shape model (SSM) and a statistical appearance model (SAM) of the femur contour were constructed. The SSM and SAM were used to segment new AP pelvic radiographs with a three-stage approach. At initialization, the mean SSM model is coarsely registered to the femur in the AP radiograph through a scaled rigid registration. The Mahalanobis distance defined on the SAM is then employed as the search criterion for each candidate landmark location, and dynamic programming is used to eliminate ambiguities. After all landmarks are assigned, a regularized non-rigid registration method deforms the current mean shape of the SSM to produce a new segmentation of the proximal femur. The second and third stages are executed iteratively until convergence. RESULTS A set of 100 clinical AP pelvic radiographs (not used for training) were evaluated. The mean segmentation error was [Formula: see text], requiring [Formula: see text] s per case when implemented in Matlab. The influence of the initialization on segmentation results was tested by six clinicians, demonstrating no significant difference. CONCLUSIONS A fast, robust and accurate method for femur segmentation in digital AP pelvic radiographs was developed by combining SSM and SAM with dynamic programming. This method can be extended to segmentation of other bony structures such as the pelvis.
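The landmark search step pairs each model point with the image position whose appearance profile lies closest, in Mahalanobis distance, to the SAM statistics. A minimal sketch of that scoring, with made-up appearance statistics and candidate profiles standing in for the trained model:

    import numpy as np

    def mahalanobis_sq(g, mean, cov_inv):
        # Squared Mahalanobis distance of an appearance profile g from the
        # landmark's learned appearance distribution (mean, covariance).
        d = g - mean
        return float(d @ cov_inv @ d)

    rng = np.random.default_rng(0)
    mean_profile = rng.normal(size=5)          # hypothetical learned mean profile
    cov_inv = np.linalg.inv(0.5 * np.eye(5))   # hypothetical inverse covariance
    candidates = rng.normal(size=(7, 5))       # profiles sampled at 7 positions

    # Pick the candidate position best explained by the appearance model.
    scores = [mahalanobis_sq(g, mean_profile, cov_inv) for g in candidates]
    best = int(np.argmin(scores))
    print(f"best candidate index: {best}, squared distance: {scores[best]:.2f}")

In the full method, dynamic programming then reconciles these per-landmark choices so that neighbouring landmarks remain mutually consistent along the contour.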
Abstract:
People often use tools to search for information. In order to improve the quality of an information search, it is important to understand how internal information, stored in the user's mind, and external information, represented by the interface of tools, interact with each other. How information is distributed between internal and external representations significantly affects information search performance. However, few studies have examined the relationship between types of interface and types of search task in the context of information search. For a distributed information search task, how data are distributed, represented, and formatted significantly affects user search performance in terms of response time and accuracy. Guided by UFuRT (User, Function, Representation, Task), a human-centered process, I propose a search model and a task taxonomy. The model defines its relationship with other existing information models. The taxonomy clarifies the legitimate operations for each type of search task on relational data. Based on the model and taxonomy, I have also developed interface prototypes for the search tasks of relational data. These prototypes were used for experiments. The experiments described in this study are of a within-subject design with a sample of 24 participants recruited from the graduate schools located in the Texas Medical Center. Participants performed one-dimensional nominal search tasks over nominal, ordinal, and ratio displays, and searched one-dimensional nominal, ordinal, interval, and ratio tasks over table and graph displays. Participants also performed the same task and display combinations for two-dimensional searches. Distributed cognition theory has been adopted as a theoretical framework for analyzing and predicting the search performance of relational data. It has been shown that the representation dimensions and data scales, as well as the search task types, are the main factors in determining search efficiency and effectiveness. In particular, the more external representations are used, the better the search task performance; the results also suggest that ideal search performance occurs when the question type matches the corresponding data scale representation. The implications of the study lie in contributing to the effective design of search interfaces for relational data, especially laboratory results, which are often used in healthcare activities.
Abstract:
A first result of the search for ν_μ → ν_e oscillations in the OPERA experiment, located at the Gran Sasso Underground Laboratory, is presented. The experiment looked for the appearance of ν_e in the CNGS neutrino beam using the data collected in 2008 and 2009. The data are compatible with the non-oscillation hypothesis in the three-flavour mixing model. A further analysis of the same data constrains the non-standard oscillation parameters θ_new and Δm²_new suggested by the LSND and MiniBooNE experiments. For large Δm²_new values (>0.1 eV²), the OPERA 90% C.L. upper limit on sin²(2θ_new) based on a Bayesian statistical method reaches the value 7.2 × 10⁻³.
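For context, the parameters θ_new and Δm²_new constrained above enter through the standard two-flavour appearance probability, written here in its usual textbook form (not quoted from the paper):

\[
P(\nu_\mu \to \nu_e) \simeq \sin^2(2\theta_{\mathrm{new}})\,\sin^2\!\left(\frac{1.27\,\Delta m^2_{\mathrm{new}}\,[\mathrm{eV}^2]\;L\,[\mathrm{km}]}{E\,[\mathrm{GeV}]}\right)
\]

At large Δm²_new the rapid oscillations average out over the beam energy spectrum, which is why the quoted upper limit on sin²(2θ_new) becomes essentially independent of Δm²_new in that regime.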
Abstract:
This Letter describes a model-independent search for the production of new resonances in photon + jet events using 20 fb⁻¹ of proton-proton LHC data recorded with the ATLAS detector at a centre-of-mass energy of √s = 8 TeV. The photon + jet mass distribution is compared to a background model fitted to the data; no significant deviation from the background-only hypothesis is found. Limits are set at 95% credibility level on generic Gaussian-shaped signals and two benchmark phenomena beyond the Standard Model: non-thermal quantum black holes and excited quarks. Non-thermal quantum black holes are excluded below masses of 4.6 TeV and excited quarks are excluded below masses of 3.5 TeV.
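Searches of this kind typically describe the smoothly falling mass spectrum with an empirical fit function and superpose a Gaussian signal shape on it; one form commonly used for such background fits (an illustration of the approach, not a formula quoted from this Letter) is

\[
f(x) = p_1\,(1-x)^{p_2}\,x^{\,p_3 + p_4 \ln x}, \qquad x = m_{\gamma j}/\sqrt{s},
\]

where the p_i are free fit parameters and the generic signal is a Gaussian of chosen mass and width added on top of this background.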