23 results for "Critical coupling parameter"
at Universidade do Minho
Abstract:
This paper presents a search for Higgs bosons decaying to four leptons, either electrons or muons, via one or two light exotic gauge bosons Zd, H→ZZd→4ℓ or H→ZdZd→4ℓ. The search was performed using pp collision data corresponding to an integrated luminosity of about 20 fb⁻¹ at a center-of-mass energy of √s = 8 TeV recorded with the ATLAS detector at the Large Hadron Collider. The observed data are well described by the Standard Model prediction. Upper bounds on the branching ratio of H→ZZd→4ℓ and on the kinetic mixing parameter between the Zd and the Standard Model hypercharge gauge boson are set in the ranges (1–9)×10⁻⁵ and (4–17)×10⁻² respectively, at 95% confidence level, assuming the Standard Model branching ratio of H→ZZ*→4ℓ, for Zd masses between 15 and 55 GeV. Upper bounds on the effective mass mixing parameter between the Z and the Zd are also set using the branching ratio limits in the H→ZZd→4ℓ search, and are in the range (1.5–8.7)×10⁻⁴ for 15
Abstract:
The moisture content of concrete structures has an important influence on their behavior and performance. Several validated numerical approaches adopt the governing equation for relative humidity fields proposed in Model Code 1990/2010. Nevertheless, there is no integrative study addressing the choice of parameters for simulating the humidity diffusion phenomenon, particularly with regard to the parameter ranges put forward by Model Code 1990/2010. Software based on a finite difference method algorithm (1D and axisymmetric cases) is used to perform sensitivity analyses on the main parameters of a normal-strength concrete. Then, based on the conclusions of the sensitivity analyses, experimental results from nine different concrete compositions are analyzed. The software is used to identify the material parameters that best fit the experimental data. In general, the model was able to satisfactorily fit the experimental results, and new correlations are proposed, focusing particularly on the boundary transfer coefficient.
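The kind of 1D finite difference scheme described above can be sketched compactly. The snippet below is a minimal illustration, not the authors' software: it assumes a constant diffusivity D (Model Code 1990/2010 actually makes D a nonlinear function of the relative humidity H), a drying face governed by a boundary transfer coefficient f, a sealed opposite face, and made-up parameter values.

```python
import numpy as np

def relative_humidity_1d(n=51, length=0.2, D=1e-10, f=3e-9,
                         h_init=1.0, h_env=0.5, t_end=3.6e6, dt=600.0):
    """Explicit FD sketch of 1D moisture diffusion in concrete:
    dH/dt = D d2H/dx2, with a boundary transfer (convective) condition
    -D dH/dx = f (H_surf - H_env) at the exposed face (x = 0) and a
    sealed, zero-flux face at x = length.  D is held constant here;
    Model Code 1990/2010 makes D a nonlinear function of H."""
    dx = length / (n - 1)
    assert D * dt / dx**2 <= 0.5, "explicit scheme stability limit"
    H = np.full(n, h_init)
    for _ in range(int(t_end / dt)):
        Hn = H.copy()
        # interior nodes: standard second-order diffusion stencil
        Hn[1:-1] = H[1:-1] + D * dt / dx**2 * (H[2:] - 2*H[1:-1] + H[:-2])
        # exposed face: flux balance between diffusion and surface transfer
        Hn[0] = H[0] + dt / dx * (D * (H[1] - H[0]) / dx - f * (H[0] - h_env))
        # sealed face: zero flux (mirror node)
        Hn[-1] = H[-1] + 2 * D * dt / dx**2 * (H[-2] - H[-1])
        H = Hn
    return H
```

With these illustrative values the humidity profile dries from the exposed face inwards, which is the qualitative behavior a sensitivity analysis on D and f would probe.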
Abstract:
This paper discusses the methodology of a new model for calculating the symmetric punching shear resistance of steel fiber reinforced concrete (SFRC) slabs. The model is based on the critical shear crack theory of Muttoni and co-authors and on the Model Code 2010 proposal for simulating the post-cracking behavior of SFRC. The model's performance is evaluated against a database (DB) of 154 slabs collected from the technical literature. The results are assessed in terms of accuracy, dispersion and level of conservatism through the parameter λ = Vexp/Vteo, where Vexp and Vteo are, respectively, the results obtained from the DB and from the model. Finally, to confirm the model's performance, its results are compared with those of 7 other models from the technical literature, and all are ranked according to the modified Collins criterion, the Demerit Points Classification (DPC).
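The λ = Vexp/Vteo assessment described above is straightforward to reproduce once a database is in hand. The sketch below is purely illustrative: the function name and the choice of summary statistics are assumptions, and it does not implement the Demerit Points Classification itself.

```python
import numpy as np

def performance_stats(v_exp, v_teo):
    """Accuracy/dispersion summary of a resistance model against test
    results, via lam = Vexp / Vteo (values >= 1 mean the model
    prediction is conservative for that slab)."""
    lam = np.asarray(v_exp, float) / np.asarray(v_teo, float)
    return {
        "mean": lam.mean(),                       # accuracy (bias)
        "cov": lam.std(ddof=1) / lam.mean(),      # dispersion
        "conservative_frac": (lam >= 1).mean(),   # level of conservatism
    }
```

A classification scheme such as the DPC would then bin each λ value into penalty classes and sum the demerit points per model.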
Abstract:
Childhood protection is a subject of high value to society, but child abuse cases are difficult to identify, and the path from suspicion to accusation is hard to complete because it requires very strong evidence. Health care services typically deal with these cases from the beginning, where there is evidence based on the diagnosis, but that evidence is not enough to support an accusation. Moreover, the subject is highly sensitive because of the legal aspects involved, such as patient privacy, paternity issues and medical confidentiality, among others. We propose a child abuse critical-knowledge monitoring system model that addresses this problem. This decision support system combines multiple scientific domains: capture of tokens from clinical documents from multiple sources; a topic-model approach to identify the topics of the documents; and knowledge management through ontologies that support the critical-knowledge concepts and relations (symptoms, behaviors and other evidence), in order to match them against the topics inferred from the clinical documents and then alert and log when clinical evidence is present. Based on these alerts, clinical personnel can analyze the situation and take the appropriate action.
Abstract:
Doctoral Thesis in Civil Engineering
Abstract:
A new very high-order finite volume method to solve problems with harmonic and biharmonic operators for one-dimensional geometries is proposed. The main ingredient is polynomial reconstruction based on local interpolations of mean values, providing accurate approximations of the solution up to sixth-order accuracy. First developed for the harmonic operator, an extension for the biharmonic operator is obtained, which allows designing a very high-order finite volume scheme where the solution is obtained by solving a matrix-free problem. An application in elasticity coupling the two operators is presented. We consider a beam subject to a combination of tensile and bending loads, where the main goal is determining the critical stress point for an intramedullary nail.
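The core ingredient, reconstructing a polynomial from cell mean values, can be illustrated in a few lines. This is a generic sketch of the mean-value interpolation idea, not the authors' scheme; the function name and the least-squares solver are assumptions.

```python
import numpy as np

def reconstruct_from_means(x_faces, means, degree=2):
    """Build a polynomial p of the given degree whose averages over the
    cells delimited by x_faces match `means`.  The average of x^k over
    cell [xl, xr] is (xr^(k+1) - xl^(k+1)) / ((k+1) (xr - xl)), which
    gives one linear equation per cell in the coefficients of p."""
    x_faces = np.asarray(x_faces, float)
    h = np.diff(x_faces)
    A = np.column_stack([
        (x_faces[1:]**(k + 1) - x_faces[:-1]**(k + 1)) / ((k + 1) * h)
        for k in range(degree + 1)
    ])
    # exact solve when #cells == degree+1, least squares otherwise
    coef, *_ = np.linalg.lstsq(A, np.asarray(means, float), rcond=None)
    return np.polynomial.Polynomial(coef)
```

For a quadratic function the reconstruction is exact: feeding the cell averages of x² over three cells returns x² back, which is the property that higher-degree stencils exploit to reach very high order.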
Abstract:
A search is performed for Higgs bosons produced in association with top quarks using the diphoton decay mode of the Higgs boson. Selection requirements are optimized separately for leptonic and fully hadronic final states from the top quark decays. The dataset used corresponds to an integrated luminosity of 4.5 fb⁻¹ of proton–proton collisions at a center-of-mass energy of 7 TeV and 20.3 fb⁻¹ at 8 TeV recorded by the ATLAS detector at the CERN Large Hadron Collider. No significant excess over the background prediction is observed and upper limits are set on the tt̄H production cross section. The observed exclusion upper limit at 95% confidence level is 6.7 times the predicted Standard Model cross section value. In addition, limits are set on the strength of the Yukawa coupling between the top quark and the Higgs boson, taking into account the dependence of the tt̄H and tH cross sections as well as the H→γγ branching fraction on the Yukawa coupling. Lower and upper limits at 95% confidence level are set at −1.3 and +8.0 times the Yukawa coupling strength in the Standard Model.
Abstract:
Thermodynamic stability of black holes, described by the Rényi formula as equilibrium compatible entropy function, is investigated. It is shown that within this approach, asymptotically flat, Schwarzschild black holes can be in stable equilibrium with thermal radiation at a fixed temperature. This implies that the canonical ensemble exists just like in anti-de Sitter space, and nonextensive effects can stabilize the black holes in a very similar way as it is done by the gravitational potential of an anti-de Sitter space. Furthermore, it is also shown that a Hawking–Page-like black hole phase transition occurs at a critical temperature which depends on the q-parameter of the Rényi formula.
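For context, work in this line (e.g. by Biró and Czinner) takes as the equilibrium compatible entropy function the Rényi form of the Bekenstein–Hawking entropy. The conventions below are the common ones in the literature and may differ in detail from the paper's; for a Schwarzschild black hole with S_BH = 4πM² in natural units, the resulting Rényi temperature develops a minimum, which is the origin of the Hawking–Page-like behaviour at a λ-dependent critical temperature.

```latex
S_{\mathrm{R}} \;=\; \frac{1}{\lambda}\,\ln\!\left(1+\lambda\,S_{\mathrm{BH}}\right),
\qquad \lambda = 1-q,
\qquad
T_{\mathrm{R}} \;=\; \frac{\partial M}{\partial S_{\mathrm{R}}}
             \;=\; \frac{1}{8\pi M}\left(1+4\pi\lambda M^{2}\right).
```

In the limit λ → 0 (q → 1) the Rényi entropy reduces to S_BH and the usual Hawking temperature 1/(8πM) is recovered.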
Abstract:
Results of a search for H→ττ decays are presented, based on the full set of proton–proton collision data recorded by the ATLAS experiment at the LHC during 2011 and 2012. The data correspond to integrated luminosities of 4.5 fb⁻¹ and 20.3 fb⁻¹ at centre-of-mass energies of √s = 7 TeV and √s = 8 TeV respectively. All combinations of leptonic (τ→ℓνν̄ with ℓ = e, μ) and hadronic (τ→hadrons ν) tau decays are considered. An excess of events over the expected background from other Standard Model processes is found with an observed (expected) significance of 4.5 (3.4) standard deviations. This excess provides evidence for the direct coupling of the recently discovered Higgs boson to fermions. The measured signal strength, normalised to the Standard Model expectation, of μ = 1.43 +0.43/−0.37 is consistent with the predicted Yukawa coupling strength in the Standard Model.
Abstract:
Over the past four decades, the focus, objectives and content of EU cohesion policy have undergone significant changes as a result of successive reforms aimed at adapting it to a Union in constant evolution. In its early stages, cohesion policy had eminently redistributive goals and assumed an explicit spatial dimension. In the late nineties, the prospect of enlargement towards Eastern European countries and the limited willingness of net contributors to increase funding led to a turning point in cohesion policy. The increased importance of economic growth and job creation in the 2000s, within the cohesion policy context, led to a misrepresentation of its essence and motivations: cohesion lost ground to competitiveness, and regional equity to national efficiency. Today, cohesion policy is for many EU countries the main means of mobilising investment in a context of budgetary constraints and credit rationing. In light of the available evidence, it is likely that the overall design and priorities of the current cohesion policy have a limited impact in terms of convergence in many EU regions, especially the less developed ones. This paper's main objectives are to analyse the evolution of European cohesion policy throughout its history, to present a picture of cohesion policy in the 2014-2020 programming period and to discuss the main problems associated with its design, priorities and programming model.
Abstract:
We perform Monte Carlo simulations of the three-dimensional Ising model at the critical temperature and zero magnetic field. We simulate the system in a ball with free boundary conditions on the two-dimensional spherical boundary. Our results for one- and two-point functions in this geometry are consistent with the predictions from the conjectured conformal symmetry of the critical Ising model.
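A toy version of such a simulation fits in a few lines. The sketch below uses single-spin-flip Metropolis updates at the accepted 3D Ising critical coupling β_c ≈ 0.22165, on a small cube with free boundaries rather than the paper's ball geometry; the lattice size, sweep count and cubic domain are simplifying assumptions.

```python
import numpy as np

def metropolis_ising_3d(L=8, beta=0.2216544, sweeps=100, seed=0):
    """Metropolis sampling of the 3D Ising model on an L^3 cube with
    free boundary conditions (no neighbours outside the cube)."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(L, L, L))
    for _ in range(sweeps):
        for _ in range(L**3):
            i, j, k = rng.integers(0, L, size=3)
            # sum over nearest neighbours that exist (free boundaries)
            nb = 0
            for di, dj, dk in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                ii, jj, kk = i + di, j + dj, k + dk
                if 0 <= ii < L and 0 <= jj < L and 0 <= kk < L:
                    nb += s[ii, jj, kk]
            dE = 2 * s[i, j, k] * nb   # energy cost of flipping this spin
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i, j, k] = -s[i, j, k]
    return s
```

One- and two-point functions would then be estimated by averaging spins and spin products over many configurations; a serious study would use cluster algorithms and the actual ball geometry.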
Abstract:
High transverse momentum jets produced in pp collisions at a centre-of-mass energy of 7 TeV are used to measure the transverse energy–energy correlation function and its associated azimuthal asymmetry. The data were recorded with the ATLAS detector at the LHC in the year 2011 and correspond to an integrated luminosity of 158 pb⁻¹. The selection criteria demand the average transverse momentum of the two leading jets in an event to be larger than 250 GeV. The data at detector level are well described by Monte Carlo event generators. They are unfolded to the particle level and compared with theoretical calculations at next-to-leading-order accuracy. The agreement between data and theory is good and provides a precision test of perturbative Quantum Chromodynamics at large momentum transfers. From this comparison, the strong coupling constant evaluated at the Z boson mass is determined to be αs(mZ) = 0.1173 ± 0.0010 (exp.) +0.0065/−0.0026 (theo.).
Abstract:
OBJECTIVE The aim of this study was to compare the performance of the current conventional Pap smear with liquid-based cytology (LBC) preparations. STUDY DESIGN Women routinely undergoing their cytopathological and histopathological examinations at Fundação Oncocentro de São Paulo (FOSP) were recruited for LBC. Conventional smears were analyzed from women from other areas of the State of São Paulo with similar sociodemographic characteristics. RESULTS A total of 218,594 cases were analyzed, consisting of 206,999 conventional smears and 11,595 LBC preparations. Among the conventional smears, 3.0% were of unsatisfactory preparation; conversely, unsatisfactory LBC preparations accounted for 0.3%. The ASC-H (atypical squamous cells - cannot exclude high-grade squamous intraepithelial lesion) frequency did not differ between the two methods. In contrast, ASC-US (atypical squamous cells of undetermined significance) was almost twice as frequent in LBC as in conventional smears, at 2.9 versus 1.6%, respectively. An equal percentage of high-grade squamous intraepithelial lesions was observed for the two methods, but not of low-grade squamous intraepithelial lesions, which were observed significantly more often in LBC preparations than in conventional smears (2.2 vs. 0.7%). The positivity index increased markedly from 3.0% (conventional smears) to 5.7% (LBC). CONCLUSIONS LBC performed better than conventional smears, and we are confident that LBC can improve public health strategies aimed at reducing cervical lesions through prevention programs.
Abstract:
Institutional theory provided the framework supporting the general research question of this investigation: how and why did Management Accounting Standardization (NCG) in Portuguese public hospitals emerge and evolve? The overall objective was to understand in depth the emergence of, and change in, the NCG rules of Portuguese public hospitals over the historical period 1954-2011. Given the institutional framework, which justified an interpretive investigation, an explanatory case study was used as the research method. Evidence on the case of NCG in Portuguese public hospitals was collected from documents and through 58 interviews conducted in 47 units of analysis (the central accounting services of the Ministry of Health and 46 public hospitals, out of a total of 53). As for the main results, in the period 1954-1974 the rules created by the political power to control public spending, together with cash-based budgetary accounting, were at the origin of the first Management Accounting (CG) concepts for the Portuguese public health services. The transition from a dictatorial to a democratic regime (25 April 1974), the creation of the Official Accounting Plan (POC/77) and the implementation of a welfare state with a National Health Service (SNS) created the critical juncture needed for the emergence of an Official Accounting Plan for the Health Services (POCSS/80), which included CG rules. The first edition of the Hospitals' Analytical Accounting Plan (PCAH), approved in 1996, was not built from scratch, but was rather an adaptation to hospitals of the CG rules included in POCSS/91, which had revised POCSS/80.
After the implementation of the PCAH began in 1998, self-reinforcing sequences institutionalized these standards over the period 1998-2011, under the influence of coercive isomorphic pressures that shaped a process of incremental evolution whose outcome was reproduction by adaptation, in a context of resource dependence. In the period 2003-2011, several internal and external agents pushed, through reactive sequences, for the deinstitutionalization of the PCAH in response to the persistent phenomenon of loose coupling. However, the PCAH was only discontinued in hospitals where governance was privatized and the previous information systems were rejected. In terms of extending theory, this case study adopted historical institutionalism in CG research, as far as is known for the first time, and this proves useful in interpreting the processes and outcomes of the creation and evolution of CG institutions in a given historical context. Under the condition of resource dependence, self-reinforcing sequences, via coercive isomorphism, tend towards institutionalization accompanied by the phenomenon of loose coupling. In response to this phenomenon, reactive sequences arise in the direction of deinstitutionalization. In the face of deinstitutionalizing pressures (political, functional, social and technological), private governance accelerates the deinstitutionalization process, whereas public governance prevents or slows it.
Abstract:
Doctoral Thesis in Materials Engineering.