680 results for CERN
Abstract:
BACKGROUND Canine inflammatory bowel disease (IBD) is a chronic enteropathy of unknown etiology, although microbiome dysbiosis, genetic susceptibility, and dietary and/or environmental factors are hypothesized to be involved in its pathogenesis. Since some of the current therapies are associated with severe side effects, novel therapeutic modalities are needed. A new oral supplement for the long-term management of canine IBD, containing chondroitin sulfate (CS) and prebiotics (resistant starch, β-glucans, and mannan-oligosaccharides), was developed to target intestinal inflammation and oxidative stress and to restore normobiosis, without exhibiting any side effects. This double-blinded, randomized, placebo-controlled trial in dogs with IBD aims to evaluate the effects of 180 days of administration of this supplement, together with a hydrolyzed diet, on clinical signs, intestinal histology, gut microbiota, and serum biomarkers of inflammation and oxidative stress. RESULTS Twenty-seven client-owned dogs with biopsy-confirmed IBD were included in the study, switched to the same hydrolyzed diet, and assigned to one of two groups: supplement and placebo. Initially, there were no significant differences between groups (p > 0.05) for any of the studied parameters. Final data analysis (supplement: n = 9; placebo: n = 10) showed a significant decrease in the canine IBD activity index (CIBDAI) score in both groups after treatment (p < 0.001). After treatment, a significant decrease (1.53-fold; p < 0.01) in histologic score was seen only in the supplement group. When the groups were compared, the supplement group showed significantly higher serum cholesterol (p < 0.05) and paraoxonase-1 (PON1) levels after 60 days of treatment (p < 0.01), and the placebo group showed significantly reduced serum total antioxidant capacity (TAC) levels after 120 days (p < 0.05). No significant differences were found between groups at any time point for CIBDAI, WSAVA histologic score, or fecal microbiota evaluated by PCR-restriction fragment length polymorphism (PCR-RFLP). No side effects were reported in either group. CONCLUSIONS The combined administration of the supplement with a hydrolyzed diet over 180 days was safe and induced improvements in selected serum biomarkers, possibly suggesting a reduction in disease activity. This study was likely underpowered; therefore, larger studies are warranted to demonstrate an effect of this supplement, beyond dietary treatment alone, on intestinal histology and CIBDAI.
Abstract:
This research addresses the assessment of learning in creativity at the college level. It is possible to assess learning in creativity by drawing on the foundations of assessment in a competency-based approach, in particular the procedures related to professional judgment. Accordingly, the general objective of this development research is to identify the assessment criteria associated with creativity and to develop a descriptive-scale assessment rubric to help teachers in the target programs of study who have difficulty assessing learning in creativity. To this end, we carried out a development research project within a comprehensive/interpretive paradigm, using a qualitative/interpretive research methodology. The research targeted three programs of study offered at Cégep Marie-Victorin, chosen for the diversity of perspectives they offer: visual arts, special care counselling, and fashion design. Three faculty members, one per target program, were selected to participate in the research. Data were collected and analyzed through two semi-structured individual interviews conducted at distinct stages and an observation of teaching materials. We derived four key findings from the results of our development research which, in our view, can support teachers in the target programs who have difficulty assessing learning in creativity, thereby meeting our general research objective. First, despite the diversity of the chosen fields, many similarities exist among them. The results confirmed our initial hypothesis that it was possible to identify, across different fields, common characteristics and qualities in the interpretation of the concept of creativity. These similarities steered the tools we developed toward a generic form. Second, the comparison of data from the three target programs led to the formulation of eleven criteria that can be applied generically to all of them. The internal validation of the developed tools brought out the two remaining key findings. Regarding the choice of generic assessment criteria and the relevance of having built three global descriptive scales according to the creative product, the creative process, and the creative person/intent, our decision to identify and separate the assessment criteria along these '3 Ps' of assessment was confirmed. The remarks of the three interviewees revealed that this distinction can help demystify the assessment of learning in creativity by fostering a better interpretation of the creative product, the creative process, and the creative person/intent, as well as of the characteristics and qualities used to assess them. Finally, some results from the internal validation stage revealed that the developed tools are fixed neither in form nor in content and can evolve and adapt to the needs of the field in question. In this sense, we consider that the pragmatic aim of the research was achieved: the desire to develop assessment tools that can help teachers in a practical context where learning in creativity must be assessed.
The results also suggested some avenues for future research, notably a) classroom experimentation with the developed tools in an evaluative research or action research project, and b) a development research project to produce a version of the tools adapted to a given situation in a specific field.
Abstract:
This study describes the changes in childhood public policy in Colombia between 1991 and 2014, through an analysis both of the policy narratives and counter-narratives produced during this period and of the factors that fostered the formation of a public policy network and the subsequent development of various modes of interaction among the identified groups of actors. The research begins by situating relevant antecedents and factors that make it possible to contrast, on the one hand, contents and perspectives across different periods and, on the other, the number, type, and dynamics of relations among the various actors involved in this field before 1991 and from that year until 2014. In short, through the findings and analyses presented, the aim is not only to establish what changed between 1991 and 2014 regarding the construction of childhood as a referent of public policy, but also how this transformation came about, in order to provide elements for understanding chiefly the emphases and variations of childhood public policies in the country, as well as certain continuities within more specific periods.
Abstract:
Gang activity is a social phenomenon that has been present in Colombia's main cities for several decades; only in recent years, however, has it begun to attract greater media and institutional attention. It has become a concern for public safety, especially in urban areas. The National Police, together with local and national institutions, has been carrying out targeted work to confront the phenomenon. This study presents a historical review of gang activity in Bogotá and an overview of the continental context. It also describes the programs and strategies implemented by the National Police and public entities from 2004 to 2015. Finally, it evaluates the institutional work focused on the El Codito sector, in the locality of Usaquén.
Abstract:
This case study analyzes the challenges of complying with the Palermo Protocol, with respect to sexual exploitation, within the Colombian institutional system between 2003 and 2014. As a preliminary point, these challenges are political-legal, concerning how international treaties and the obligations derived from them are fulfilled, and who is competent to implement the mandates they contain. There are also institutional and organizational challenges, since Colombia lacks a clear organizational structure and inter-agency coordination on trafficking is scarce. The research therefore takes a multidisciplinary approach, combining elements of both Public International Law and International Relations. To this end, a qualitative analysis is carried out based on data and academic literature on human trafficking in Colombia, in order to understand more precisely the country's current situation with respect to this scourge.
Abstract:
This research sets out to determine the existing applications of chaos theory and complexity theory in the supply chain of the Colombian agro-industrial sector. It also aims to describe the agro-industrial sector and its supply chain, to identify chaos and complexity models, and then to determine which of these are applicable to the sector. Chaos is defined as a sub-discipline of mathematics that studies complex or dynamical systems and carries philosophical implications; complexity, in turn, is the quality acquired by a system composed of multiple interrelated components. In Colombia, several studies have focused on building agro-industrial models, adopting the concept of complexity to describe the attribute of such models that involves harmonizing and integrating different actors, from producers to consumers. This study uses a documentary, monographic approach, with the agro-industrial supply chain as the unit of analysis. The results indicate that chaos and complexity theories are indeed present in the supply chain of the Colombian agro-industrial sector, since it interconnects producers, processors, and marketers, who interact with one another and whose economic behavior varies over time as a function of changes in initial conditions influenced by macroeconomic, environmental, social, and political variables.
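The sensitive dependence on initial conditions invoked here can be illustrated with a minimal sketch that is not taken from the study: the logistic map, a classic chaotic system, where two trajectories starting almost at the same point diverge rapidly.

# Illustrative only: the logistic map, a standard textbook example of
# chaotic dynamics. This sketch is not part of the study; it only
# demonstrates the sensitivity to initial conditions the abstract invokes.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-9  # initial conditions differing by one part in a billion
for n in range(50):
    x, y = logistic(x), logistic(y)
    if n % 10 == 9:
        print(f"step {n + 1}: |x - y| = {abs(x - y):.3e}")
# After a few dozen iterations the gap is of order 1: long-term prediction
# becomes impossible even though the dynamics is fully deterministic.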
Abstract:
This work was produced for institutional purposes of the Universidad del Rosario. Over the second semester of 2016 we came to understand the importance of digital media for companies, and Professor Juan Manuel Méndez assigned us the case of Nairo Quintana's bicycles, for which we had to develop a digital strategy plan consistent with the current context of the bicycle sector. The plan involved creating and developing the brand of the digital agency created by the students, describing the innovative product, setting our objectives in digital media, identifying and defining our target market, building a complete media flowchart with schedules, dates, and the percentage of investment in each medium, and finally identifying the KPIs of our proposed campaign according to the stated objectives.
Abstract:
Non-linear effects are responsible for peculiar phenomena in the dynamics of charged particles in circular accelerators. Recently, they have been used to propose novel beam manipulations in which the transverse beam distribution is modified in a controlled way, to fulfil the constraints posed by new applications. One example is the resonant beam splitting used at CERN for Multi-Turn Extraction (MTE), to transfer proton beams from the PS to the SPS. The theoretical description of these effects relies on the formulation of particle dynamics in terms of Hamiltonian systems and symplectic maps, and on the theory of adiabatic invariance and resonant separatrix crossing. Close to a resonance, new stable regions and new separatrices appear in the phase space. As non-linear effects do not preserve the Courant-Snyder invariant, a particle can cross a separatrix, changing the value of its adiabatic invariant. This process opens the path to new beam manipulations. This thesis deals with various effects that can be used to shape the transverse beam dynamics, using 2D and 4D models of particle motion. We show the possibility of splitting a beam using a resonant external exciter, or of combining its action with MTE-like tune modulation close to a resonance. Non-linear effects can also be used to cool a beam by acting on its transverse distribution. We discuss the case of an annular beam distribution, showing that the emittance can be reduced by modulating the amplitude and frequency of a resonant oscillating dipole. We then consider 4D models in which, close to a resonance, the motion in the two transverse planes is coupled. This is exploited to operate on the transverse emittances with a 2D resonance crossing. Depending on the resonance, the result is an emittance exchange between the two planes, or an emittance sharing. These phenomena are described and understood in terms of adiabatic invariance theory.
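As a concrete example of the symplectic maps mentioned above, a minimal sketch (the conventional accelerator-physics toy model, not necessarily the exact lattice model used in the thesis) is the 2D Hénon map: a linear rotation by the betatron phase advance $\omega$ plus a sextupolar kick,

\[
\begin{pmatrix} x_{n+1} \\ p_{n+1} \end{pmatrix}
=
\begin{pmatrix} \cos\omega & \sin\omega \\ -\sin\omega & \cos\omega \end{pmatrix}
\begin{pmatrix} x_n \\ p_n + x_n^2 \end{pmatrix}.
\]

When $\omega$ is modulated slowly across a resonance, stable islands appear in phase space and particles can be trapped as they cross the separatrix, which is the mechanism exploited in MTE-like beam splitting.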
Abstract:
The Standard Model (SM) of particle physics predicts the existence of a Higgs field responsible for the generation of particle masses. However, some aspects of the theory remain unsolved, suggesting the presence of new physics Beyond the Standard Model (BSM), with the production of new particles at an energy scale higher than the current experimental limits. The search for additional Higgs bosons is indeed predicted by theoretical extensions of the SM, including the Minimal Supersymmetric Standard Model (MSSM). In the MSSM, the Higgs sector consists of two Higgs doublets, resulting in five physical Higgs particles: two charged bosons $H^{\pm}$, two neutral scalars $h$ and $H$, and one pseudoscalar $A$. The work presented in this thesis is dedicated to the search for neutral non-Standard-Model Higgs bosons decaying to two muons in the model-independent MSSM scenario. Proton-proton collision data recorded by the CMS experiment at the CERN LHC at a center-of-mass energy of 13 TeV are used, corresponding to an integrated luminosity of $35.9\ \text{fb}^{-1}$. This search is sensitive to neutral Higgs bosons produced either via the gluon-fusion process or in association with a $\text{b}\bar{\text{b}}$ quark pair. The extensive use of Machine and Deep Learning techniques is a fundamental element in the discrimination between simulated signal and background events. A new network structure called a parameterised Neural Network (pNN) has been implemented, replacing a whole set of single neural networks, each trained at a specific mass hypothesis, with a single neural network able to generalise well and interpolate over the entire mass range considered. The results of the pNN signal/background discrimination are used to set a model-independent 95\% confidence level expected upper limit on the production cross section times branching ratio, for a generic $\phi$ boson decaying into a muon pair in the 130 to 1000 GeV range.
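A parameterised network of this kind can be sketched as follows; the framework, layer sizes, and feature count are illustrative assumptions, not the thesis configuration. The key idea is that the mass hypothesis enters as an extra input feature, so one network covers the whole mass range.

import torch
import torch.nn as nn

class ParameterisedNN(nn.Module):
    """Generic pNN sketch: event features plus the mass hypothesis as an
    extra input, so a single network interpolates across the whole mass
    range instead of training one network per mass point."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features + 1, hidden),  # +1 for the mass parameter
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),  # signal/background probability
        )

    def forward(self, features: torch.Tensor, mass: torch.Tensor) -> torch.Tensor:
        # In training, signal events carry their true mass and background
        # events get masses sampled from the signal mass spectrum; at
        # evaluation time, `mass` is set to the hypothesis being tested.
        return self.net(torch.cat([features, mass], dim=1))

# Usage sketch: score 10 placeholder events under a 300 GeV hypothesis.
model = ParameterisedNN(n_features=8)
x = torch.randn(10, 8)          # placeholder event features
m = torch.full((10, 1), 300.0)  # mass hypothesis in GeV
scores = model(x, m)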
Abstract:
With the CERN LHC program underway, data growth in the High Energy Physics (HEP) field has accelerated, and the use of Machine Learning (ML) in HEP will be critical during the HL-LHC program, when the data produced will reach the exascale. ML techniques have been used successfully in many areas of HEP; nevertheless, developing an ML project and implementing it for production use is a highly time-consuming task that requires specific skills. Complicating this scenario is the fact that HEP data are stored in the ROOT data format, which is mostly unknown outside the HEP community. The work presented in this thesis focuses on the development of an ML as a Service (MLaaS) solution for HEP, aiming to provide a cloud service that allows HEP users to run ML pipelines via HTTP calls. These pipelines are executed using the MLaaS4HEP framework, which allows reading data, processing data, and training ML models directly from ROOT files of arbitrary size, hosted on local or distributed data sources. Such a solution provides HEP users who are not ML experts with a tool for applying ML techniques in their analyses in a streamlined manner. Over the years, the MLaaS4HEP framework has been developed, validated, and tested, and new features have been added. A first MLaaS solution was developed by automating the deployment of a platform equipped with the MLaaS4HEP framework. Then, a service with APIs was developed, so that a user, after being authenticated and authorized, can submit MLaaS4HEP workflows producing trained ML models ready for the inference phase. A working prototype of this service is currently running on a virtual machine of INFN-Cloud and meets the requirements for inclusion in the INFN Cloud portfolio of services.
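A user-side interaction with such a service could look like the following sketch; the service URL, endpoint paths, and payload fields are hypothetical illustrations, not the actual MLaaS4HEP API.

import requests

# Hypothetical endpoints and payload, for illustration only: an
# authenticated user submits a training workflow over HTTP and then
# polls for the status of the resulting job.
SERVICE = "https://mlaas.example.infn.it"  # hypothetical service URL
token = "..."                              # token obtained in the auth step

workflow = {
    "files": ["root://eos.example/user/data.root"],  # hypothetical input
    "labels": "target",
    "model": "keras_sequential",
    "params": {"epochs": 5, "batch_size": 256},
}

resp = requests.post(f"{SERVICE}/submit", json=workflow,
                     headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
job_id = resp.json()["job_id"]  # hypothetical response field

# Poll for completion; the trained model would then be fetched for inference.
status = requests.get(f"{SERVICE}/status/{job_id}",
                      headers={"Authorization": f"Bearer {token}"}).json()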
Diffusive models and chaos indicators for non-linear betatron motion in circular hadron accelerators
Abstract:
Understanding the complex dynamics of beam-halo formation and evolution in circular particle accelerators is crucial for the design of current and future rings, particularly those using superconducting magnets, such as the CERN Large Hadron Collider (LHC), its luminosity upgrade HL-LHC, and the proposed Future Circular Hadron Collider (FCC-hh). A recent diffusive framework, which describes the evolution of the beam distribution by means of a Fokker-Planck equation whose diffusion coefficient is derived from the Nekhoroshev theorem, has been proposed to describe the long-term behaviour of beam dynamics and particle losses. In this thesis, we discuss the theoretical foundations of this framework and propose an original measurement protocol based on collimator scans, with a view to measuring the Nekhoroshev-like diffusion coefficient from beam-loss data. The available LHC collimator-scan data, unfortunately collected without the proposed measurement protocol, have been successfully analysed using the proposed framework. The approach is also applied to datasets from detailed measurements, also performed at the LHC, of the impact of so-called long-range beam-beam compensators on beam losses. Furthermore, dynamic indicators have been studied as a tool for exploring the phase-space properties of realistic accelerator lattices in single-particle tracking simulations. By first examining the performance of known and new indicators in classifying the chaotic character of initial conditions for a modulated Hénon map, and then applying this knowledge to realistic accelerator lattices, we sought to identify a connection between the presence of chaotic regions in phase space and Nekhoroshev-like diffusive behaviour, providing new tools to the accelerator-physics community.
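In this framework, the beam distribution $\rho(I,t)$ in action space evolves according to a Fokker-Planck equation whose diffusion coefficient has a Nekhoroshev-like functional form; the parameterisation below is one common form from the literature, and the precise constants are an assumption rather than necessarily those adopted in the thesis:

\[
\frac{\partial \rho(I,t)}{\partial t}
= \frac{\partial}{\partial I}\left[ D(I)\,\frac{\partial \rho(I,t)}{\partial I} \right],
\qquad
D(I) \propto \exp\!\left[-2\left(\frac{I_*}{I}\right)^{\frac{1}{2\kappa}}\right],
\]

where $I$ is the action variable, $I_*$ sets the extent of the region of stable motion, and $\kappa$ is linked to the analytic structure of the underlying Hamiltonian. Particle losses enter as an absorbing boundary condition at the collimator position, which is what makes collimator scans a natural probe of $D(I)$.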
Abstract:
In the near future, the LHC experiments will continue to be upgraded as the LHC luminosity increases, with the HL-LHC project, from the design $10^{34}$ to $7.5 \times 10^{34}\ \text{cm}^{-2}\text{s}^{-1}$, to reach $3000\ \text{fb}^{-1}$ of accumulated statistics. After the end of a data-taking period, CERN will enter a long shutdown to improve overall performance, upgrading the experiments and implementing more advanced technologies and infrastructures. In particular, ATLAS will upgrade parts of the detector, the trigger, and the data acquisition system. It will also implement new strategies and algorithms for processing and transferring the data to the final storage. This PhD thesis presents a study of a new pattern-recognition algorithm to be used in the trigger system, the software designed to provide the information necessary to select physics events from background data. The idea is to use the well-known Hough Transform as an algorithm for detecting particle trajectories. The effectiveness of the algorithm has already been validated in the past, independently of particle-physics applications, for detecting generic shapes in images. Here, a software emulation tool is proposed for the hardware implementation of the Hough Transform, to reconstruct tracks in the ATLAS Trigger and Data Acquisition system. Until now, it has never been implemented in electronics in particle-physics experiments, and a hardware implementation would provide overall latency benefits. A comparison between the simulated data and the physical system was performed on a Xilinx UltraScale+ FPGA device.
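The core of the technique can be sketched in a few lines; this is a generic Hough transform for straight lines (the toy hits and binning are illustrative, not the ATLAS implementation): each hit votes for every parameter pair consistent with it, and maxima in the accumulator correspond to track candidates.

import numpy as np

# Generic Hough-transform sketch for straight-line track candidates:
# each hit (x, y) votes along the curve r = x*cos(theta) + y*sin(theta)
# in (theta, r) parameter space; accumulator maxima are track candidates.
hits = [(1.0, 2.1), (2.0, 4.0), (3.0, 6.1), (4.0, 7.9)]  # toy hits near y = 2x

n_theta, n_r, r_max = 180, 200, 10.0
thetas = np.linspace(0.0, np.pi, n_theta)
accumulator = np.zeros((n_theta, n_r), dtype=int)

for x, y in hits:
    r = x * np.cos(thetas) + y * np.sin(thetas)  # one vote per theta bin
    r_bins = np.clip(((r + r_max) / (2 * r_max) * n_r).astype(int), 0, n_r - 1)
    accumulator[np.arange(n_theta), r_bins] += 1

i_theta, i_r = np.unravel_index(accumulator.argmax(), accumulator.shape)
print(f"best line: theta = {thetas[i_theta]:.3f} rad, "
      f"r = {(i_r / n_r) * 2 * r_max - r_max:.2f}")

The appeal for hardware is that the voting loop is embarrassingly parallel: each hit updates the accumulator independently, which maps naturally onto FPGA logic.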
Abstract:
The scientific success of the LHC experiments at CERN depends strongly on the availability of computing resources that efficiently store, process, and analyse the amount of data collected every year. This is ensured by the Worldwide LHC Computing Grid infrastructure, which connects computing centres distributed all over the world through high-performance networks. The LHC has an ambitious experimental program for the coming years, which includes large investments and improvements both in the detector hardware and in the software and computing systems, in order to cope with the huge increase in the event rate expected in the High-Luminosity LHC (HL-LHC) phase, and consequently with the huge amount of data that will be produced. In recent years, Artificial Intelligence has taken on a relevant role in the High Energy Physics (HEP) world. Machine Learning (ML) and Deep Learning algorithms have been used successfully in many areas of HEP, such as online and offline reconstruction programs, detector simulation, object reconstruction and identification, and Monte Carlo generation, and they will certainly be crucial in the HL-LHC phase. This thesis contributes to a CMS R&D project on an ML "as a Service" solution for HEP needs (MLaaS4HEP). It consists of a data service able to perform an entire ML pipeline (reading data, processing data, training ML models, serving predictions) in a completely model-agnostic fashion, directly using ROOT files of arbitrary size from local or distributed data sources. The framework has been updated with new features in the data-preprocessing phase, giving the user more flexibility. Since the MLaaS4HEP framework is experiment-agnostic, the ATLAS Higgs Boson ML challenge was chosen as the physics use case to test MLaaS4HEP and the contributions made in this work.
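The "ROOT files of arbitrary size" ingredient can be illustrated with a short sketch using the uproot library (the file, tree, and branch names are hypothetical): reading in fixed-size chunks lets files larger than memory feed an ML pipeline.

import uproot  # pure-Python reader for ROOT files

# Chunked-reading sketch; file, tree, and branch names are hypothetical.
# Iterating in fixed-size steps is what allows a pipeline to process
# ROOT files of arbitrary size without loading them fully into memory.
with uproot.open("data.root") as f:
    tree = f["Events"]  # hypothetical TTree name
    for batch in tree.iterate(["pt", "eta", "phi"], step_size="100 MB",
                              library="np"):
        features = batch["pt"]  # numpy arrays, one chunk at a time
        # ...preprocess the chunk and feed it to the ML training step...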
Abstract:
In the upcoming years, various upgrades and improvements are planned for the CERN Large Hadron Collider (LHC); they represent the mandate of the High-Luminosity project. The upgrade will allow a total stored beam energy of about 700 MJ, which will require, among other things, an extremely efficient collimation system. This will be achieved with the addition of a hollow electron lens (HEL) system to help control beam-halo depletion and mitigate the effects of fast beam losses. In this master's thesis, we present a diffusion model of the HEL for the HL-LHC. In particular, we explore several scenarios for the use of such a device, focusing on the halo-depletion efficiency obtained in different noise regimes.
Abstract:
This thesis studies the γγ → WW process, observed for the first time by the ATLAS experiment in 2017 by colliding two proton beams at a total centre-of-mass energy of 13 TeV. By analysing the event (in a transverse-momentum domain from 0 to 500 GeV), it is possible to measure the value (or find the confidence interval) of certain coefficients of the EFT Lagrangian. This process, which is also allowed by the Standard Model Lagrangian, is in fact thought to probe theories beyond the Standard Model of the quartic coupling among vector bosons, described by dimension-eight operators. The ultimate goal of the analysis is to find the confidence interval within which these coefficients (associated with the dimension-eight operators) are expected to lie, at a 95% confidence level. Such measurements are extremely important for theoretical physics, since they make it possible to test theoretical models and verify that they are compatible with the data. The work of this thesis precedes the measurement itself and consists, using simulations, in applying selection procedures and restricting the domain of the transverse-momentum data distributions, so as to optimize the measurement. It was found that restricting the transverse-momentum domain allows a wider and more accurate range of values of the EFT coefficients to be probed. Specifically, this thesis studied some of these coefficients (the M family), finding the best region for the analysis ([180, 500] GeV) and, for each coefficient, the lower limit at a 95% C.L.