911 results for Case Based Computing
Abstract:
The femicide in Ciudad Juárez is a story of extreme violence against women committed for different reasons, by different actors, under different circumstances, and following different behavioural patterns, all within a frame of gender discrimination based on the idea that women are inferior, interchangeable, and disposable according to the patriarchal hierarchy still present in Mexico, and strongly reinforced by a sort of conspiracy of silence fed by the high impunity rate, the government's incompetence in solving the crimes, and the general indifference of the population. It is the story of hundreds of young women kidnapped, raped, in many cases tortured, and murdered on the border between Mexico and the United States. The murders first came to light in 1993, and to this day young women continue to “disappear” with little hope of bringing the perpetrators to justice, ending impunity, convicting the assassins, and securing justice for the families of the murdered girls and women. The main questions about femicide in Ciudad Juárez seem to be: why were these women so brutally murdered? Why have most of the crimes not been solved yet? Why and how is Ciudad Juárez different from other border cities with the same characteristics? Which powers are behind these crimes in a city that employs mainly women as its labor force and has the lowest unemployment rate in the whole country? But there are also many other questions, dealing more with the context, the Juarenses’ lifestyles, the hidden powers that may lie behind the crimes, the murderers’ possible motives, the response of local civil society, and the international community’s actions to fight femicide there, among many other things, that are still waiting for an answer and that this paper will ‘narrate’ in order to give readers a holistic panorama. But above all there is the need to remember that every single woman or girl assassinated there had a name, an identity, a family, a story to be told time after time and as many times as necessary, so that these crimes are not accepted as mere statistics, cold numbers that might make us forget the human tragedy that has been scourging the city since 1993. We must remember as well that their deaths express gender oppression, the inequality of the relations between what is male and what is female, and a manifestation of domination, terror, social extermination, patriarchal hegemony, social class, and impunity. The city is the perfect mirror in which all the contradictions of globalization are reflected. It is there that all the evils of globalization are present and survive by sucking the blood of the city's women. It is a city where concepts such as gender, migration, and power are closely related, with a negative connotation.
Abstract:
This thesis addresses the formulation of a referee assignment problem for the Italian Volleyball Serie A Championships. The problem has particular constraints: for example, a referee must be assigned to different teams within a given period of time, and minimum and maximum workload levels for each referee are enforced by accounting for costs and profits in the objective function. The problem is solved with an exact method based on an integer linear programming formulation, with a clique-based decomposition used to improve computing time. Extensive computational experiments on real-world instances have been performed to assess the effectiveness of the proposed approach.
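The core of such a model is an assignment ILP with workload bounds. The following is a minimal sketch in Python with PuLP, not the thesis's actual formulation (it omits, e.g., the team-diversity constraints and the clique-based decomposition); the instance data and cost values are invented for illustration.

import pulp

games = ["G1", "G2", "G3", "G4"]
referees = ["R1", "R2"]
cost = {
    ("R1", "G1"): 3, ("R1", "G2"): 1, ("R1", "G3"): 4, ("R1", "G4"): 2,
    ("R2", "G1"): 2, ("R2", "G2"): 3, ("R2", "G3"): 1, ("R2", "G4"): 5,
}
min_load, max_load = 1, 3  # hypothetical per-referee workload bounds

prob = pulp.LpProblem("referee_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts(
    "assign", [(r, g) for r in referees for g in games], cat="Binary")

# Objective: total assignment cost over all referee-game pairs.
prob += pulp.lpSum(cost[r, g] * x[r, g] for r in referees for g in games)

# Every game receives exactly one referee.
for g in games:
    prob += pulp.lpSum(x[r, g] for r in referees) == 1

# Workload bounds for each referee.
for r in referees:
    load = pulp.lpSum(x[r, g] for g in games)
    prob += load >= min_load
    prob += load <= max_load

prob.solve()
print([key for key, var in x.items() if var.value() == 1])

Solved with PuLP's bundled CBC solver, this prints the cost-minimal assignment satisfying both constraint families.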
Abstract:
Background: Clinical trials have demonstrated that selected secondary prevention medications for patients after acute myocardial infarction (AMI) reduce mortality. Yet these medications are generally underprescribed in daily practice, and older people are often absent from drug trials. Objectives: To examine the relationship between adherence to evidence-based (EB) drugs and post-AMI mortality, focusing on the effects of single therapy and polytherapy in very old patients (≥80 years) compared with elderly and adult patients (<80 years). Methods: Patients hospitalised for AMI between 01/01/2008 and 30/06/2011 and resident in the Local Health Authority of Bologna were followed up until 31/12/2011. Medication adherence was calculated as the proportion of days covered (PDC) for filled prescriptions of angiotensin-converting enzyme inhibitors (ACEIs)/angiotensin receptor blockers (ARBs), β-blockers, antiplatelet drugs, and statins. We adopted a risk set sampling method, and the adjusted relationship between medication adherence (PDC≥75%) and mortality was investigated using conditional multiple logistic regression. Results: The study population comprised 4861 patients. During a median follow-up of 2.8 years, 1116 deaths (23.0%) were observed. Adherence to all 4 EB drugs was 7.1%, while nonadherence to any of the drugs was 19.7%. For both patients aged ≥80 years and those aged <80 years, rate ratios of death decreased linearly as the number of EB drugs taken increased. There was a significant inverse relationship between adherence to each of the 4 medications and mortality, although its magnitude was greater for ACEIs/ARBs (adjusted rate ratio=0.60, 95%CI=0.52–0.69) and statins (0.60, 0.50–0.72) and smaller for β-blockers (0.75, 0.61–0.92) and antiplatelet drugs (0.73, 0.63–0.84). Conclusions: The beneficial effect of EB polytherapy on long-term mortality following AMI is evident also in nontrial older populations. Given that adherence to combination therapies is largely suboptimal, implementing strategies and initiatives to increase the use of post-AMI secondary preventive medications in older patients is crucial.
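As an illustration of the adherence measure, the following is a minimal sketch of a proportion-of-days-covered computation in Python; the ≥75% threshold comes from the abstract, while the record layout and the day-counting policy (overlapping fills counted once) are simplifying assumptions.

from datetime import date, timedelta

def pdc(fills, start, end):
    """fills: list of (fill_date, days_supply) tuples for one drug class."""
    covered = set()
    for fill_date, days_supply in fills:
        for i in range(days_supply):
            day = fill_date + timedelta(days=i)
            if start <= day <= end:
                covered.add(day)  # overlapping fills count once
    return len(covered) / ((end - start).days + 1)

fills = [(date(2008, 1, 10), 30), (date(2008, 2, 15), 30)]
adherence = pdc(fills, date(2008, 1, 1), date(2008, 3, 31))
print(f"PDC = {adherence:.2f}, adherent: {adherence >= 0.75}")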
Abstract:
Nowadays, data handling and data analysis in High Energy Physics require a vast amount of computational power and storage. In particular, the world-wide LHC Computing Grid (LCG), an infrastructure and pool of services developed and deployed by a large community of physicists and computer scientists, proved to be a game changer for the efficiency of data analyses during Run-I at the LHC, playing a crucial role in the Higgs boson discovery. Recently, the Cloud computing paradigm has been emerging and reaching a considerable level of adoption by many scientific organizations, and not only scientific ones. Clouds allow users to access and utilize large computing resources they do not own, shared among many scientific communities. Considering the challenging requirements of LHC physics in Run-II and beyond, the LHC computing community is interested in exploring Clouds to see whether they can provide a complementary approach, or even a valid alternative, to the existing Grid-based technological solutions. Within the LHC community, several experiments have been adopting Cloud approaches, and the experience of the CMS experiment is of particular relevance to this thesis. The LHC Run-II has just started, and Cloud-based solutions are already in production for CMS, while other approaches to Cloud usage are being explored at the prototype level, such as the work done in this thesis. This effort is of paramount importance for equipping CMS with the capability to elastically and flexibly access and utilize the computing resources needed to face the challenges of Run-III and Run-IV. The main purpose of this thesis is to present forefront Cloud approaches that allow the CMS experiment to extend onto on-demand resources dynamically allocated as needed; in addition, direct access to Cloud resources is presented as a use case suited to the CMS experiment's needs. Chapter 1 presents an overview of High Energy Physics at the LHC and of the CMS experience in Run-I, as well as the preparation for Run-II. Chapter 2 describes the current CMS Computing Model, and Chapter 3 presents the Cloud approaches pursued and used within the CMS Collaboration. Chapters 4 and 5 discuss the original and forefront work done in this thesis to develop and test working prototypes of elastic extensions of CMS computing resources onto Clouds, and of HEP Computing “as a Service”. The impact of this work on benchmark CMS physics use cases is also demonstrated.
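The elastic-extension idea can be pictured as a controller that sizes a pool of cloud worker nodes against the batch-queue backlog. The sketch below is schematic only: the three helper functions are hypothetical stand-ins, not the actual CMS/WLCG or cloud-provider tooling.

import random
import time

MAX_WORKERS = 100
JOBS_PER_WORKER = 10  # assumed target backlog per worker VM

def pending_jobs() -> int:
    # Hypothetical stand-in: a real controller would query the batch system.
    return random.randint(0, 500)

def provision_worker() -> None:
    # Hypothetical stand-in: request one on-demand VM from the cloud.
    print("provisioning one worker VM")

def retire_worker() -> None:
    # Hypothetical stand-in: drain and release one idle VM.
    print("retiring one worker VM")

def elastic_step(active: int) -> int:
    backlog = pending_jobs()
    wanted = min(MAX_WORKERS, -(-backlog // JOBS_PER_WORKER))  # ceiling division
    for _ in range(wanted - active):
        provision_worker()
    for _ in range(active - wanted):
        retire_worker()
    return wanted

active = 0
for _ in range(3):  # a real controller would loop indefinitely
    active = elastic_step(active)
    time.sleep(1)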
Abstract:
Scientific progress and technological innovation in electronics, computer science, and telecommunications are opening the way to new visions and concepts. The goal of this thesis is to introduce the Cloud computing model as a way to make the current vision of the Internet of Things possible. The first chapter introduces Ubiquitous computing as a new way of looking at computers, clarifying its definition and its origins and providing a brief historical overview. The second chapter presents the vision of the Internet of Things, which draws on concepts and problems already partly considered in Ubiquitous computing. The Internet of Things is a vision in which the Internet is extended to everyday objects: tracking the position of objects, monitoring patients remotely, and collecting environmental data are just a few examples. Wireless technologies must be regarded as necessary for building this kind of application, even though the vision does not assume any specific communication technology; development boards can also ease the prototyping of such applications. The third chapter presents Cloud computing as a business model for using computational resources on demand. The chapter first describes its main characteristics and the various service models, and then discusses the role Cloud services play for the Internet of Things: the Cloud model accelerates the development and deployment of IoT applications by providing storage and computing capacity for the distributed processing of the enormous amount of data produced by sensors and other devices. Finally, the last chapter considers, as a practical example, the integration of Cloud computing technologies into an IoT application. The case study concerns the remote monitoring of vital signs, using a Raspberry Pi and the e-Health platform developed by Cooking Hacks to build an embedded system, and using PubNub as the Cloud service for distributing the data obtained from the sensors. The case study highlights both the advantages and the possible problems that can arise when using Cloud services in IoT applications.
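For the publish side of such a case study, the PubNub Python SDK can push each sensor reading to a channel. A minimal sketch follows; the keys shown are the SDK's public demo placeholders, and the channel name and message fields are invented for illustration (reading the actual sensors through the e-Health shield is not shown).

from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

pnconfig = PNConfiguration()
pnconfig.subscribe_key = "demo"  # placeholder keys
pnconfig.publish_key = "demo"
pnconfig.uuid = "raspberry-pi-ehealth"
pubnub = PubNub(pnconfig)

# One vital-signs sample; field names are illustrative.
reading = {"patient_id": "p-001", "heart_rate_bpm": 72, "spo2_pct": 98}

envelope = pubnub.publish().channel("vital-signs").message(reading).sync()
print("publish failed" if envelope.status.is_error() else "published")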
Abstract:
To date, few risk factors for childhood acute lymphoblastic leukemia (ALL) have been confirmed, and the scientific literature is full of controversial "evidence." We examined whether family characteristics, particularly maternal and paternal age and the number of older siblings, were risk factors for childhood ALL.
Abstract:
In this paper we present a new population-based implant design methodology, which advances state-of-the-art approaches by combining shape and bone quality information in the design strategy. The method enhances the mechanical stability of the fixation and reduces the intra-operative in-plane bending that might impede the functionality of the locking mechanism. The method is presented for the case of mandibular locking fixation plates, where the mandibular angle and the bone quality at the screw locations are taken into account. Using computational anatomy techniques, the method automatically derives, from a set of computed tomography images, the mandibular angle and the bone thickness and intensity values along the path of every screw. An optimisation strategy is then used to optimise the two parameters of plate angle and screw position. Results for the new design are presented along with a comparison against a commercially available mandibular locking fixation plate; a statistically highly significant improvement was observed. Our experiments allowed us to conclude that an angle of 126° and a screw separation of 8 mm is a more suitable design than the standard 120° and 9 mm.
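The two-parameter optimisation can be pictured as a search over plate angle and screw separation against a population-derived fitness. The sketch below is a toy grid search: the fitness function is invented (a smooth score peaking at the reported optimum), whereas the paper derives its objective from CT-based bone thickness and intensity along each screw path.

import math

def fitness(angle_deg: float, separation_mm: float) -> float:
    # Hypothetical smooth score; the paper's objective is built from
    # population statistics of bone quality, not this closed form.
    return (math.exp(-((angle_deg - 126.0) / 6.0) ** 2)
            + math.exp(-((separation_mm - 8.0) / 1.5) ** 2))

best = max(
    ((a, s) for a in range(110, 141) for s in (6.0, 7.0, 8.0, 9.0, 10.0)),
    key=lambda p: fitness(*p),
)
print(f"best plate angle: {best[0]} deg, screw separation: {best[1]} mm")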
Abstract:
This thesis presents two frameworks, a software framework and a hardware core manager framework, which together can be used to develop a processing platform based on a distributed system of field-programmable gate array (FPGA) boards. The software framework gives users the ability to easily develop applications that exploit the processing power of FPGAs, while the hardware core manager framework lets users configure and interact with multiple FPGA boards and/or hardware cores. This thesis describes the design and development of these frameworks and analyzes the performance of a system constructed using them. The performance analysis included measuring the effect of incorporating additional hardware components into the system and comparing the system with a software-only implementation. This work draws conclusions based on the results of the performance analysis and offers suggestions for future work.
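To give a flavour of how such a core manager might be driven from application code, the sketch below defines a small client that configures a board and runs a named hardware core; the class, the JSON-over-TCP protocol, and all names are hypothetical, not the thesis's actual API.

import json
import socket

class CoreManagerClient:
    """Talks to one FPGA board's core manager over TCP (assumed protocol)."""

    def __init__(self, host: str, port: int = 9000):
        self.addr = (host, port)

    def _request(self, payload: dict) -> dict:
        with socket.create_connection(self.addr) as sock:
            sock.sendall(json.dumps(payload).encode() + b"\n")
            return json.loads(sock.makefile().readline())

    def configure(self, bitstream: str) -> dict:
        # Load a bitstream so its hardware cores become available.
        return self._request({"cmd": "configure", "bitstream": bitstream})

    def run_core(self, core: str, data: list) -> dict:
        # Stream input data through a named hardware core.
        return self._request({"cmd": "run", "core": core, "data": data})

# Fan work out across a distributed system of boards (assumes a manager
# process listening on each host).
boards = [CoreManagerClient(h) for h in ("fpga01", "fpga02")]
for b in boards:
    b.configure("fir_filter.bit")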
Abstract:
Detection of arrhythmic atrial beats in surface ECGs can be challenging when they are masked by the R or T wave or do not affect the RR interval. Here, we present a solution using a high-resolution esophageal long-term ECG that offers a detailed view of the atrial electrical activity. The recorded ECG shows atrial ectopic beats with long coupling intervals, which can only be successfully classified using additional morphology criteria. Esophageal high-resolution ECGs provide this information, whereas surface long-term ECGs exhibit poor atrial signal quality. This new method is a promising tool for long-term rhythm monitoring with software-based automatic classification of atrial beats.
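A minimal sketch of morphology-assisted classification as motivated above: correlate each detected atrial deflection against a sinus P-wave template and combine that with the coupling interval. The thresholds and the template source are illustrative assumptions, not the published method.

import numpy as np

def normalized_xcorr(beat: np.ndarray, template: np.ndarray) -> float:
    # Pearson-style similarity of two equal-length windows centered
    # on the atrial deflection.
    beat = (beat - beat.mean()) / (beat.std() + 1e-12)
    template = (template - template.mean()) / (template.std() + 1e-12)
    return float(np.dot(beat, template) / len(template))

def classify_atrial_beat(beat, template, coupling_ms, mean_pp_ms,
                         corr_thresh=0.9, interval_tol=0.15):
    premature = coupling_ms < (1 - interval_tol) * mean_pp_ms
    sinus_like = normalized_xcorr(beat, template) >= corr_thresh
    return "ectopic" if (premature or not sinus_like) else "sinus"

Beats with a long, non-premature coupling interval are thus still flagged when their morphology deviates from the sinus template, which is precisely the case that interval criteria alone miss.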
Abstract:
In Switzerland, there is a shortage of population-based information on heart failure (HF) incidence and case fatality (CF). The aim of this study was to estimate HF event rates and both in-hospital and out-of-hospital CF rates.
Abstract:
Small clusters of gallium oxide, a technologically important high-temperature ceramic, together with the interaction of nucleic acid bases with graphene and a small-diameter carbon nanotube, are the focus of the first-principles calculations in this work. A high-performance parallel computing platform was also developed at Michigan Tech to perform these calculations. The first-principles calculations are based on density functional theory, employing either the local density or a gradient-corrected approximation together with plane-wave and gaussian basis sets.

Bulk Ga2O3 is known to be a very good candidate for fabricating electronic devices that operate at high temperatures. To explore the properties of Ga2O3 at the nanoscale, we performed a systematic theoretical study of small polyatomic gallium oxide clusters. The calculated results show that all lowest-energy isomers of GamOn clusters are dominated by Ga-O bonds over metal-metal or oxygen-oxygen bonds. Analysis of atomic charges suggests that the clusters are highly ionic, similar to bulk Ga2O3. In the study of the sequential oxidation of these clusters starting from Ga2O, it is found that the most stable isomers display up to four different backbones of constituent atoms. Furthermore, the predicted configuration of the ground state of Ga2O was recently confirmed by an experimental result from Neumark's group.

Guided by the results of the gallium oxide cluster calculations, the performance-related challenge of computational simulations, that of producing high-performance computers and platforms, was addressed. Several engineering aspects were thoroughly studied during the design, development, and implementation of the high-performance parallel computing platform, rama, at Michigan Tech. In an attempt to stay true to the principles of the Beowulf revolution, the rama cluster was extensively customized to make it easy to understand and use, for administrators as well as end users. Following the results of benchmark calculations, and to keep up with the complexity of the systems under study, rama was expanded to a total of sixty-four processors.

Interest in the non-covalent interaction of DNA with carbon nanotubes has steadily increased over the past several years. This hybrid system, at the junction of the biological regime and the nanomaterials world, possesses features that make it very attractive for a wide range of applications. Using the in-house computational power available, we studied the details of the interaction of the nucleic acid bases with a graphene sheet as well as with a high-curvature, small-diameter carbon nanotube. The calculated trend in the binding energies strongly suggests that the polarizability of the base molecules determines the interaction strength of the nucleic acid bases with graphene. Comparing the results obtained for physisorption on the small-diameter nanotube with those from the study on graphene, the interaction strength of the nucleic acid bases is smaller for the tube; thus, the effect of introducing curvature is to reduce the binding energy. The binding energies for the two extreme cases of negligible curvature (i.e. a flat graphene sheet) and of very high curvature (i.e. a small-diameter nanotube) may be considered as upper and lower bounds. This finding represents an important step towards a better understanding of the experimentally observed sequence-dependent interaction of DNA with carbon nanotubes.
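The binding-energy comparison above is conventionally quantified as a total-energy difference; the abstract does not state the formula, but a standard definition (a sketch assuming the usual sign convention, in which a larger positive value means stronger binding) is:

\[
E_{\mathrm{b}} \;=\; \bigl( E_{\mathrm{base}} + E_{\mathrm{substrate}} \bigr) \;-\; E_{\mathrm{base+substrate}}
\]

where the substrate is either the graphene sheet or the nanotube. Under this convention, the graphene and small-diameter-nanotube values bracket the physisorption strength from above and below, as stated in the abstract.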