Abstract:
In this thesis the performance of the Drift Tubes Local Trigger system of the CMS detector is studied. CMS is one of the general-purpose experiments that will operate at the Large Hadron Collider at CERN. Results from data collected during the Cosmic Run At Four Tesla (CRAFT) commissioning exercise, a globally coordinated run period in which the full experiment was involved and configured to detect cosmic rays crossing the CMS cavern, are presented. These include analyses of the precision and accuracy of the trigger reconstruction mechanism and a measurement of the trigger efficiency. A method to synchronize the system is also described, together with a comparison of the output of the trigger electronics with that of its software emulator.
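As an illustration of the kind of comparison described above, the sketch below (not taken from the thesis; the event structure and field names are hypothetical) computes a trigger efficiency with a simple binomial uncertainty and the fraction of events in which the hardware trigger and the software emulator agree.

```python
import math

def trigger_efficiency(n_triggered, n_total):
    """Efficiency eps = k/N with a simple binomial uncertainty sqrt(eps*(1-eps)/N)."""
    eps = n_triggered / n_total
    sigma = math.sqrt(eps * (1.0 - eps) / n_total)
    return eps, sigma

def emulator_agreement(events):
    """Fraction of events where the hardware decision matches the emulator decision.

    `events` is a list of dicts with hypothetical keys 'hw_trigger' and 'emu_trigger'.
    """
    matches = sum(1 for e in events if e["hw_trigger"] == e["emu_trigger"])
    return matches / len(events)

# toy usage with made-up numbers
eff, err = trigger_efficiency(9420, 10000)
print(f"efficiency = {eff:.3f} +/- {err:.3f}")

events = [{"hw_trigger": True, "emu_trigger": True},
          {"hw_trigger": True, "emu_trigger": False},
          {"hw_trigger": False, "emu_trigger": False}]
print(f"hardware/emulator agreement = {emulator_agreement(events):.2f}")
```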
Abstract:
The quench characteristics of second-generation (2G) YBCO Coated Conductor (CC) tapes are of fundamental importance for the design and safe operation of superconducting cables and magnets based on this material. Their ability to transport high current densities at high temperature, up to 77 K, and at very high fields, over 20 T, together with the increasing knowledge of their manufacturing, which is reducing their cost, is pushing the use of this innovative material in numerous applications, from high-field magnets for research to motors and generators, as well as cables. The aim of this Ph.D. thesis is the experimental analysis and numerical simulation of quench in superconducting HTS tapes and coils. A measurement facility for the characterization of superconducting tapes and coils was designed, assembled and tested. The facility consists of a cryostat, a cryocooler, a vacuum system, resistive and superconducting current leads and signal feedthroughs. Moreover, the data acquisition system and the software for critical current and quench measurements were developed. A 2D model was developed using the finite element code COMSOL Multiphysics®. The problem of modelling the high aspect ratio of the tape is tackled by multiplying the tape thickness by a constant factor and compensating the heat and electrical balance equations by introducing a material anisotropy. The model was then validated against the results of a 1D quench model based on a non-linear electric circuit coupled to a thermal model of the tape, against measurements from the literature, and against critical current and quench measurements made in the cryogenic facility. Finally, the model was extended to the study of coils and windings by defining homogenized properties for the tape and the stack. The procedure allows the definition of a multi-scale hierarchical model, able to simulate the windings with different degrees of detail.
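The thickness-rescaling trick mentioned above can be made concrete with a simple argument (an illustrative derivation under the assumption of a uniform expansion factor, not necessarily the exact compensation used in the thesis): if the tape thickness $d$ is expanded to $f\,d$, the material properties can be rescaled anisotropically so that the through-thickness and in-plane balances are preserved,

$$
d \rightarrow f\,d, \qquad
k_\perp \rightarrow f\,k_\perp, \quad k_\parallel \rightarrow \frac{k_\parallel}{f}, \qquad
\sigma_\perp \rightarrow f\,\sigma_\perp, \quad \sigma_\parallel \rightarrow \frac{\sigma_\parallel}{f}, \qquad
\rho C_p \rightarrow \frac{\rho C_p}{f}, \quad q \rightarrow \frac{q}{f},
$$

so that the through-thickness thermal and electrical resistances, the in-plane conductances, the total heat capacity and the total dissipated power of the expanded tape match those of the physical tape.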
Abstract:
This PhD thesis comprises five main parts on diverse topics. The first two parts deal with the trophic ecology of wolves in Italy following a recent increase in the abundance of wild ungulates. Data on wolf diet across time highlighted how important wild ungulates are as a food resource for wolves in Italy. A growing wolf population, increasing numbers of wild ungulates and decreasing livestock consumption may mitigate wolf-human conflicts in Italy in the near future. In the third part, non-invasive genetic sampling techniques were used to obtain genotypes and genders of about 400 wolves. Wolf packs were then genetically reconstructed using several population genetic and parentage software packages. By combining the results on pack structure and genetic relatedness with sampling locations, home ranges of wolf packs and dispersal patterns were identified. These results, which are particularly important for the conservation management of wolves in Italy, illustrate the detailed information that can be retrieved from genetic identification of individuals. In the fourth part, wolf locations were combined with environmental information obtained as GIS layers. Modern species distribution models (niche models) were applied to infer potential wolf distribution and predation risk. From the resulting distribution maps, information on the pastures with the highest risk of depredation was derived. This is particularly relevant as it allows the identification of areas where livestock is in danger of carnivore attack. Finally, in the fifth part, habitat suitability models were combined with landscape genetic analysis. On one side, landscape genetic analyses of the Italian wolves provided new information on the dynamics and connectivity of the population; on the other side, a thorough analysis of the effects that habitat suitability methods have on the parameterization of landscape genetic analyses was carried out, contributing significantly to landscape genetic theory.
Abstract:
Digital evidence requires the same precautions as any other scientific investigation. An overview is given of the methodological and practical aspects of digital forensics in light of the recent ISO/IEC 27037:2012 standard on the handling of digital evidence in the identification, collection, acquisition and preservation phases. These methodologies strictly comply with the integrity and authenticity requirements set by the rules on digital forensics, in particular Law 48/2008 ratifying the Budapest Convention on Cybercrime. Regarding the offence of child pornography, a review of European and national legislation is provided, emphasizing the aspects relevant to forensic analysis. Since file sharing on peer-to-peer networks is the channel on which the exchange of illicit material is most concentrated, an overview of the most widespread protocols and systems is given, with emphasis on the eDonkey network and the eMule software, which are widely used among Italian users. The problems encountered by law enforcement in investigating and repressing the phenomenon are outlined, before focusing on the main contribution on the forensic analysis of computer systems seized from suspects (or defendants) in child pornography cases: the design and implementation of eMuleForensic, which makes it possible to analyse, quickly and with great precision, the events that occur while the eMule file sharing software is used; the software is available both online at the URL http://www.emuleforensic.com and as a tool within the DEFT forensic distribution. Finally, a proposal for an operational protocol for the forensic analysis of computer systems involved in child pornography investigations is provided.
Abstract:
Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards for how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists. This requires the complete and open exchange of data, procedures and materials. The idea of "replication by other scientists" in reference to computations is more commonly known as "reproducible research". In this context, the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of letting scientists submit, together with the article, the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform independently of the chosen OS, to confirm or invalidate it, and above all to allow its reuse to produce new results. This procedure is of little help, however, without minimal methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or production. This led us to conclude that, in addition to the data sets and the software, one further element must be provided: the workflow that ties all of them together.
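As a minimal illustration of such a workflow element (a sketch only; file names, steps and the manifest layout are hypothetical and not the journal's actual submission format), the script below runs the analysis steps in order and records a small provenance manifest tying data, code and outputs together.

```python
import hashlib
import json
import subprocess
from pathlib import Path

# hypothetical pipeline: each step is "script, input, output"
STEPS = [
    ["python", "clean_data.py", "raw_data.csv", "clean_data.csv"],
    ["python", "run_model.py", "clean_data.csv", "results.json"],
    ["python", "make_figures.py", "results.json", "figures.pdf"],
]

def sha256(path):
    """Content hash used to pin each artifact in the manifest."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def run_workflow():
    manifest = {"steps": []}
    for cmd in STEPS:
        subprocess.run(cmd, check=True)  # fail loudly if a step breaks
        manifest["steps"].append({
            "command": cmd,
            "inputs": {cmd[2]: sha256(cmd[2])} if Path(cmd[2]).is_file() else {},
            "outputs": {cmd[3]: sha256(cmd[3])} if Path(cmd[3]).is_file() else {},
        })
    Path("manifest.json").write_text(json.dumps(manifest, indent=2))

if __name__ == "__main__":
    run_workflow()
```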
Abstract:
The European LeukemiaNet (ELN) Workpackage 10 (WP10) was designed to deal with diagnostic matters using morphology and immunophenotyping. This group aimed at establishing a consensus on the reagents required for the proper immunophenotyping of acute leukemia and lymphoproliferative disorders. Animated discussions within WP10, together with the application of the Delphi method of circulating proposals, quickly led to consensus immunophenotyping panels for these disorders being made available on the ELN website. In this report, we provide a comprehensive description of these panels, both mandatory and complementary, for both types of clinical conditions. The rationale for using each marker, supported by relevant information from the literature, is provided in detail. With the constant development of immunophenotyping techniques in flow cytometry and the related software, this work aims at providing useful guidelines to perform the most pertinent exploration at diagnosis and during follow-up, with the best cost-benefit ratio in diseases whose treatment has a strong impact on health systems.
Abstract:
For crime scene investigation in cases of homicide, the pattern of bloodstains at the incident site is of critical importance. The morphology of the bloodstain pattern serves to determine the approximate blood source locations, the minimum number of blows and the positioning of the victim. In the present work, the benefits of three-dimensional bloodstain pattern analysis, including the ballistic approximation of the trajectories of the blood drops, are demonstrated using two illustrative cases. The crime scenes were documented in 3D using the non-contact methods of digital photogrammetry, tachymetry and laser scanning. Accurate, true-to-scale 3D models of the crime scenes, including the bloodstain patterns and the traces, were created. To determine the areas of origin of the bloodstain patterns, the trajectories of up to 200 well-defined bloodstains were analysed in CAD and photogrammetry software. The ballistic determination of the trajectories was performed using ballistics software. The advantages of this method are the short preparation time on site, the non-contact measurement of the bloodstains and the high accuracy of the bloodstain analysis. This method can be expected to deliver accurate results regarding the number and position of the areas of origin of the bloodstains; in particular, the vertical component is determined more precisely than with conventional methods. In both cases, the ballistic bloodstain pattern analysis enabled relevant forensic conclusions regarding the course of events.
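For context, the conventional straight-line ("stringing") estimate that the ballistic approach improves upon can be written down in a few lines; the sketch below is an illustrative reimplementation with made-up numbers, not the software used in the study. It derives the impact angle from the stain's width-to-length ratio and estimates the height of the blood source above the point of convergence.

```python
import math

def impact_angle(width_mm, length_mm):
    """Impact angle alpha from the ellipse axes of a bloodstain: sin(alpha) = width / length."""
    return math.asin(width_mm / length_mm)

def source_height(distance_to_convergence_m, width_mm, length_mm):
    """Straight-line (tangent) estimate of the blood source height above the convergence point.

    This ignores gravity and drag; a ballistic reconstruction, as used in the study,
    bends the trajectory and typically changes the estimated vertical component.
    """
    alpha = impact_angle(width_mm, length_mm)
    return distance_to_convergence_m * math.tan(alpha)

# toy example: a 6 mm x 11 mm stain found 1.2 m from the area of convergence
print(f"impact angle: {math.degrees(impact_angle(6, 11)):.1f} deg")
print(f"straight-line source height: {source_height(1.2, 6, 11):.2f} m")
```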
Abstract:
Bite mark analysis offers the opportunity to identify the biter based on the individual characteristics of the dentition. Normally, the main focus is on analysing bite mark injuries on human bodies, but bite marks in food may also play an important role in the forensic investigation of a crime. This study presents a comparison of simulated bite marks in different kinds of food with the dentitions of the presumed biters. Bite marks were produced by six adults in slices of buttered bread, apples, different kinds of Swiss chocolate and Swiss cheese. The influence of time lapse on the bite marks in food, under room temperature conditions, was also examined. For the documentation of the bite marks and the dentitions of the biters, 3D optical surface scanning technology was used. The comparison was performed using two different software packages: the ATOS modelling and analysis software and the 3D Studio Max animation software. The ATOS software enables an automatic computation of the deviation between two meshes. In the present study, the bite marks and the dentitions were compared, as were the meshes of each bite mark recorded at the different stages of time lapse. In the 3D Studio Max software, the act of biting was animated to compare the dentitions with the bite mark. The examined food recorded the individual characteristics of the dentitions very well. In all cases, the biter could be identified, and the dentitions of the other presumed biters could be excluded. The influence of the time lapse on the food depends on the kind of food and is shown in the diagrams. However, the identification of the biter could still be performed after a period of time, based on the recorded individual characteristics of the dentitions.
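A minimal sketch of the kind of mesh-to-mesh deviation computation mentioned above (not the ATOS implementation; the two scanned surfaces are represented simply as vertex arrays) is a nearest-neighbour distance between the surfaces:

```python
import numpy as np
from scipy.spatial import cKDTree

def mesh_deviation(vertices_a, vertices_b):
    """Distance from every vertex of mesh A to the closest vertex of mesh B.

    vertices_a, vertices_b: (N, 3) and (M, 3) arrays of 3D points sampled from the scans.
    Returns per-vertex deviations; summary statistics characterize how well the meshes match.
    """
    tree = cKDTree(vertices_b)
    distances, _ = tree.query(vertices_a)
    return distances

# toy usage with random point clouds standing in for the scanned surfaces
rng = np.random.default_rng(0)
bite_mark = rng.random((1000, 3))
dentition = bite_mark + rng.normal(scale=0.01, size=(1000, 3))  # nearly identical surface
dev = mesh_deviation(bite_mark, dentition)
print(f"mean deviation {dev.mean():.4f}, max deviation {dev.max():.4f}")
```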
Abstract:
Adjuvant chemotherapy decisions in breast cancer are increasingly based on the pathologist's assessment of tumor proliferation. The Swiss Working Group of Gyneco- and Breast Pathologists has surveyed inter- and intra-observer consistency of Ki-67-based proliferative fractions in breast carcinomas. Methods: Five pathologists evaluated the MIB-1 labeling index (LI) in ten breast carcinomas (G1, G2, G3) by counting and by eyeballing. In the same way, 15 pathologists from all over Switzerland then assessed MIB-1 LI on three G2 carcinomas, in self-selected or pre-defined areas of the tumors, comparing centrally immunostained slides with slides immunostained in the different laboratories. To study intra-observer variability, the same tumors were re-examined 4 months later. Results: The kappa values for the first series of ten carcinomas of various degrees of differentiation showed good to very good agreement for MIB-1 LI (kappa 0.56–0.72). However, we found very high inter-observer variability (kappa 0.04–0.14) in the read-outs of the G2 carcinomas. It was not possible to explain the inconsistencies exclusively by any of the following factors: (i) pathologists' divergent definitions of what counts as a positive nucleus, (ii) the mode of assessment (counting vs. eyeballing), (iii) the immunostaining technique, and (iv) the selection of the tumor area in which to count. Despite intensive confrontation of all participating pathologists with the problem, inter-observer agreement did not improve when the same slides were re-examined 4 months later (kappa 0.01–0.04), and intra-observer agreement was likewise poor (kappa 0.00–0.35). Conclusion: Assessment of mid-range Ki-67 LI suffers from high inter- and intra-observer variability. Oncologists should be aware of this caveat when using Ki-67 LI as a basis for treatment decisions in moderately differentiated breast carcinomas.
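For reference, the agreement statistic used throughout (Cohen's kappa for two raters) can be computed as sketched below; this is a generic textbook implementation with toy data, not the study's own analysis code.

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same cases into discrete categories."""
    ratings_a = np.asarray(ratings_a)
    ratings_b = np.asarray(ratings_b)
    categories = np.union1d(ratings_a, ratings_b)
    # observed agreement
    p_o = np.mean(ratings_a == ratings_b)
    # expected agreement from the marginal category frequencies
    p_e = sum(np.mean(ratings_a == c) * np.mean(ratings_b == c) for c in categories)
    return (p_o - p_e) / (1.0 - p_e)

# toy example: two pathologists classifying ten cases as "low" or "high" MIB-1 LI
rater1 = ["low", "low", "high", "high", "low", "high", "low", "low", "high", "low"]
rater2 = ["low", "high", "high", "high", "low", "high", "low", "low", "low", "low"]
print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")
```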
Abstract:
Background. No comprehensive systematic review has been published since 1998 on the frequency with which cancer patients use complementary and alternative medicine (CAM). Methods. The MEDLINE, AMED, and Embase databases were searched for surveys published up to January 2009. Surveys conducted in Australia, Canada, Europe, New Zealand, and the United States with at least 100 adult cancer patients were included. Detailed information on methods and results was independently extracted by 2 reviewers. Methodological quality was assessed using a criteria list developed according to the STROBE guideline. Exploratory random-effects meta-analysis and meta-regression were applied. Results. Studies from 18 countries (152 surveys; >65,000 cancer patients) were included. Heterogeneity of CAM use was high and to some extent explained by differences in survey methods. The combined prevalence of "current use" of CAM across all studies was 40%. The highest prevalence was found in the United States and the lowest in Italy and the Netherlands. Meta-analysis suggested an increase in CAM use from an estimated 25% in the 1970s and 1980s to more than 32% in the 1990s and to 49% after 2000. Conclusions. The overall prevalence of CAM use found was lower than often claimed. However, there was some evidence that use has increased considerably over the past years. Health care systems therefore ought to implement clear strategies for how to deal with this. To improve the validity and reporting of future surveys, the authors suggest criteria for methodological quality that should be fulfilled and reporting standards that should be required.
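The pooling step described above can be illustrated with a compact DerSimonian-Laird random-effects calculation on raw proportions (a simplified sketch with made-up numbers, not the review's actual data or exact model):

```python
import numpy as np

def random_effects_prevalence(events, sample_sizes):
    """DerSimonian-Laird random-effects pooling of prevalences (raw proportions)."""
    n = np.asarray(sample_sizes, dtype=float)
    p = np.asarray(events, dtype=float) / n
    v = p * (1 - p) / n                                  # within-study variance
    w = 1.0 / v                                          # fixed-effect weights
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)                   # Cochran's Q
    k = len(p)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                            # random-effects weights
    p_pooled = np.sum(w_star * p) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return p_pooled, (p_pooled - 1.96 * se, p_pooled + 1.96 * se)

# toy example: four hypothetical surveys reporting current CAM use
pooled, ci = random_effects_prevalence([45, 120, 60, 200], [150, 400, 100, 500])
print(f"pooled prevalence = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```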
Abstract:
The relative abundance of the heavy water isotopologue HDO provides a deeper insight into the atmospheric hydrological cycle. The SCanning Imaging Absorption spectroMeter for Atmospheric CartograpHY (SCIAMACHY) allows for global retrievals of the ratio HDO/H2O in the 2.3 micron wavelength range. However, the spectroscopy of water lines in this region remains a large source of uncertainty for these retrievals. We therefore evaluate and improve the water spectroscopy in the range 4174–4300 cm⁻¹ and test whether this reduces systematic uncertainties in the SCIAMACHY retrievals of HDO/H2O. We use a laboratory spectrum of water vapour to fit line intensity, air broadening and wavelength shift parameters. The improved spectroscopy is tested on a series of ground-based high-resolution FTS spectra as well as on SCIAMACHY retrievals of H2O and the ratio HDO/H2O. We find that the improved spectroscopy leads to lower residuals in the FTS spectra compared to the HITRAN 2008 and Jenouvrier et al. (2007) spectroscopy, and the retrievals become more robust against changes in the retrieval window. For both the FTS and SCIAMACHY measurements, the retrieved total H2O columns decrease by 2–4% and we find a negative shift of the HDO/H2O ratio, which for SCIAMACHY is partly compensated by changes in the retrieval setup and calibration software. The updated SCIAMACHY HDO/H2O product shows somewhat steeper latitudinal and temporal gradients and a steeper Rayleigh distillation curve, strengthening previous conclusions that current isotope-enabled general circulation models underestimate the variability in the near-surface HDO/H2O ratio.
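The spectroscopic parameter fit described above amounts to adjusting line intensity, width (broadening) and position until a modelled line shape matches the laboratory spectrum. A much-simplified single-line sketch (a Voigt profile fitted by least squares to a synthetic stand-in spectrum, not the actual retrieval code) looks like this:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import voigt_profile

def line_model(wavenumber, intensity, center, gauss_sigma, lorentz_gamma):
    """Simplified line shape: a Voigt profile at `center` (cm^-1) scaled by an intensity factor."""
    return intensity * voigt_profile(wavenumber - center, gauss_sigma, lorentz_gamma)

# synthetic "laboratory" spectrum standing in for the measured one
nu = np.linspace(4199.0, 4201.0, 400)
true_line = line_model(nu, 0.8, 4200.02, 0.01, 0.03)
rng = np.random.default_rng(1)
measured = true_line + rng.normal(scale=0.005, size=nu.size)

# fit intensity, line position (shift) and the Gaussian/Lorentzian widths (broadening)
p0 = [1.0, 4200.0, 0.02, 0.02]
bounds = ([0.0, 4199.0, 1e-4, 1e-4], [10.0, 4201.0, 1.0, 1.0])
popt, pcov = curve_fit(line_model, nu, measured, p0=p0, bounds=bounds)
print("fitted intensity, centre, sigma, gamma:", np.round(popt, 4))
```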
Abstract:
OBJECTIVES: A major problem occurring after bone grafting is resorption, which leads to insufficient bone volume and quality and may subsequently cause dental implant failure. A comparison of graft volume and bone density of iliac crest and calvarial transplants in animal studies demonstrates significantly lower resorption of bone grafts harvested from the skull. This paper is the first clinical study evaluating bone volume and density changes of calvarial split bone grafts after alveolar ridge reconstruction. MATERIAL AND METHODS: Bone volume and density were determined using CT scans and the software program Dicom Works in a total of 51 calvarial grafts after alveolar ridge augmentation in 15 patients. CT scans were taken in all 15 patients immediately after grafting (T0) and before implantation after a postoperative period of 6 months (T1). In five patients (26 calvarial grafts), a 1-year follow-up was performed (T2). RESULTS: A mean volume reduction of 16.2% at T1 (15 patients) and 19.2% at T2 (five patients) was observed. Bone density was high, about 1000 Hounsfield units, and did not change during the 1-year period. At the time of implantation, 41 transplants were classified as quality 1 bone and 10 as quality 2-3 bone. The grafting area and the grafting technique used (inlay or onlay graft) did not affect the postoperative bone volume reduction. Generalized osteoporosis did not increase the resorption rate of calvarial transplants. CONCLUSION: Based on these findings, calvarial split bone grafts are a promising alternative for alveolar ridge reconstruction in dental implantology.
Abstract:
OBJECTIVE: To compare the content covered by twelve obesity-specific health-status measures using the International Classification of Functioning, Disability and Health (ICF). DESIGN: Obesity-specific health-status measures were identified and then linked to the ICF separately by two trained health professionals according to standardized guidelines. The degree of agreement between the health professionals was calculated by means of the kappa (κ) statistic, and bootstrapped confidence intervals (CI) were calculated. The obesity-specific health-status measures were compared at the component and category levels of the ICF. MEASUREMENTS: Twelve condition-specific health-status measures were identified and included in this study, namely the obesity-related problem scale, the obesity eating problems scale, the obesity-related coping and obesity-related distress questionnaire, the impact of weight on quality of life questionnaire (short version), the health-related quality of life questionnaire, the obesity adjustment survey (short form), the short specific quality of life scale, the obesity-related well-being questionnaire, the bariatric analysis and reporting outcome system, the bariatric quality of life index, the obesity and weight loss quality of life questionnaire and the weight-related symptom measure. RESULTS: In the 280 items of these measures, a total of 413 concepts were identified and linked to 87 different ICF categories. The measures varied strongly in the number of concepts contained and the number of ICF categories used to map these concepts. Items on body functions varied from 12% in the obesity-related problem scale to 95% in the weight-related symptom measure. The estimated kappa coefficients ranged between 0.79 (CI: 0.72, 0.86) at the ICF component level and 0.97 (CI: 0.93, 1.0) at the third ICF level. CONCLUSION: The ICF proved highly useful for the content comparison of obesity-specific health-status measures. The results may provide clinicians and researchers with new insights when selecting health-status measures for clinical studies in obesity.
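The bootstrapped confidence interval for the linking agreement can be sketched as below (a generic resampling recipe with toy data, not the study's analysis script); scikit-learn's `cohen_kappa_score` stands in for the agreement statistic, and the example ICF codes are hypothetical.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def bootstrap_kappa_ci(codes_a, codes_b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for the kappa agreement between two linkers' ICF codes."""
    codes_a = np.asarray(codes_a)
    codes_b = np.asarray(codes_b)
    rng = np.random.default_rng(seed)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(codes_a), len(codes_a))  # resample concepts with replacement
        stats.append(cohen_kappa_score(codes_a[idx], codes_b[idx]))
    lower, upper = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return cohen_kappa_score(codes_a, codes_b), (lower, upper)

# toy example: ICF component codes ("b" body functions, "d" activities, "e" environment)
linker1 = ["b", "b", "d", "d", "e", "b", "d", "e", "b", "d"] * 5
linker2 = ["b", "b", "d", "e", "e", "b", "d", "e", "d", "d"] * 5
kappa, ci = bootstrap_kappa_ci(linker1, linker2)
print(f"kappa = {kappa:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```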
Abstract:
BACKGROUND: Randomized controlled trials (RCTs) are the best tool to evaluate the effectiveness of clinical interventions. The Consolidated Standards of Reporting Trials (CONSORT) statement was introduced in 1996 to improve the reporting of RCTs. We aimed to determine the extent of ambiguity and the quality of reporting, as assessed by adherence to the CONSORT statement, in published reports of RCTs involving patients with Hodgkin lymphoma from 1966 through 2002. METHODS: We analyzed 242 published full-text reports of RCTs in patients with Hodgkin lymphoma. Quality of reporting was assessed using a 14-item questionnaire based on the CONSORT checklist. Reporting was studied in two pre-CONSORT periods (1966-1988 and 1989-1995) and one post-CONSORT period (1996-2002). RESULTS: Only six of the 14 items were addressed in 75% or more of the studies in all three time periods. Most items that are necessary to assess the methodologic quality of a study were reported by fewer than 20% of the studies. Improvements over time were seen for some items, including the description of the statistical methods used, reporting of primary research outcomes, performance of power calculations, the method of randomization and allocation concealment, and having performed an intention-to-treat analysis. CONCLUSIONS: Despite recent improvements, reporting levels of CONSORT items in RCTs involving patients with Hodgkin lymphoma remain unsatisfactory. Further concerted action by journal editors, learned societies, and medical schools is necessary to make authors even more aware of the need to improve the reporting of RCTs in medical journals and to allow assessment of the validity of published clinical research.
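The kind of tabulation behind these results can be reproduced in a few lines (a sketch over hypothetical records, not the study's dataset): for each period, count the proportion of trial reports addressing each checklist item.

```python
from collections import defaultdict

def adherence_by_period(reports):
    """Percentage of reports addressing each CONSORT item, per publication period.

    `reports` is a list of dicts with hypothetical keys: 'period' and 'items'
    (a mapping of checklist item name -> True if the report addresses it).
    """
    counts = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for report in reports:
        totals[report["period"]] += 1
        for item, addressed in report["items"].items():
            counts[report["period"]][item] += int(addressed)
    return {period: {item: 100.0 * n / totals[period] for item, n in items.items()}
            for period, items in counts.items()}

# toy example with two made-up reports
reports = [
    {"period": "1966-1988", "items": {"randomization method": False, "primary outcome": True}},
    {"period": "1996-2002", "items": {"randomization method": True, "primary outcome": True}},
]
print(adherence_by_period(reports))
```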
Abstract:
The purpose of this research was to develop a working physical model of the focused plenoptic camera and to develop software that can process the measured image intensity, reconstruct it into a full-resolution image, and produce a depth map from the corresponding rendered image. The plenoptic camera is a specialized imaging system designed to acquire spatial, angular, and depth information in a single intensity measurement. This camera can also computationally refocus an image by adjusting the patch size used to reconstruct the image. The published methods have been vague and conflicting, so the motivation behind this research was to decipher the work that has been done in order to develop a working proof-of-concept model. This thesis outlines the theory behind the operation of the plenoptic camera and shows how the measured intensity from the image sensor can be turned into a full-resolution rendered image with its corresponding depth map. The depth map can be created by a cross-correlation of adjacent sub-images created by the microlenslet array (MLA). The full-resolution image reconstruction can be done by taking a patch from each MLA sub-image and piecing them together like a puzzle; the patch size determines which object plane will be in focus. This thesis also gives a rigorous explanation of the design constraints involved in building a plenoptic camera. Plenoptic camera data from Adobe were used to help with the development of the algorithms written to create a rendered image and its depth map. Finally, using the algorithms developed from these tests and the knowledge gained in developing the plenoptic camera, a working experimental system was built, which successfully generated a rendered image and its corresponding depth map.
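A compact sketch of the two reconstruction steps described above, patch-based rendering and depth from cross-correlation of adjacent sub-images, is given below. It is an illustrative reimplementation under simplifying assumptions (a perfectly regular MLA grid, square sub-images, and wrap-around shifts in the correlation), not the thesis code.

```python
import numpy as np

def render_full_resolution(raw, sub_size, patch_size):
    """Tile a central patch from every MLA sub-image into a full-resolution image.

    raw: 2D array whose shape is a multiple of sub_size in both directions.
    patch_size: size of the central patch; choosing it selects the in-focus object plane.
    """
    rows, cols = raw.shape[0] // sub_size, raw.shape[1] // sub_size
    start = (sub_size - patch_size) // 2
    out = np.empty((rows * patch_size, cols * patch_size), dtype=raw.dtype)
    for r in range(rows):
        for c in range(cols):
            sub = raw[r * sub_size:(r + 1) * sub_size, c * sub_size:(c + 1) * sub_size]
            out[r * patch_size:(r + 1) * patch_size,
                c * patch_size:(c + 1) * patch_size] = sub[start:start + patch_size,
                                                           start:start + patch_size]
    return out

def disparity(sub_a, sub_b, max_shift=8):
    """Horizontal disparity between adjacent sub-images via normalized cross-correlation."""
    a = (sub_a - sub_a.mean()).ravel()
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        shifted = np.roll(sub_b, s, axis=1)
        b = (shifted - shifted.mean()).ravel()
        score = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift  # related to the depth of the dominant object seen by the pair

# toy usage on a random "raw" image with 8x8 sub-images and 4-pixel patches
raw = np.random.default_rng(2).random((64, 64))
rendered = render_full_resolution(raw, sub_size=8, patch_size=4)
print(rendered.shape)                      # (32, 32)
print(disparity(raw[:8, :8], raw[:8, 8:16]))
```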