665 results for BENCHMARKING
Abstract:
Within Vodafone Italia, the entire reorganization process of the Financial Accounting department was followed. The activities that led the transition from the old to the new organization were: - business process analysis - internal customer interviews - benchmarking - monitoring of organizational KPIs - organizational design - communication plan - transition plan
Abstract:
This thesis presents new methods to simulate systems with hydrodynamic and electrostatic interactions. Part 1 is devoted to computer simulations of Brownian particles with hydrodynamic interactions. The main influence of the solvent on the dynamics of Brownian particles is that it mediates hydrodynamic interactions. In the method, this is simulated by numerical solution of the Navier-Stokes equation on a lattice. To this end, the Lattice-Boltzmann method is used, namely its D3Q19 version. This model is capable of simulating compressible flow, which gives us the advantage of treating dense systems, in particular away from thermal equilibrium. The Lattice-Boltzmann equation is coupled to the particles via a friction force. In addition to this force, acting on point particles, we construct another coupling force, which comes from the pressure tensor. The coupling is purely local, i.e. the algorithm scales linearly with the total number of particles. In order to be able to map the physical properties of the Lattice-Boltzmann fluid onto a Molecular Dynamics (MD) fluid, the case of an almost incompressible flow is considered. The Fluctuation-Dissipation theorem for the hybrid coupling is analyzed, and a geometric interpretation of the friction coefficient in terms of a Stokes radius is given. Part 2 is devoted to the simulation of charged particles. We present a novel method for obtaining Coulomb interactions as the potential of mean force between charges which are dynamically coupled to a local electromagnetic field. This algorithm also scales linearly. We focus on the Molecular Dynamics version of the method and show that it is intimately related to the Car-Parrinello approach, while being equivalent to solving Maxwell's equations with a freely adjustable speed of light. The Lagrangian formulation of the coupled particles-fields system is derived. The quasi-Hamiltonian dynamics of the system is studied in great detail.
For implementation on the computer, the equations of motion are discretized with respect to both space and time. The discretization of the electromagnetic fields on a lattice, as well as the interpolation of the particle charges onto the lattice, is given. The algorithm is as local as possible: only nearest-neighbor lattice sites interact with a charged particle. Unphysical self-energies arise as a result of the lattice interpolation of charges and are corrected by a subtraction scheme based on the exact lattice Green's function. The method allows easy parallelization using standard domain decomposition. Some benchmarking results of the algorithm are presented and discussed.
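The friction coupling described above, together with the random force required by the fluctuation-dissipation theorem, can be sketched in a minimal form; `gamma`, `kT`, and the interpolated fluid velocity are generic placeholders, not the thesis's actual parameter values:

```python
import numpy as np

def friction_coupling_force(v_particle, u_fluid, gamma, kT, dt, rng):
    """Dissipative friction force plus the matching random force.

    The friction -gamma * (v - u) couples a point particle to the locally
    interpolated fluid velocity u; the random-force amplitude follows from
    the fluctuation-dissipation theorem, <R_i R_j> = (2 kT gamma / dt) delta_ij.
    """
    drag = -gamma * (v_particle - u_fluid)
    noise = rng.normal(0.0, np.sqrt(2.0 * kT * gamma / dt), size=3)
    return drag + noise

rng = np.random.default_rng(0)
# At kT = 0 the random force vanishes and only the drag term remains.
f = friction_coupling_force(np.array([0.1, 0.0, 0.0]),
                            np.zeros(3), gamma=1.0, kT=0.0, dt=0.01, rng=rng)
```

Because the fluid velocity is interpolated only from nearby lattice sites, this coupling stays local and the cost grows linearly with the particle count.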
Abstract:
Recent developments in the theory of plasma-based collisionally excited x-ray lasers (XRL) have shown an optimization potential based on the dependence of the absorption region of the pumping laser on its angle of incidence on the plasma. For the experimental proof of this idea, a number of diagnostic schemes were developed, tested, qualified and applied. A high-resolution imaging system, yielding the keV emission profile perpendicular to the target surface, provided positions of the hottest plasma regions, which are of interest for the benchmarking of plasma simulation codes. The implementation of a highly efficient spectrometer for the plasma emission made it possible to gain information about the abundance of the ionization states necessary for the laser action in the plasma. The intensity distribution and deflection angle of the pump laser beam could be imaged for single XRL shots, giving access to its refraction process within the plasma. During a European collaboration campaign at the Lund Laser Center, Sweden, the optimization of the pumping laser incidence angle resulted in a reduction of the required pumping energy for a Ni-like Mo XRL, which enabled operation at a repetition rate of 10 Hz. Using the experience gained there, the XRL performance at the PHELIX facility (GSI Darmstadt) was significantly improved with respect to the achievable repetition rate and to operation at wavelengths below 20 nm, and important information for the development towards multi-100 eV plasma XRLs was acquired. Due to the setup improvements achieved during the work for this thesis, the PHELIX XRL system has now reached a degree of reproducibility and versatility sufficient for demanding applications such as XRL spectroscopy of heavy ions. In addition, a European research campaign aiming towards plasma XRLs approaching the water window (wavelengths below 5 nm) was initiated.
Abstract:
China is a large country characterized by remarkable growth and distinct regional diversity. Spatial disparity has always been a hot issue, since China has been struggling to follow a balanced growth path while still confronting unprecedented pressures and challenges. To better understand the level of inequality across the spatial distributions of Chinese provinces and municipalities, and to estimate the dynamic trajectory of sustainable development in China, I constructed the Composite Index of Regional Development (CIRD) with five sub-pillars/dimensions: the Macroeconomic Index (MEI), Science and Innovation Index (SCI), Environmental Sustainability Index (ESI), Human Capital Index (HCI) and Public Facilities Index (PFI), endeavoring to cover the various fields of regional socioeconomic development. Ranking reports on the five sub-dimensions and the aggregated CIRD were provided in order to better measure the degree of development of 31 (or 30) Chinese provinces and municipalities over the 13 years from 1998 to 2010, the time interval of three "Five-Year Plans". Further empirical applications of the CIRD focused on clustering and convergence estimation, attempting to fill the gap in quantifying the levels of comprehensive regional socioeconomic development and estimating the dynamic convergence trajectory of regional sustainable development in the long run. Four geographically oriented clusters were benchmarked on the map on the basis of cluster analysis, and club convergence was observed among the Chinese provinces and municipalities based on stochastic kernel density estimation.
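The aggregation of five sub-indices into a composite score can be illustrated with a minimal sketch; the min-max normalization and equal weights here are common-practice assumptions, not necessarily the CIRD's exact construction:

```python
import numpy as np

def composite_index(sub_indices, weights=None):
    """Aggregate sub-index scores (rows: regions, columns: dimensions).

    Each column is min-max normalized to [0, 1] (columns must not be
    constant), then combined as a weighted average; equal weights are
    used when none are given.
    """
    x = np.asarray(sub_indices, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    norm = (x - lo) / (hi - lo)
    if weights is None:
        weights = np.full(x.shape[1], 1.0 / x.shape[1])
    return norm @ weights

# Three hypothetical regions scored on five dimensions (MEI .. PFI)
scores = composite_index([[1, 2, 3, 4, 5],
                          [5, 4, 1, 2, 3],
                          [3, 3, 2, 3, 4]])
```

Rankings then follow by sorting regions on the composite score, and the same normalized matrix can feed cluster analysis directly.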
Abstract:
The goal of this thesis is to compare two worlds: that of relational DBMSs and that of graph DBMSs, with the aim of better understanding the latter. To this end, the two technologies that best represent their respective worlds were chosen: Oracle for RDBMSs and Neo4j for graph DBMSs. The two DBMSs were subjected to a series of queries designed to test performance as certain factors vary, such as selectivity, the number of joins Oracle performs, etc. The tests carried out fall within the field of business intelligence, and in particular OLAP analysis.
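A generic timing harness like the following sketch can serve for this kind of head-to-head query benchmark; the query callable is a stand-in, since the actual Oracle and Neo4j drivers and queries are not shown in the abstract:

```python
import statistics
import time

def benchmark(run_query, repetitions=5, warmup=1):
    """Time a query callable, discarding warm-up runs (cache effects).

    Returns the median wall-clock time in seconds; the median is more
    robust than the mean against outliers caused by background load.
    """
    for _ in range(warmup):
        run_query()
    timings = []
    for _ in range(repetitions):
        t0 = time.perf_counter()
        run_query()
        timings.append(time.perf_counter() - t0)
    return statistics.median(timings)

# Hypothetical usage: wrap an Oracle or Neo4j query execution in a closure
elapsed = benchmark(lambda: sum(range(100_000)))
```

Running the same harness over both systems while varying one factor at a time (selectivity, join count) keeps the comparison controlled.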
Abstract:
Assessment of the opportunity to conduct electricity trading in the CEE area. A market analysis project for the company "C.U.Ra", an electricity market operator, on the opportunity to do energy business in Poland and the CEE countries. Activities carried out: • Benchmarking of mature energy markets • Analysis of the Italian electricity market • Analysis of electricity trading activity: Profit & Loss, Risk Management • Feasibility analysis • Sensitivity (what-if) analysis: parametric variation of the opportunity • Business start-up and partnership with a Polish consulting firm
Abstract:
Healthcare in Italy and worldwide is characterized by high and continuously growing needs, countered by the current crisis in economic resources, which forces the System to evaluate choices such as reducing or restructuring public healthcare provision. The idea behind this work, born within the Istituto Scientifico Romagnolo per lo Studio e la Cura dei Tumori (IRST) IRCCS, is to approach this problem from the perspective of performance improvement rather than service reduction, in the conviction that there is substantial room for improvement. For these reasons, the need was identified to develop a method and a software application for identifying diagnostic-therapeutic care pathways (PDTA), collecting data from the facilities involved, and analyzing costs and outcomes, aiming at a cost-effectiveness and benchmarking analysis oriented toward taking charge of health needs. The thesis describes the requirements gathering and analysis phase, including user profiling and a description of some salient dynamic aspects, the conceptual design phase (Entity/Relationship schema, glossary and data volumes), the logical design phase, and the prototyping of the user interface. It also reports an estimate of development time obtained with the use-case points method. The designed application is undergoing a feasibility assessment at IRST, which has used some of the methodologies described in the thesis to describe the breast cancer pathway and present its first results within a research project in collaboration with the Agenzia Nazionale per i Servizi Sanitari Regionali (Agenas).
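The development-time estimate mentioned above relies on the use-case points method; a minimal sketch of Karner's classic formula follows, with all input values hypothetical (the thesis's actual factor weights are not reproduced here):

```python
def use_case_points(uaw, uucw, tfactor, efactor, hours_per_ucp=20.0):
    """Karner's use-case points effort estimate.

    uaw: unadjusted actor weight; uucw: unadjusted use-case weight;
    tfactor / efactor: weighted sums of the technical and environmental
    factors. Returns (adjusted UCP, estimated effort in person-hours).
    """
    tcf = 0.6 + 0.01 * tfactor      # technical complexity factor
    ecf = 1.4 - 0.03 * efactor      # environmental complexity factor
    ucp = (uaw + uucw) * tcf * ecf
    return ucp, ucp * hours_per_ucp

# Hypothetical project: 9 actor points, 50 use-case points
ucp, hours = use_case_points(uaw=9, uucw=50, tfactor=40, efactor=20)
```

The 20 hours-per-UCP productivity factor is the commonly cited default; real projects calibrate it from historical data.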
Abstract:
Modern ESI-LC-MS/MS techniques, combined with bottom-up approaches, allow the qualitative and quantitative characterization of several thousand proteins in a single experiment. Data-independent acquisition methods such as MSE and the ion-mobility variants HDMSE and UDMSE are particularly well suited to label-free protein quantification. Because of their high complexity, the data acquired in this way place special demands on the analysis software. Until now, quantitative analysis of MSE/HDMSE/UDMSE data has been limited to a few commercial solutions.
In this work, a strategy and a series of new methods for the cross-run quantitative analysis of label-free MSE/HDMSE/UDMSE data were developed and implemented as the software ISOQuant. The commercial software PLGS is used for the first steps of the data analysis (feature detection, peptide and protein identification). The independent PLGS results of all runs of an experiment are then merged in a relational database and reprocessed with dedicated algorithms (retention-time alignment, feature clustering, multidimensional intensity normalization, multi-stage data filtering, protein inference, redistribution of the intensities of shared peptides, protein quantification). This post-processing significantly increases the reproducibility of the qualitative and quantitative results.
To evaluate the performance of the quantitative data analysis and compare it with other solutions, a set of exactly defined hybrid-proteome samples was developed. The samples were acquired with the MSE and UDMSE methods, then analyzed with Progenesis QIP, synapter and ISOQuant, and the results compared. In contrast to synapter and Progenesis QIP, ISOQuant achieved both high reproducibility of protein identification and high precision and accuracy of protein quantification. In conclusion, the presented algorithms and analysis workflow enable reliable and reproducible quantitative data analyses. With the software ISOQuant, a simple and efficient tool for routine high-throughput analyses of label-free MSE/HDMSE/UDMSE data was developed. Together, the hybrid-proteome samples and the evaluation metrics constitute a comprehensive system for evaluating quantitative acquisition and data-analysis systems.
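The cross-run intensity normalization step can be illustrated with a deliberately simple, one-dimensional sketch; ISOQuant's actual multidimensional normalization is considerably richer than this median scaling:

```python
import numpy as np

def normalize_runs(intensity_matrix):
    """Median-based cross-run intensity normalization.

    Rows are features, columns are LC-MS runs. Each run is rescaled so
    that its median feature intensity matches the grand median across
    runs, removing global loading/ionization differences between runs.
    """
    x = np.asarray(intensity_matrix, dtype=float)
    run_medians = np.median(x, axis=0)
    grand = np.median(run_medians)
    return x * (grand / run_medians)

# Two hypothetical runs where run 2 was injected at double the amount
normed = normalize_runs([[100, 200],
                         [300, 600],
                         [500, 1000]])
```

After scaling, between-run intensity ratios reflect genuine abundance differences rather than systematic run-level offsets.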
Abstract:
This is the third paper Kresl and Singh have published on this subject. The first was prepared for an OECD conference and published in 1995; the second was published in Urban Studies in 1999. Hence, in this most recent study they can examine urban competitiveness in the US over a period of three decades. Their methodology is distinctive in that it is statistical rather than subjective, as is the case with studies that use a benchmarking or structural methodology. Their results can be used by city planners in the design of a strategic-economic plan. They also capture the major changes in broad regional competitiveness.
Abstract:
Spine Tango is currently the only international spine registry in existence. It was developed under the auspices of Eurospine, the Spine Society of Europe, and is hosted at the University of Bern, Switzerland. The HJD Spine Center successfully tested Spine Tango during a 3-month pilot study and has since expanded documentation activities to more surgeons. Workflow integration and dedicated research staff are key factors for such an endeavor. Participation enables benchmarking against national and international peers, outcome research, and quality assurance of surgical and non-surgical treatments.
Abstract:
Software repositories have been getting a lot of attention from researchers in recent years. In order to analyze software repositories, it is necessary to first extract raw data from the version control and problem tracking systems. This poses two challenges: (1) extraction requires a non-trivial effort, and (2) the results depend on the heuristics used during extraction. These challenges burden researchers who are new to the community and make it difficult to benchmark software repository mining, since it is almost impossible to reproduce experiments done by another team. In this paper we present the TA-RE corpus. TA-RE collects extracted data from software repositories in order to build a collection of projects that will simplify the extraction process. Additionally, the collection can be used for benchmarking. As the first step, we propose an exchange language capable of making sharing and reusing data as simple as possible.
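A hedged sketch of what such an exchange record might look like, using JSON as the carrier; all field names are illustrative assumptions, not the actual TA-RE exchange language:

```python
import json

def commit_record(commit_id, author, timestamp, message, files):
    """Build one commit entry in a hypothetical TA-RE-style exchange format.

    The point of a shared format is that every research team serializes
    the same facts the same way, so extraction heuristics stop being a
    hidden variable between experiments.
    """
    return {
        "commit": commit_id,
        "author": author,
        "timestamp": timestamp,   # ISO 8601 keeps records tool-agnostic
        "message": message,
        "files": files,
    }

record = commit_record("a1b2c3", "alice", "2006-05-01T12:00:00Z",
                       "Fix null check", ["src/Parser.java"])
serialized = json.dumps(record, sort_keys=True)   # stable, diff-friendly form
restored = json.loads(serialized)
```

A lossless round trip (serialize, then parse back to the identical record) is exactly the property that makes sharing and reusing extracted data practical.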
Abstract:
Currently, photon Monte Carlo treatment planning (MCTP) for a patient stored in the patient database of a treatment planning system (TPS) can usually only be performed using a cumbersome multi-step procedure requiring many user interactions. This means automation is needed for use in clinical routine. In addition, because of the long computing time in MCTP, optimization of the MC calculations is essential. For these purposes a new graphical user interface (GUI)-based photon MC environment has been developed, resulting in a very flexible framework. By this means, appropriate MC transport methods are assigned to different geometric regions while still benefiting from the features included in the TPS. In order to provide a flexible MC environment, the MC particle transport has been divided into different parts: the source, the beam modifiers and the patient. The source part includes the phase-space source, source models and full MC transport through the treatment head. The beam modifier part consists of one module for each beam modifier. To simulate the radiation transport through each individual beam modifier, one of three full MC transport codes can be selected independently. Additionally, for each beam modifier a simple or an exact geometry can be chosen. Thereby, different complexity levels of radiation transport are applied during the simulation. For the patient dose calculation, two different MC codes are available. A special plug-in in Eclipse, providing all necessary information by means of Dicom streams, was used to start the developed MC GUI. The implementation of this framework separates the MC transport from the geometry, and the modules pass the particles in memory; hence, no files are used as the interface. The implementation is realized for 6 and 15 MV beams of a Varian Clinac 2300 C/D. Several applications demonstrate the usefulness of the framework. Apart from applications dealing with the beam modifiers, two patient cases are shown.
In these cases, comparisons are performed between MC-calculated dose distributions and those calculated by a pencil-beam or the AAA algorithm. Interfacing this flexible and efficient MC environment with Eclipse allows widespread use for all kinds of investigations, from timing and benchmarking studies to clinical patient studies. Additionally, it is possible to add modules, keeping the system highly flexible and efficient.
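The in-memory module chaining described above (source, beam modifiers, patient, with no intermediate files) can be sketched generically; the example modules below are hypothetical placeholders, not the framework's actual transport codes:

```python
def run_pipeline(particles, modules):
    """Pass a particle batch through source/modifier/patient modules in memory.

    Each module is a callable taking and returning a list of particles;
    chaining them in memory avoids writing intermediate phase-space files,
    which is the interface choice the framework makes.
    """
    for module in modules:
        particles = module(particles)
    return particles

# Hypothetical modules: a jaw that clips lateral positions,
# and an attenuator that halves each particle's statistical weight.
def jaw(particles):
    return [p for p in particles if abs(p["x"]) < 5.0]

def attenuator(particles):
    return [{**p, "w": p["w"] * 0.5} for p in particles]

out = run_pipeline([{"x": 1.0, "w": 1.0}, {"x": 9.0, "w": 1.0}],
                   [jaw, attenuator])
```

Because every module shares the same in-memory interface, swapping a simple geometry for an exact one, or one transport code for another, only replaces one callable in the chain.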
Abstract:
Detailed knowledge of the characteristics of the radiation field shaped by a multileaf collimator (MLC) is essential in intensity modulated radiotherapy (IMRT). A previously developed multiple source model (MSM) for a 6 MV beam was extended to a 15 MV beam and supplemented with an accurate model of an 80-leaf dynamic MLC. Using the supplemented MSM and the MC code GEANT, lateral dose distributions were calculated in a water phantom and a portal water phantom. A field which is normally used for the validation of the step-and-shoot technique and a field from a realistic IMRT treatment plan delivered with the dynamic MLC were investigated. To assess possible spectral changes caused by the modulation of beam intensity by an MLC, the energy spectra in five portal planes were calculated for moving slits of different widths. The extension of the MSM to 15 MV was validated by analysing energy fluences, depth doses and dose profiles. In addition, the MC-calculated primary energy spectrum was verified against an energy spectrum reconstructed from transmission measurements. MC-calculated dose profiles using the MSM for the step-and-shoot case and for the dynamic MLC case are in very good agreement with the measured data from film dosimetry. The investigation of a 13 cm wide field shows an increase in mean photon energy for the 0.25 cm slit compared to the open beam of up to 16% at 6 MV and up to 6% at 15 MV. In conclusion, the MSM supplemented with the dynamic MLC has proven to be a powerful tool for investigational and benchmarking purposes, or even for dose calculations in IMRT.
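The reported spectral hardening comes down to comparing fluence-weighted mean energies of the open-beam and slit spectra; a minimal sketch with made-up spectra (not the measured data) illustrates the computation:

```python
import numpy as np

def mean_energy(energies, fluence):
    """Fluence-weighted mean photon energy of a spectrum."""
    e = np.asarray(energies, dtype=float)
    phi = np.asarray(fluence, dtype=float)
    return float((e * phi).sum() / phi.sum())

# Toy 3-bin spectra: a narrow slit filters out low-energy photons,
# shifting the fluence toward the high-energy bins (beam hardening).
open_beam = mean_energy([1, 2, 3], [3, 2, 1])   # soft, open-field spectrum
slit_beam = mean_energy([1, 2, 3], [1, 2, 3])   # hardened slit spectrum
hardening = (slit_beam / open_beam - 1) * 100   # percent increase in mean energy
```

The thesis's 16% (6 MV) and 6% (15 MV) figures are exactly this ratio, evaluated on the MC-calculated portal-plane spectra.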
Abstract:
PURPOSE OF REVIEW: Intensive care medicine consumes a high share of healthcare costs, and there is growing pressure to use the scarce resources efficiently. Accordingly, organizational issues and quality management have become an important focus of interest in recent years. Here, we will review current concepts of how outcome data can be used to identify areas requiring action. RECENT FINDINGS: Using recently established models of outcome assessment, wide variability between individual ICUs is found, both with respect to outcome and resource use. Such variability implies that there are large differences in patient care processes not only within the ICU but also in pre-ICU and post-ICU care. Indeed, measures to improve the patient process in the ICU (including care of the critically ill, patient safety, and management of the ICU) have been presented in a number of recently published papers. SUMMARY: Outcome assessment models provide an important framework for benchmarking. They may help the individual ICU to spot appropriate fields of action, plan and initiate quality improvement projects, and monitor the consequences of such activity.
Abstract:
In Panama, one of the Environmental Health (EH) Sector's primary goals is to improve the health of rural Panamanians by helping them to adopt behaviors and practices that improve access to and use of sanitation systems. In pursuit of this goal, the EH sector has used participatory development models to improve hygiene and increase access to latrines through volunteer-managed latrine construction projects. Unfortunately, there is little understanding of the long-term sustainability of these interventions after the volunteers have completed their service. With the Peace Corps adapting its Monitoring, Reporting, and Evaluation procedures, it is appropriate to evaluate the sustainability of sanitation interventions and to offer recommendations for adapting the EH training program, project management, and evaluation procedures. Recognizing the need for evaluation of past latrine projects, the author performed a post-project assessment of 19 pit latrine projects using participatory analysis methodologies. First, the author surveyed volunteers' perspectives on pit latrine projects. Then, for comparison, the author performed a survey of latrine projects using a benchmarking scoring system to rate solid waste management, drainage, latrine siting, latrine condition, and hygiene. The Sanitation WASH matrix created by the author proved an effective tool for evaluating the efficacy of sanitation interventions. Overall, more than 75% of the latrines constructed were in use. However, there were some areas where improvements could be made in both latrine construction and health and hygiene. The latrines scored poorly on the indicators related to the privacy structure and seat covers. Interestingly, those are the two items least likely to be included in project subsidies.
Furthermore, scores for hygiene-related indicators were low, particularly those related to hand washing and cleanliness of the kitchen, indicating potential for improvement in hygiene education. Based on these outcomes, the EH sector should consider including subsidies and standardized designs for privacy structures and seat covers for latrines. In addition, the universal adoption of contracts and/or deposits for project beneficiaries is expected to improve the completion of latrines. In order to address the low scores on the health and hygiene indicators, the EH sector should adapt volunteer training, in addition to standardizing health and hygiene intervention procedures. In doing so, the sector should mimic the Community Health Club model, which has shown success in improving health and hygiene indicators, and use a training session plan format similar to those in the Water Committee Seminar manual. Finally, the sector should have an experienced volunteer dedicated to program oversight and post-project monitoring and evaluation.