791 results for Computing methodologies
Abstract:
In the field of Information Technology, information systems have recently emerged for managing hardware and software resources that are distributed and made available over the network. One of the most widely used and commercialised tools for this kind of technology is cloud computing. According to research by "Il Sole 24 Ore", 25% of Italian companies intend to adopt the cloud within the next 12 months. It was a 287-million-euro market in 2011, up 41% on 2010, and is expected to grow to 394 million in 2012 and then to 671 million in 2014. This thesis builds on a preliminary research effort in which I examined companies' experiences with, and reflections on, the application and use of cloud technology as a business model. The work was carried out by reading and analysing two Italian newspapers (Il Corriere della Sera and Il Sole 24 Ore), a British daily (the Financial Times) and a London-based weekly (The Economist) over the past two years. From the analysis of the articles collected, a synthesis was drawn up, leading to the reflection that provided the starting point for this thesis: cloud-related problems were discussed often, but only a few articles presented a genuine case history with an analysis of the difficulties or benefits encountered. This motivated the present work, whose objective is to understand, at least in part, why there is so much reluctance towards a tool that appears to be the most technologically appropriate and strategically optimal choice. The core of the research consists of interviews with a number of companies about the use of the "cloud" in their information systems. The thesis is structured as follows: a historical account of the birth and development of cloud computing; an analysis of the currently available technologies and deployment models; the opportunities and threats involved in adopting this technology in a business environment; a study and analysis of selected company cases and of the role cloud adoption plays in their business models; and an assessment of the current state of cloud computing and of the future prospects for this technology.
Abstract:
Single-core microprocessors (CPUs) enjoyed rapid performance growth and falling costs for roughly twenty years. These microprocessors brought computing power on the order of GFLOPS (giga floating-point operations per second) to desktop PCs and hundreds of GFLOPS to server clusters. This rise enabled new program functionality, better user interfaces and many other benefits. However, growth slowed abruptly in 2003 because of ever-higher energy consumption and heat-dissipation problems, which prevented further increases in clock frequency: the physical limits of silicon were drawing closer. To work around the problem, CPU (Central Processing Unit) manufacturers began designing multicore microprocessors, a choice that had a considerable impact on the developer community, accustomed to thinking of software as a series of sequential instructions. Programs that had always benefited from performance gains with each new CPU generation stopped getting faster, because they ran on a single core and could not exploit the full power of the CPU. To take full advantage of the new CPUs, concurrent programming, previously confined to expensive systems and supercomputers, became an increasingly common practice among developers. At the same time, the video-game industry captured a remarkable share of the market: in 2013 alone, nearly 100 billion dollars will be spent on gaming hardware and software. To make their titles more attractive, game software houses rely on ever more powerful and often poorly optimised graphics engines, which makes the games extremely demanding in terms of performance. For this reason GPU (Graphics Processing Unit) manufacturers, especially over the last decade, have engaged in a genuine performance race that has produced devices with staggering computing power. But unlike CPUs, which at the beginning of the 2000s took the multicore route in order to keep supporting sequential programs, GPUs have become manycore, with hundreds upon hundreds of small cores that execute computations in parallel. Can this immense computing capacity be used in other application fields? The answer is yes, and the goal of this thesis is precisely to assess, as things stand today, how and with what efficiency a general-purpose program can make use of the GPU instead of the CPU.
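As an illustration of the kind of data-parallel offloading this thesis investigates (a minimal sketch, not code from the thesis), the example below runs the same element-wise operation on the CPU with NumPy and on the GPU with the CuPy library, whose array API mirrors NumPy's; the array size and the availability of a CUDA-capable GPU are assumptions.

```python
# Minimal sketch: the same element-wise computation on CPU (NumPy) and GPU (CuPy).
# Speedups depend entirely on the hardware; this only shows the offloading pattern.
import numpy as np

try:
    import cupy as cp          # GPU path; requires an NVIDIA GPU with CUDA
    HAS_GPU = True
except ImportError:
    HAS_GPU = False

def saxpy_cpu(a, x, y):
    """y = a*x + y evaluated by the CPU backend."""
    return a * x + y

def saxpy_gpu(a, x, y):
    """Same operation, executed by many GPU cores in parallel."""
    x_d, y_d = cp.asarray(x), cp.asarray(y)   # copy host -> device
    out_d = a * x_d + y_d                     # element-wise kernel on the GPU
    return cp.asnumpy(out_d)                  # copy device -> host

if __name__ == "__main__":
    n = 10_000_000
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)
    res_cpu = saxpy_cpu(2.0, x, y)
    if HAS_GPU:
        res_gpu = saxpy_gpu(2.0, x, y)
        print("max abs diff:", float(np.max(np.abs(res_cpu - res_gpu))))
```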
Abstract:
Throughout the twentieth century statistical methods have increasingly become part of experimental research. In particular, statistics has made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than taming variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher’s methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show under several perspectives the interaction of statistics, computing and information technologies, giving on the one hand an overview of the main tools – mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers – adopted in the period, and on the other pointing out how these tools complemented each other and were instrumental for the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s, the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory, and the statisticians examined are Ronald Fisher and Frank Yates.
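For reference, the standard one-way decomposition on which Fisher's analysis of variance rests (general statistical background, not taken from the thesis) is:

```latex
% One-way analysis of variance: the total sum of squares splits into
% between-treatment and within-treatment (residual) components.
\[
\sum_{i=1}^{k}\sum_{j=1}^{n_i}\bigl(y_{ij}-\bar{y}_{\cdot\cdot}\bigr)^2
= \sum_{i=1}^{k} n_i\bigl(\bar{y}_{i\cdot}-\bar{y}_{\cdot\cdot}\bigr)^2
+ \sum_{i=1}^{k}\sum_{j=1}^{n_i}\bigl(y_{ij}-\bar{y}_{i\cdot}\bigr)^2 ,
\qquad
F=\frac{\mathrm{SS}_{\text{between}}/(k-1)}{\mathrm{SS}_{\text{within}}/(N-k)} .
\]
```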
Abstract:
This study focuses on the use of metabonomic methods to measure fish freshness in various species and to evaluate how the fish are stored. The metabonomic approach is innovative and is based on molecular profiling through nuclear magnetic resonance (NMR). On the one hand, the aim is to ascertain whether a fish has maintained, within certain limits, its sensory and nutritional characteristics after being caught; on the other, the research tracks the alterations in the product’s composition. The spectroscopic data obtained by experimental 1H-NMR on the molecular profiles of the fish extracts are compared with those obtained on the same samples by the conventional analytical methods currently in use. These conventional methods derive chemical indices of freshness from the biochemical and microbial degradation of nitrogen compounds, both protein-derived and not (trimethylamine, N-(CH3)3, nucleotides, amino acids, etc.). Subsequently, a principal component analysis (PCA) and a partial least squares discriminant analysis (PLS-DA) are performed within the metabonomic approach to condense the temporal evolution of freshness into a single parameter. In particular, under both storage conditions (4 °C and 0 °C) the first principal component (PC1) captures, with a very high share of the variance, the part of the samples’ molecular composition (as seen in the 1H-NMR spectrum) that evolves during storage. The results of this study provide scientific evidence in support of objective criteria for evaluating the freshness of fish products, indicating which products can be labeled “fresh fish.”
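The following minimal sketch (illustrative only, with synthetic spectra standing in for the real 1H-NMR data) shows how PCA can condense a matrix of spectra into a first principal component that tracks storage time, in the spirit of the PC1 freshness index described above.

```python
# Illustrative sketch only (not the thesis code): condensing a set of spectra
# into a single "freshness" coordinate via PCA. The spectra here are synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_bins = 30, 200          # hypothetical: 30 extracts, 200 spectral bins
storage_days = np.linspace(0, 15, n_samples)

# Synthetic spectra whose composition drifts with storage time, plus noise
drift = np.outer(storage_days, rng.normal(size=n_bins))
spectra = drift + rng.normal(scale=5.0, size=(n_samples, n_bins))

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
pc1 = scores[:, 0]                   # first principal component

# PC1 should track storage time if it captures the compositional evolution
print("corr(PC1, storage days) =", round(abs(np.corrcoef(pc1, storage_days)[0, 1]), 3))
```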
Abstract:
MFA and LCA methodologies were applied to analyse the anthropogenic aluminium cycle in Italy, with a focus on the historical evolution of stocks and flows of the metal, embodied GHG emissions, and the potential of recycling, in order to provide Italy with key elements for prioritising industrial policy toward low-carbon technologies and materials. Historical time series were collected from 1947 to 2009 and balanced with data from the production, manufacturing and waste management of aluminium-containing products, using a ‘top-down’ approach to quantify the contemporary in-use stock of the metal and to help identify ‘applications where aluminium is not yet being recycled to its full potential and to identify present and future recycling flows’. The MFA results were used as the basis for the LCA, which was aimed at evaluating the evolution of the carbon footprint embodied in Italian aluminium, accounting for primary and electrical energy, the smelting process and transportation. The study also discusses how the main factors of the Kaya Identity influenced the Italian GHG emissions pattern over time, and which levers are available to mitigate it. The contemporary anthropogenic reservoir of aluminium was estimated at about 320 kg per capita, mainly embedded in the transportation and the building and construction sectors. The cumulative in-use stock represents approximately 11 years of supply at current usage rates (about 20 Mt versus 1.7 Mt/year), and it would imply a potential saving of about 160 Mt of CO2eq emissions. A discussion of the criticalities of aluminium waste recovery from the transportation and the containers and packaging sectors is also included, providing an example of how MFA and LCA may support decision-making at the sectorial or regional level. The research constitutes the first attempt at an integrated MFA-LCA approach applied to the aluminium cycle in Italy.
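The Kaya Identity referred to above, in its standard form, factors emissions into population, affluence, energy intensity and carbon intensity:

```latex
% Kaya identity: emissions as the product of population, per-capita GDP,
% energy intensity of GDP and carbon intensity of energy.
\[
\mathrm{CO_2} \;=\; P \times \frac{\mathrm{GDP}}{P} \times \frac{E}{\mathrm{GDP}} \times \frac{\mathrm{CO_2}}{E}
\]
```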
Abstract:
This thesis reports an integrated analytical approach for the study of the physicochemical and biological properties of new synthetic bile acid (BA) analogues that act as agonists of the FXR and TGR5 receptors. Structure-activity data were compared with those previously obtained, using the same experimental protocols, on synthetic and naturally occurring BA. The new synthetic BA analogues are classified into different groups, also according to their potency as FXR and TGR5 agonists: unconjugated and steroid-modified BA, and side-chain-modified BA including taurine or glycine conjugates and pseudo-conjugates (sulphonate and sulphate analogues). In order to investigate the relationship between structure and activity, the synthetic analogues were subjected to physicochemical characterisation and to a preliminary screening of their pharmacokinetics and metabolism using a bile fistula rat model. Sensitive and accurate analytical methods have been developed for the qualitative and quantitative analysis of BA in the biological fluids and in the samples used for the physicochemical studies. High Performance Liquid Chromatography combined with electrospray tandem mass spectrometry, with efficient chromatographic separation of all the studied BA and their metabolites, has been optimised and validated, and analytical strategies for the identification of the BA and their minor metabolites have been developed. Taurine and glycine conjugates were identified by MS/MS by monitoring the specific ion transitions in multiple reaction monitoring (MRM) mode, while all other metabolites (sulphate, glucuronic acid, dehydroxylated, decarboxylated or oxo) were monitored in selected ion recording (SIR) mode with a negative ESI interface, following the corresponding ions. Accurate and precise data were obtained for the main physicochemical properties, including solubility, detergency, lipophilicity and albumin binding. These studies have shown that minor structural modifications greatly affect the pharmacokinetics and metabolism of the new analogues with respect to the natural BA and, in turn, their site of action, particularly where their receptors are located in the enterohepatic circulation.
Abstract:
This study was aimed at correlating the results of relative germination from in vitro tests with trifloxystrobin with the results of qPCR on a wide range of V. inaequalis populations and monoconidial isolates. Samples were collected at distinct locations in Italy and Turkey, from orchards under different scab management regimes. In this study, an allele-specific qPCR assay with purpose-designed primer sets was successfully developed to quantitatively determine the frequency of the QoI-resistance allele G143A in populations and monoconidial isolates of V. inaequalis. The qPCR results followed a pattern similar to that of the in vitro conidial germination test in very sensitive and very resistant populations; however, variability between the two tests was observed in heterogeneous populations. Accordingly, the correlation between the in vitro and qPCR results was positive but not very high for Venturia inaequalis populations (R2=0.70), whereas it was very high for monoconidial isolates (R2=0.92). The results obtained by quantitative PCR and by the traditional spore germination assay differed for the same fungal population, and in some cases it is difficult to assess resistance in the field by qPCR alone. It was concluded that it is not always possible to correlate the frequency of detection of the mutation with the biological assessment. In such situations, monitoring by molecular techniques must be supported by standard in vitro resistance assessments and by observation of field performance in order to reach correct conclusions.
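A minimal sketch of the kind of correlation reported above (hypothetical paired values, not the study's data), using ordinary least-squares regression to obtain R2 between the two assays:

```python
# Illustrative sketch: correlating the G143A allele frequency estimated by qPCR
# with the resistance level from the in vitro germination assay, summarised by R^2.
# All values below are made up for demonstration.
import numpy as np
from scipy import stats

qpcr_allele_freq = np.array([0.02, 0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 0.95])
in_vitro_resistance = np.array([0.05, 0.12, 0.30, 0.35, 0.60, 0.65, 0.90, 0.92])

slope, intercept, r_value, p_value, stderr = stats.linregress(
    qpcr_allele_freq, in_vitro_resistance
)
print(f"R^2 = {r_value**2:.2f}, p = {p_value:.3g}")   # high R^2 -> the assays agree
```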
Abstract:
Starting from a definition of some key points of the concept of cloud computing, this work concentrates on the performance issues of cloud environments and on the various offerings currently on the market, together with their limitations. After describing them in detail, the different offerings are compared with one another in order to highlight, for each of them, both its strengths and its critical points.
Abstract:
The thesis analyses Cloud Computing, its use and the various service models. Attention then turns to SLAs (Service Level Agreements), the contracts drawn up between provider and user so that the service is used optimally and securely, in compliance with the applicable rules. Finally, security, privacy and accountability in the Cloud are examined.
Abstract:
This thesis deals with heterogeneous architectures in standard workstations. Heterogeneous architectures represent an appealing alternative to traditional supercomputers because they are based on commodity components fabricated in large quantities; hence their price-performance ratio is unparalleled in the world of high performance computing (HPC). In particular, different aspects related to the performance and power consumption of heterogeneous architectures have been explored. The thesis initially focuses on an efficient implementation of a parallel application whose execution time is dominated by a high number of floating-point instructions. It then addresses the central problem of efficiently managing power peaks in heterogeneous computing systems. Finally, it discusses a memory-bound problem, where the execution time is dominated by memory latency. Specifically, the following main contributions have been carried out. First, a novel framework for the design and analysis of solar fields for Central Receiver Systems (CRS) has been developed; the implementation, based on a desktop workstation equipped with multiple Graphics Processing Units (GPUs), is motivated by the need for an accurate and fast simulation environment for studying mirror imperfections and non-planar geometries. Secondly, a power-aware scheduling algorithm for heterogeneous CPU-GPU architectures, based on an efficient distribution of the computing workload to the resources, has been realised. The scheduler manages the resources of several computing nodes with a view to reducing the peak power. This work makes two main contributions: the approach reduces the supply cost due to high peak power while having a negligible impact on the parallelism of the computational nodes, and the developed model allows designers to increase the number of cores without increasing the capacity of the power supply unit. Finally, an implementation for efficient graph exploration on reconfigurable architectures is presented, whose purpose is to accelerate graph exploration by reducing the number of random memory accesses.
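As a toy illustration of power-aware scheduling (a simplified sketch under assumed power figures, not the scheduler developed in the thesis), the code below greedily packs jobs into successive time slots so that the summed power draw of each slot stays below a fixed per-node cap:

```python
# Toy sketch of power-aware scheduling: keep the instantaneous power draw of a
# node below a cap by delaying some jobs. Job names and wattages are made up.
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_w: float      # estimated power draw while running

POWER_CAP_W = 400.0     # hypothetical per-node peak-power budget

def schedule(jobs, cap=POWER_CAP_W):
    """Pack jobs into successive time slots so that each slot stays under the cap."""
    slots, current, used = [], [], 0.0
    for job in sorted(jobs, key=lambda j: j.power_w, reverse=True):
        if used + job.power_w > cap and current:
            slots.append(current)       # close the slot and start a new one
            current, used = [], 0.0
        current.append(job)
        used += job.power_w
    if current:
        slots.append(current)
    return slots

if __name__ == "__main__":
    jobs = [Job("gpu-kernel-A", 250), Job("gpu-kernel-B", 220),
            Job("cpu-solver", 95), Job("io-stage", 40)]
    for i, slot in enumerate(schedule(jobs)):
        watts = sum(j.power_w for j in slot)
        print(f"slot {i}: {[j.name for j in slot]} ({watts:.0f} W)")
```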
Abstract:
The thesis introduces the concepts of Big Data and Cloud Computing and the main paradigms based on parallel computation, and then puts these concepts into practice through a Big Data case study. It describes the proposed architecture for producing reports in PDF format and concludes with an analysis of the results obtained.
Abstract:
The goal of this thesis is to explore the world of Cloud Computing, seeking to understand its main architectural characteristics and then examining the fundamental components that turn an ordinary IT infrastructure into a cloud infrastructure, namely Cloud Operating Systems.
Abstract:
This PhD thesis discusses the impact of Cloud Computing infrastructures on Digital Forensics in their twofold role as a target of investigations and as a helping hand to investigators. The Cloud offers cheap and almost limitless computing power and storage space for data, which can be leveraged to commit either new or old crimes and to host the related traces. Conversely, the Cloud can help forensic examiners find clues better and earlier than traditional analysis applications, thanks to its dramatically improved evidence-processing capabilities. In both cases, a new arsenal of software tools needs to be made available. The development of this novel weaponry and its technical and legal implications, from the point of view of the repeatability of technical assessments, are discussed throughout the following pages and constitute the unprecedented contribution of this work.