6 results for System Accounting Standards

in AMS Tesi di Laurea - Alm@DL - Università di Bologna


Relevance:

80.00%

Abstract:

The aim of this dissertation is to provide an English translation of the Notes to the Consolidated Financial Statements of MNLG S.r.l., the holding company of the Italian Sorma Group. The translation is one example of the technical material produced within the Language Toolkit project, set up by the Chamber of Commerce of Forlì-Cesena to support the internationalization of the companies established in the territory. This initiative represented a unique opportunity for me to put into practice the knowledge and skills acquired in the field of translation during my years at university. It also gave my dissertation a concrete purpose: providing a technical document translated into a foreign language. By making its Consolidated Financial Statements readily available in English, MNLG S.r.l. can in fact reach a larger number of potential investors and guarantee more transparent financial information to its shareholders. The work is divided into six chapters: the first describes the project, its main objectives and the way it was developed. The second chapter introduces the notion of Consolidated Financial Statements and presents the accounting documents of which they are composed, as well as the norms according to which they are prepared. The third chapter focuses on the translation procedure, and in particular on the documentation process, analysing the differences between the International Accounting Standards and the accounting standards used in Italy. The fourth chapter describes the translation resources built for this specific document. The fifth chapter contains the English version of the Notes to the Consolidated Financial Statements and, to conclude, the sixth chapter analyses the difficulties encountered in translating and the strategies adopted to overcome them.

Relevance:

30.00%

Abstract:

Our generation of computational scientists is living in an exciting time: not only do we get to pioneer important algorithms and computations, we also get to set standards for how computational research should be conducted and published. From Euclid's reasoning and Galileo's experiments, it took hundreds of years for the theoretical and experimental branches of science to develop standards for publication and peer review. Computational science, rightly regarded as the third branch, can walk the same road much faster. The success and credibility of science are anchored in the willingness of scientists to expose their ideas and results to independent testing and replication by other scientists, which requires the complete and open exchange of data, procedures and materials. Applied to computations, the idea of "replication by other scientists" is more commonly known as "reproducible research". In this context, the journal "EAI Endorsed Transactions on Performance & Modeling, Simulation, Experimentation and Complex Systems" had the exciting and original idea of letting scientists submit, together with the article, the computational materials (software, data, etc.) used to produce its contents. The goal of this procedure is to allow the scientific community to verify the content of the paper by reproducing it on the platform, independently of the chosen OS, to confirm or invalidate it, and above all to allow its reuse to produce new results. This procedure is of little help, however, without minimal methodological support: raw data sets and software are difficult to exploit without the logic that guided their use or their production. This led us to conclude that, in addition to the data sets and the software, one further element must be provided: the workflow that ties them all together.
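To give a rough sense of that last point, a submitted workflow can be as simple as an explicit, machine-readable list of the steps connecting data to results. The sketch below is purely illustrative: the step names, commands and file paths are invented for the example, not taken from the journal's platform.

// Minimal sketch of a declarative workflow manifest: each step names the
// inputs it reads, the outputs it writes, and the command that produces them.
interface Step {
  name: string;
  inputs: string[];   // files the step reads
  outputs: string[];  // files the step writes
  command: string;    // how to run it (illustrative only)
}

// Hypothetical three-step pipeline: raw data -> cleaned data -> figures.
const workflow: Step[] = [
  { name: "clean",   inputs: ["raw/measurements.csv"], outputs: ["data/clean.csv"],  command: "python clean.py" },
  { name: "analyze", inputs: ["data/clean.csv"],       outputs: ["data/stats.json"], command: "python analyze.py" },
  { name: "plot",    inputs: ["data/stats.json"],      outputs: ["fig/results.png"], command: "python plot.py" },
];

// Even before anything is executed, the manifest records the logic that
// connects the raw data and the software to the published results.
for (const step of workflow) {
  console.log(`${step.name}: ${step.inputs.join(", ")} -> ${step.outputs.join(", ")}`);
}

A manifest like this is precisely the "additional element" the abstract argues for: it lets a reviewer re-run, inspect or reuse the chain of computations without reverse-engineering it from the raw files.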

Relevance:

30.00%

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and Cyber-Physical Systems (CPS) are nowadays a category of particular interest: these are systems in which a cyber part (computer-based algorithms) monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: many models written in different languages need to exchange information with one another. Normally an orchestrator takes care of simulating the models and exchanging information among them. This orchestrator is developed manually, which is tedious and time-consuming work. Our proposal is to generate the orchestrator automatically through Co-Modeling, i.e. by modeling the coordination itself. Before reaching this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied a technology employed for co-simulation in industry: FMI. To better understand the FMI standard, I implemented an automatic export, in the FMI format, of the models produced in an existing tool for discrete modeling, TimeSquare. I also developed a simple physical model in the existing open-source OpenModelica tool. Finally, I started to understand how an orchestrator works by developing a simple one, which will be useful in the future for generating orchestrators automatically.
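For intuition, the core of such an orchestrator (the co-simulation "master algorithm" in its simplest fixed-step form) can be sketched as a loop that advances every model by the same time step and then exchanges the connected outputs and inputs. The FMU interface below is a simplification invented for illustration; it only mirrors the spirit of FMI co-simulation calls such as doStep, not their exact C-level signatures.

// Simplified stand-in for an FMI co-simulation unit (illustrative only;
// the real FMI API is a C interface with functions such as fmi2DoStep).
interface FMU {
  name: string;
  doStep(currentTime: number, stepSize: number): void; // advance internal state by stepSize
  getOutput(port: string): number;
  setInput(port: string, value: number): void;
}

// A connection wires one FMU's output port to another FMU's input port.
interface Connection {
  from: { fmu: FMU; port: string };
  to: { fmu: FMU; port: string };
}

// Fixed-step master algorithm: advance all models, then propagate signals.
function orchestrate(fmus: FMU[], connections: Connection[],
                     stopTime: number, h: number): void {
  for (let t = 0; t < stopTime; t += h) {
    for (const fmu of fmus) {
      fmu.doStep(t, h);                       // each model advances by h
    }
    for (const c of connections) {            // exchange data between models
      c.to.fmu.setInput(c.to.port, c.from.fmu.getOutput(c.from.port));
    }
  }
}

A production master also negotiates step sizes, handles rejected steps and rollback, and resolves algebraic loops between models; automating the generation of exactly this coordination logic is what the Co-Modeling proposal aims at.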

Relevance:

30.00%

Abstract:

This thesis studies the state of the art of phasor measurement units (PMUs) as well as the metrological requirements stated in the IEEE C37.118.1 and C37.118.2 standards for guaranteeing correct measurement performance. Communication systems among PMUs and their possible applicability to power quality (PQ) assessment are also investigated. This preliminary study is followed by an analysis of the working principle of real-time (RT) simulators and of the importance of hardware-in-the-loop (HIL) implementation, examining possible case studies specific to PMUs, among which compliance tests are one of the most important. The core of the thesis is the implementation of a PMU model in the IEEE 5-bus network in Simulink and the validation of the results using the OPAL-RT 4510 real-time simulator. An initial check gives a first indication of the quality of the Simulink results, comparing the PMU data with the steady-state load-flow information; in this part, accuracy indices are also calculated for both voltage and current synchrophasors. The next part consists in running the same code on the OPAL-RT 4510 simulator, after which an initial qualitative analysis is carried out to assess the outcomes. Finally, the results are confirmed by comparing, with a MATLAB script, the voltage and current synchrophasors and accuracy indices obtained from the Simulink models and from the OPAL-RT system. This work also proposes suggestions for a future deployment of PMUs in a more complex system such as a Digital Twin (DT), in order to improve the performance of the existing protection devices of the distribution system operator (DSO) and enhance the reliability of future power systems.
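For reference, the accuracy index at the heart of the IEEE C37.118.1 compliance tests is the total vector error (TVE), which compares the estimated synchrophasor with the reference one in the complex plane:

\mathrm{TVE}(n) = \sqrt{\frac{\left(\hat{X}_r(n) - X_r(n)\right)^2 + \left(\hat{X}_i(n) - X_i(n)\right)^2}{X_r(n)^2 + X_i(n)^2}}

where \hat{X}_r(n) and \hat{X}_i(n) are the real and imaginary parts of the estimated synchrophasor at instant n, and X_r(n), X_i(n) those of the reference value; under steady-state conditions the standard requires a TVE below 1% for a compliant PMU.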

Relevance:

30.00%

Abstract:

The thesis introduces basic MIMO and Massive MIMO systems and their possible benefits. It then goes through the implementation options available, according to 3GPP standards, for 5G systems, and how the transition is made from a non-standalone 5G RAN to a completely standalone 5G RAN. Having introduced these subjects and provided some definitions of telecommunications principles, we move on to a more technical analysis of capacity, throughput, power consumption and costs, comparing all these parameters between a Massive MIMO system and a conventional MIMO system. In the analysis of power consumption and costs, we also introduce the concept of virtualization and its benefits in terms of both power and cost. Finally, we try to justify a trade-off between a more reliable system with high capacity and throughput and keeping costs as low as possible.
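As background for such a capacity comparison, a useful reference point (textbook material, not a result of the thesis) is the capacity of a point-to-point MIMO link with N_t transmit and N_r receive antennas under equal power allocation:

C = \log_2 \det\left( I_{N_r} + \frac{\rho}{N_t}\, H H^{H} \right) \quad \text{bits/s/Hz}

where H is the N_r \times N_t channel matrix and \rho the signal-to-noise ratio. Since, for a well-conditioned channel, capacity grows roughly linearly with \min(N_t, N_r), scaling up the antenna count is the basic motivation behind Massive MIMO.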

Relevance:

30.00%

Abstract:

Over the years, the contents and services offered by the Web have undergone a constant and continuous evolution, driven by the growing availability of devices able to browse it. Today users' requirements and expectations towards Web applications keep increasing: they want ever faster access to contents; interfaces that are simple, easy to use and responsive; and contents accessible from a wide range of increasingly capable devices. Companies must be ready to meet these needs and to offer end users the best possible experience, staying up to date with the technologies for building Web applications. This is even more important for a company with several products, developed by different teams using different technologies. For some companies it is important that their products, although dealing with different subjects, present interfaces that evoke the brand, not only through the name or the logo, but above all through the components used to build the interfaces. As a result, each team has to design and develop the components in its own technology so that they have the same functionality, the same style and the same behaviour in every situation. Most of the time this is hard to achieve and also expensive to maintain. Centralizing the development of these elements in a single place helps the company keep maintenance costs low and makes the user experience homogeneous across products. The goal of this work is to illustrate the potential and the usefulness of introducing a suite of custom components, following the Web Components standard, into the products offered by a large company. The analysis focuses on the experience of those who use such components in their own projects to build the user interface presented to end users.
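To make the idea concrete, the Web Components standard lets a team define a reusable, framework-independent element once and then use it as a plain HTML tag in every product. The sketch below is a generic illustration (the element name and styling are invented, not taken from the thesis):

// A minimal custom element following the Web Components standard:
// it encapsulates its markup and style in a shadow root, so the same
// branded button looks and behaves identically in every product.
class BrandButton extends HTMLElement {
  constructor() {
    super();
    const shadow = this.attachShadow({ mode: "open" });
    shadow.innerHTML = `
      <style>
        button { padding: 0.5em 1.2em; border: none; border-radius: 4px;
                 background: #004080; color: white; cursor: pointer; }
      </style>
      <button><slot>Click me</slot></button>
    `;
  }
}

// Register the element once; afterwards any page or framework can use
// <brand-button>Save</brand-button> without re-implementing it.
customElements.define("brand-button", BrandButton);

Because custom elements are part of the browser platform, a component suite built this way can be shared across teams regardless of whether each product uses React, Angular or plain HTML, which is exactly the centralization benefit the abstract describes.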