Abstract:
How is the notion of public interest operationalised in the regulatory practices of the International Public Sector Accounting Standards Board (IPSASB)? A fundamental objective in setting international accounting standards for both the private and the public sector is to serve the ‘public interest’. Who or what constitutes the ‘public interest’, however, remains a highly complex and controversial issue. Private sector financial reporting research posits that users (of financial information) serve as a proxy for the ‘public’, with users further narrowed to current and potential investors, a small proportion of the public. The debates surrounding the public interest are even more contentious in public sector financial reporting, which deals with ‘public’ (taxpayers’) money. In our study we use Bourdieu’s notion of semi-autonomous fields to show how autonomous and heteronomous pressures from the epistemic community of the accounting profession and political/government interests compete for the right to define the public interest and to determine how (by what accounting solutions) this interest is best served. This is a theoretical study grounded in the analysis of empirical data from interviews with the board members of the IPSASB. The main contribution of the paper is to further our understanding of how the main decision makers in the ‘inner regulatory circle’ perceive the problematic construct of the public interest. The main findings suggest a paternalistic and unreflexive attitude among the board members, leading to the conclusion that the public have no real voice in these matters.
Abstract:
Little information exists on the effects of ensiling on condensed tannins or proanthocyanidins. The acetone–butanol–HCl assay is suitable for measuring proanthocyanidin contents in a wide range of samples, silages included, but provides limited information on proanthocyanidin composition, which is of interest for deciphering the relationships between tannins and their bioactivities in terms of animal nutrition or health. Degradation with benzyl mercaptan (thiolysis) provides information on proanthocyanidin composition, but proanthocyanidins in several sainfoin silages have proved resistant to thiolysis. We now report that a pretreatment step with sodium hydroxide prior to thiolysis was needed to enable their analysis. This alkaline treatment increased their extractability from ensiled sainfoin and especially facilitated the release of larger proanthocyanidins. Ensiling reduced assayable proanthocyanidins by 29%, but the composition of the remaining proanthocyanidins in silage resembled that of the fresh plants.
Abstract:
The General Election for the 56th United Kingdom Parliament was held on 7 May 2015. Tweets related to UK politics, not only those with the specific hashtag “#GE2015”, were collected in the period between March 1 and May 31, 2015. The resulting dataset contains over 28 million tweets, for a total of 118 GB in uncompressed format or 15 GB in compressed format. This study describes the method used to collect the tweets, presents some analysis, including a political sentiment index, and outlines interesting research directions on Big Social Data based on Twitter microblogging.
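As a hedged illustration of what a political sentiment index over such a dataset could look like, the sketch below computes a simple lexicon-based daily index from (date, text) pairs. The lexicon, the record layout and the scoring rule are all assumptions for illustration, not the study's actual method, which is not detailed in the abstract.

```python
from collections import defaultdict

# Hypothetical tiny sentiment lexicon; a real index would use a full
# lexicon or a trained classifier rather than a handful of words.
POSITIVE = {"win", "hope", "support", "great"}
NEGATIVE = {"fail", "crisis", "lies", "cuts"}

def tweet_score(text):
    """Score one tweet as (#positive tokens - #negative tokens)."""
    tokens = text.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

def daily_sentiment_index(tweets):
    """Average tweet score per day; `tweets` is an iterable of
    (date, text) pairs -- this record layout is an assumption."""
    totals, counts = defaultdict(float), defaultdict(int)
    for date, text in tweets:
        totals[date] += tweet_score(text)
        counts[date] += 1
    return {d: totals[d] / counts[d] for d in totals}

if __name__ == "__main__":
    sample = [("2015-05-07", "great support for change"),
              ("2015-05-07", "cuts and crisis everywhere")]
    print(daily_sentiment_index(sample))  # {'2015-05-07': 0.0}
```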
Abstract:
Among all the paradigms in economic theory, the theoretical predictions of oligopoly were the first to be examined in the laboratory. In this chapter, instead of surveying all the experiments with few sellers, we adopt a narrower definition of the term “oligopoly”, and focus on the experiments that were directly inspired by the basic oligopolistic models of Cournot, Bertrand, Hotelling, Stackelberg, and some extensions. Most of the experiments we consider in this chapter have been run in the last three decades. This literature can be considered a new wave of experimental work aimed at representing basic oligopolistic markets and testing their properties. The chapter is divided into independent sections referring to different parts of oligopoly theory, including monopoly as well as a number of extensions of the basic models, chosen with the aim of providing a representative list of the relevant experimental findings.
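To make concrete the kind of theoretical prediction such experiments test, here is a worked Cournot duopoly example with linear inverse demand P = a - b(q1 + q2) and constant marginal cost c; the parameter values are illustrative assumptions, not taken from any experiment in the chapter.

```python
# Cournot duopoly: firm i maximises (a - b*(qi + qj) - c) * qi, so its
# best response is qi = (a - c - b*qj) / (2b). Iterating best responses
# converges to the Nash equilibrium q* = (a - c) / (3b) for each firm.
a, b, c = 100.0, 1.0, 10.0   # illustrative demand/cost parameters

def best_response(q_other):
    return max(0.0, (a - c - b * q_other) / (2.0 * b))

q1 = q2 = 0.0
for _ in range(100):                       # best-response dynamics
    q1, q2 = best_response(q2), best_response(q1)

price = a - b * (q1 + q2)
print(f"q1={q1:.2f}, q2={q2:.2f}, price={price:.2f}")   # 30, 30, 40
print(f"analytic q* = {(a - c) / (3 * b):.2f}")          # 30
```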
Abstract:
Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to allow the surviving processes to agree on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault-tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. They were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles needed to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and perfect synchronization in achieving global consensus.
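A minimal sketch of the general technique, not the paper's specific algorithms: push-style gossip disseminating a failure list until all surviving processes hold the same view. Failure detection itself (e.g. via missed heartbeats) is abstracted away, and rank 0 is arbitrarily seeded with the suspicions.

```python
import random

# Simulation of gossip-based failure-list dissemination: each surviving
# process holds a set of suspected-failed ranks and, once per cycle,
# pushes it to one random peer. "Consensus" here means all surviving
# processes hold identical failure lists.
N = 64
failed = {3, 17}
alive = [r for r in range(N) if r not in failed]
views = {r: set(failed) if r == alive[0] else set() for r in alive}

cycles = 0
while any(views[r] != failed for r in alive):
    cycles += 1
    for r in alive:
        peer = random.choice([p for p in alive if p != r])
        views[peer] |= views[r]        # push local suspicion list

print(f"global consensus after {cycles} gossip cycles")  # ~O(log N)
```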
An LDA and probability-based classifier for the diagnosis of Alzheimer's Disease from structural MRI
Abstract:
In this paper a custom classification algorithm based on linear discriminant analysis and probability-based weights is implemented and applied to hippocampus measurements from structural magnetic resonance images of healthy subjects and Alzheimer’s Disease sufferers, with the aim of diagnosing them as accurately as possible. The classifier works by classifying each measurement of a hippocampal volume as healthy-control-sized or Alzheimer’s-Disease-sized; these new features are then weighted and used to classify the subject as a healthy control or as suffering from Alzheimer’s Disease. The preliminary results reach an accuracy of 85.8%, similar to state-of-the-art methods such as a Naive Bayes classifier and a Support Vector Machine. An advantage of the method proposed in this paper over the aforementioned state-of-the-art classifiers is the descriptive ability of the classifications it produces. The descriptive model can be of great help to a doctor in the diagnosis of Alzheimer’s Disease, or even further the understanding of how Alzheimer’s Disease affects the hippocampus.
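The sketch below illustrates the general idea of per-measurement classification followed by weighted voting, using scikit-learn's LDA on synthetic volumes. The weighting scheme (per-feature training accuracy) and the data are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic hippocampal-volume-like data: controls slightly larger
# than AD subjects on each of 5 measurements (values are made up).
rng = np.random.default_rng(0)
n = 100
X = np.vstack([rng.normal(3.0, 0.4, size=(n, 5)),    # controls
               rng.normal(2.5, 0.4, size=(n, 5))])   # AD subjects
y = np.array([0] * n + [1] * n)                      # 0=control, 1=AD

# One single-feature LDA per measurement ("control-sized" vs
# "AD-sized"), weighted by its training accuracy (assumed scheme).
models, weights = [], []
for j in range(X.shape[1]):
    lda = LinearDiscriminantAnalysis().fit(X[:, [j]], y)
    models.append(lda)
    weights.append(lda.score(X[:, [j]], y))

def predict(x):
    """Weighted majority vote over per-measurement verdicts."""
    votes = np.array([m.predict(x[[j]].reshape(1, 1))[0]
                      for j, m in enumerate(models)])
    w = np.array(weights)
    return int(votes @ w > w.sum() / 2)

print(predict(X[0]), y[0])   # per-feature votes are inspectable
```

Unlike a single black-box decision, the per-measurement votes can be read off directly, which is the descriptive property the abstract emphasises.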
Abstract:
Epidemic protocols are a bio-inspired communication and computation paradigm for extreme-scale network systems based on randomized communication. The protocols rely on a membership service to build decentralized and random overlay topologies. In a weakly connected overlay topology, a naive membership protocol mechanism can break the connectivity, thus impairing the accuracy of the application. This work investigates the factors in membership protocols that cause the loss of global connectivity and introduces the first topology connectivity recovery mechanism. The mechanism is integrated into the Expander Membership Protocol, which is then evaluated against other membership protocols. The analysis shows that the proposed connectivity recovery mechanism is effective in preserving topology connectivity and also helps to improve application performance in terms of convergence speed.
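As a rough sketch of what a connectivity-recovery step involves, assuming the overlay is a directed view graph (node -> set of peers): detect weakly connected components and reattach stranded ones to the largest. This mirrors the general idea only; it is not the Expander Membership Protocol's actual mechanism.

```python
import random
from collections import deque

def components(views):
    """Weakly connected components of a directed view graph whose
    keys are all node ids (every peer also appears as a key)."""
    undirected = {n: set() for n in views}
    for n, peers in views.items():
        for p in peers:
            undirected[n].add(p)
            undirected[p].add(n)
    seen, comps = set(), []
    for start in views:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        while queue:
            u = queue.popleft()
            if u in comp:
                continue
            comp.add(u)
            queue.extend(undirected[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def recover(views):
    """Link every minor component back to the largest one."""
    comps = sorted(components(views), key=len, reverse=True)
    for stranded in comps[1:]:
        node = next(iter(stranded))
        views[node].add(random.choice(sorted(comps[0])))

views = {0: {1}, 1: {0}, 2: {3}, 3: {2}}   # two disconnected islands
recover(views)
print(len(components(views)))               # -> 1
```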
Abstract:
This work investigates the problem of feature selection among neuroimaging features from structural MRI brain images for the classification of subjects as healthy controls, as suffering from Mild Cognitive Impairment, or as suffering from Alzheimer’s Disease. A Genetic Algorithm wrapper method for feature selection is adopted in conjunction with a Support Vector Machine classifier. For very large feature sets, feature selection is found to be redundant, as the accuracy is often worse than that of a Support Vector Machine with no feature selection. However, when just the hippocampal subfields are used, feature selection yields a significant improvement in classification accuracy. Three-class Support Vector Machines are also compared with two-class Support Vector Machines combined with weighted voting, and the former are found more useful. The highest accuracy achieved on the test data was 65.5%, using a genetic algorithm for feature selection with a three-class Support Vector Machine classifier.
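A minimal sketch of the GA-wrapper idea, assuming a mutation-only GA with truncation selection and SVM cross-validation accuracy as fitness. The population size, mutation rate, generation count and scikit-learn synthetic data are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=5, random_state=0)

def fitness(mask):
    """Wrapper fitness: SVM cross-validation accuracy on the
    features selected by the binary mask."""
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(), X[:, mask], y, cv=3).mean()

pop = rng.integers(0, 2, size=(10, X.shape[1])).astype(bool)
for _ in range(15):                          # generations
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-5:]]   # truncation selection
    children = parents[rng.integers(0, 5, size=10)].copy()
    children ^= rng.random(children.shape) < 0.05   # bit-flip mutation
    pop = children

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best), "acc:", fitness(best))
```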
Abstract:
Medicanes or “Mediterranean hurricanes” represent a rare and physically unique type of Mediterranean mesoscale cyclone. There are similarities with tropical cyclones with regard to their development (based on the thermodynamical disequilibrium between the warm sea and the overlying troposphere) and their kinematic and thermodynamical properties (medicanes are intense vortices with a warm core and even a cloud-free eye). Although medicanes are smaller and their wind speeds are lower than in tropical cyclones, the severity of their winds can cause substantial damage to islands and coastal areas. Concern about how human-induced climate change will affect extreme events is increasing. This includes the future impacts on medicanes due to the warming of the Mediterranean waters and the projected changes in regional atmospheric circulation. However, most global climate models do not have high enough spatial resolution to adequately represent small features such as medicanes. In this study, a cyclone tracking algorithm is applied to high resolution global climate model data with a horizontal grid resolution of approximately 25 km over the Mediterranean region. After a validation of the climatology of general Mediterranean mesoscale cyclones, changes in medicanes are determined using climate model experiments with present and future forcing. The magnitude of the changes in the winds, frequency and location of medicanes is assessed. While no significant changes in the total number of Mediterranean mesoscale cyclones are found, medicanes tend to decrease in number but increase in intensity. The model simulation suggests that medicanes tend to form more frequently in the Gulf of Lion–Genoa and South of Sicily.
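For illustration, the detection step common to most cyclone-tracking schemes can be sketched as a search for sufficiently deep local minima in a gridded sea-level-pressure field. The window size, depth threshold and synthetic field below are assumptions; the paper's tracker is considerably more elaborate and also links detections across time steps into trajectories.

```python
import numpy as np

def detect_cyclone_centres(slp, depth_hpa=2.0, half_window=2):
    """Flag grid points that are the minimum of their (2w+1)x(2w+1)
    neighbourhood and lie at least depth_hpa below its mean."""
    centres = []
    n_lat, n_lon = slp.shape
    for i in range(half_window, n_lat - half_window):
        for j in range(half_window, n_lon - half_window):
            window = slp[i - half_window:i + half_window + 1,
                         j - half_window:j + half_window + 1]
            if slp[i, j] == window.min() and \
               window.mean() - slp[i, j] >= depth_hpa:
                centres.append((i, j))
    return centres

# Synthetic field: uniform 1015 hPa with one 10 hPa-deep low at (20, 30).
slp = np.full((50, 60), 1015.0)
ii, jj = np.mgrid[0:50, 0:60]
slp -= 10.0 * np.exp(-((ii - 20) ** 2 + (jj - 30) ** 2) / 8.0)
print(detect_cyclone_centres(slp))   # -> [(20, 30)]
```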
Abstract:
This paper aims to make two contributions to the literature on international environmental agreements. First, we model environmental agreements as a generic situation, characterized as a Hawk-Dove game with multiple asymmetric equilibria. Second, the article applies the theory of non-cooperative games with confirmed proposals, based on an alternating-proposals bargaining protocol, as a way of overcoming the usual coordination and bargaining failures that arise in environmental agreement games due to payoff asymmetry and equilibrium multiplicity.
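To make the game structure concrete, the sketch below enumerates the pure-strategy Nash equilibria of an illustrative Hawk-Dove payoff matrix. The payoff values are assumptions chosen only to reproduce the structure described above, namely two asymmetric equilibria.

```python
import itertools

# Illustrative Hawk-Dove payoffs: V is the value of getting one's way,
# C > V the cost of mutual conflict, so (Hawk, Hawk) is the worst
# outcome and the two pure equilibria are (Hawk, Dove) and (Dove, Hawk).
V, C = 4, 6
payoffs = {                 # (row action, col action) -> (u_row, u_col)
    ("H", "H"): ((V - C) / 2, (V - C) / 2),   # (-1, -1)
    ("H", "D"): (V, 0),
    ("D", "H"): (0, V),
    ("D", "D"): (V / 2, V / 2),
}

def is_nash(a_row, a_col):
    u_row, u_col = payoffs[(a_row, a_col)]
    best_row = max(payoffs[(d, a_col)][0] for d in "HD")
    best_col = max(payoffs[(a_row, d)][1] for d in "HD")
    return u_row >= best_row and u_col >= best_col

for profile in itertools.product("HD", repeat=2):
    if is_nash(*profile):
        print(profile, payoffs[profile])   # ('H','D') and ('D','H')
```

The multiplicity is exactly the coordination problem the confirmed-proposals bargaining protocol is meant to resolve: both players prefer either equilibrium to (Hawk, Hawk), but disagree about which one.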
Abstract:
This paper presents an integrative and spatially explicit modeling approach for analyzing human and environmental exposure from pesticide application by smallholders in the potato-producing Andean region of Colombia. The modeling approach fulfills the following criteria: (i) it includes environmental and human compartments; (ii) it contains a behavioral decision-making model for estimating the effect of policies on pesticide flows to humans and the environment; (iii) it is spatially explicit; and (iv) it is modular and easily expandable to include additional modules, crops or technologies. The model was calibrated and validated for the Vereda La Hoya and was used to explore the effect of different policy measures in the region. The model has moderate data requirements and can be adapted relatively easily to other regions in developing countries with similar conditions.
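A minimal sketch of the modular structure criteria (i)-(iii) describe, assuming a toy behavioural spraying rule and fixed fate fractions. Every coefficient, the grid, and the decision rule are illustrative assumptions, not the model's calibrated values.

```python
import numpy as np

rng = np.random.default_rng(1)
GRID = (10, 10)                            # smallholder plots
pest_pressure = rng.random(GRID)           # perceived pest pressure

def spray_decision(pressure, tax=0.0):
    """Toy behavioural module: applied dose rises with pest pressure
    and falls with a policy tax on pesticides (units assumed)."""
    return np.clip(2.0 * pressure - 1.5 * tax, 0.0, None)

def partition(dose):
    """Toy fate module: fixed fractions of the dose reach each
    environmental/human compartment (made-up coefficients)."""
    return {"soil": 0.6 * dose, "water": 0.3 * dose, "human": 0.1 * dose}

# Policy exploration: compare compartment totals with and without a tax.
for tax in (0.0, 0.5):
    flows = partition(spray_decision(pest_pressure, tax))
    totals = {k: v.sum() for k, v in flows.items()}
    print(f"tax={tax}: " + ", ".join(f"{k}={v:.1f}" for k, v in totals.items()))
```

Because each module is a plain function over the grid, swapping in another crop, technology or compartment is a local change, which is the expandability criterion (iv) points at.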
Abstract:
The Transition Network exemplifies the potential of social movements to create spaces of possibility for alternatives to emerge in the interstices of mainstream, neoliberal economies. Yet, little work has been carried out so far on the Transition Network or other grassroots innovations for sustainability in a way that reveals their actual patterns of diffusion. This graphic of the diffusion of the Transition Network visualises its spatial structure and compares diffusion patterns across Italy, France, Great Britain and Germany. The graphics show that the number of transition initiatives in the four countries has steadily increased over the past eight years, but the rate of increase has slowed down in all countries. The maps clearly show that in all four countries the diffusion of the Transition Network has not been spatially even. The graphic suggests that in each country transition initiatives are more likely to emerge in some geographical areas (hotspots) than in others (cold spots). While the existence of a spatial structure of the Transition Network may result from the combination of place-specific factors and diffusion mechanisms, these graphics illustrate the importance of better comprehending where grassroots innovations emerge.
Abstract:
In order to gain insights into events and issues that may cause errors and outages in parts of IP networks, intelligent methods that capture and express causal relationships online (in real time) are needed. Whereas generalised rule induction has been explored for non-streaming data applications, its application and adaptation to streaming data are mostly undeveloped or based on periodic and ad-hoc training with batch algorithms. Some association rule mining approaches for streaming data do exist; however, they can only express binary causal relationships. This paper presents ongoing work on Online Generalised Rule Induction (OGRI), aiming to create expressive and adaptive rule sets in real time that can be applied to a broad range of applications, including network telemetry data streams.
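As a hedged sketch of the kind of incremental bookkeeping any online rule inducer needs (not the authors' OGRI algorithm itself): maintain running counts over the event stream and emit rules whose support and confidence cross thresholds. The thresholds, record fields and event names below are assumptions.

```python
from collections import Counter

class OnlineRules:
    """Incrementally maintained 'antecedent => consequent' rules."""
    def __init__(self, min_support=3, min_conf=0.8):
        self.ante = Counter()     # antecedent occurrence counts
        self.pair = Counter()     # (antecedent, consequent) counts
        self.min_support, self.min_conf = min_support, min_conf

    def update(self, antecedent, consequent):
        """One O(1) update per stream record."""
        self.ante[antecedent] += 1
        self.pair[(antecedent, consequent)] += 1

    def rules(self):
        for (a, c), n_ac in self.pair.items():
            conf = n_ac / self.ante[a]
            if n_ac >= self.min_support and conf >= self.min_conf:
                yield a, c, n_ac, conf

# Hypothetical network telemetry stream of (cause, effect) events.
stream = [("link_flap", "bgp_reset")] * 4 + [("link_flap", "none")]
ogri = OnlineRules()
for ante, cons in stream:
    ogri.update(ante, cons)
for a, c, s, conf in ogri.rules():
    print(f"{a} => {c}  (support={s}, confidence={conf:.2f})")
```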
Abstract:
In this study, PCR was used to detect periodontopathic pathogens in atheromatous plaques removed from the coronary arteries of patients with chronic periodontitis and of periodontally healthy subjects. Our results indicate a significant association between the presence of Porphyromonas gingivalis and atheromas, suggesting that periodontal bacteria in the oral biofilm may find a way to reach the arteries.