23 results for Strict Convexity


Relevance:

10.00%

Publisher:

Abstract:

Analysts, politicians and international players from all over the world look at China as one of the most powerful countries on the international scene, and as a country whose economic development can significantly impact the economies of the rest of the world. However, many aspects of this country still have to be investigated, first among them the fundamental role that Chinese rural areas continue to play in the general development of the country from a political, economic and social point of view. In particular, the way in which the rural areas have influenced the social stability of the whole country has been widely discussed, owing to their strict relationship with the urban areas to which most people from the countryside emigrate in search of a job and a better life. In recent years many studies have focused mostly on the urbanization phenomenon, with little interest in the living conditions in rural areas and in the deep changes which have occurred in some, mainly agricultural, provinces. An analysis of the level of infrastructure is one of the main aspects highlighting the principal differences in living conditions between rural and urban areas. In this thesis, I first carried out the analysis through a multivariate statistics approach (Principal Component Analysis and Cluster Analysis) in order to define a new map of rural areas based on living conditions. In the second part I elaborated an index (the Living Conditions Index) through a Fuzzy Expert/Inference System. Finally, I compared this index (LCI) to the results obtained from the cluster analysis, drawing geographic maps. The data source is the second national agricultural census of China, carried out in 2006. In particular, I analysed data referring to villages but aggregated at province level.
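The two-stage analysis described above (dimensionality reduction followed by clustering of province-level records) can be sketched as follows; the data are synthetic, and the matrix sizes, component count, and cluster count are illustrative assumptions, not the census variables or the thesis's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical province-level living-condition indicators:
# rows = 31 provinces, columns = 6 indicators (synthetic data).
X = rng.normal(size=(31, 6))
X[:15] += 2.0  # create two loose groups so clustering has structure to find

# --- Principal Component Analysis via SVD ---
Xc = X - X.mean(axis=0)                 # centre each indicator
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)         # variance ratio per component
scores = Xc @ Vt[:2].T                  # project onto the first two PCs

# --- simple k-means cluster analysis on the PC scores ---
def kmeans(data, k=2, iters=50):
    centres = data[:k].copy()
    for _ in range(iters):
        labels = np.argmin(((data[:, None] - centres) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = data[labels == j].mean(axis=0)
    return labels

labels = kmeans(scores, k=2)
```

In the thesis the cluster map is then compared against the fuzzy Living Conditions Index; here the point is only the PCA-then-cluster pipeline itself.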

Relevance:

10.00%

Publisher:

Abstract:

The thesis offers an in-depth examination of the institution of implied res judicata (giudicato implicito), both procedural and on the merits, from the twofold perspective of the recent case law of the Court of Cassation, on the one hand, and the critical remarks of legal scholarship, on the other. The candidate first dwells on the rationale of the recent judgments of the joint divisions (sezioni unite) of the Cassazione, which promote a restrictive and residual interpretation of art. 37 c.p.c. in light of the constitutional principle of the reasonable duration of proceedings. He then addresses the relationship between implied res judicata on jurisdiction and constitutional procedural principles, devoting ample space to scholars' critical reflections on the theory of implied res judicata. The candidate then examines implied res judicata on preliminary questions of merit, after discussing the logical-legal order of questions and the structure of the decision. In the course of this analysis, he finds in the principle of the "most readily resolved issue" (ragione più liquida) the negation of the very idea of implied res judicata, developing critical reflections on implied res judicata on the merits. At this point, the investigation focuses on a specific aspect of the institution, particularly important for its practical implications: the consequences of implied res judicata on appeal. The final part of the thesis is devoted to the critical aspects of the recent orientation of the joint divisions, with particular regard to the burden of cross-appeal placed on the party who prevailed on the merits. The thesis highlights how the structure of these institutions is to some extent bent to docket-reduction needs that should instead be pursued with other, more coherent instruments.

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middleware platforms for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, which reduce the data sent by network nodes in order to prolong the network lifetime as much as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce the power consumption. The analysis then moves from compression implemented on single nodes to CS for signal ensembles, exploiting the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered by real deployments, and the best trade-off between reconstruction quality and power consumption is investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group sparsity CS (GS-CS) is compared to another well-known technique for the reconstruction of signals from a highly sub-sampled version. These two frameworks are again compared against a real data set, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
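As a minimal illustration of the single-node CS step, the sketch below compresses a synthetic sparse signal with a random Gaussian matrix and reconstructs it with Orthogonal Matching Pursuit; the dimensions, sparsity level, and the choice of OMP (rather than the thesis's mixed algorithm or the DCS/KCS ensemble schemes) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

n, m, k = 128, 50, 4          # signal length, measurements, sparsity (illustrative)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)  # k-sparse signal

Phi = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x                                  # the m << n values a node would transmit

def omp(Phi, y, k):
    """Orthogonal Matching Pursuit: greedy sparse recovery from y = Phi @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))     # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef            # project y off the support
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

The energy saving comes from transmitting the m measurements instead of the n samples; reconstruction runs at the sink, where power is not constrained.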

Relevance:

10.00%

Publisher:

Abstract:

The wide diffusion of cheap, small, and portable sensors integrated in an extraordinarily large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly analyzed in a timely fashion, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
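The core idea of trading strict guarantees for cost, replicating operators only for flows whose QoS class demands it, can be sketched as a toy resource-allocation step; the class names, replica counts, and budget below are hypothetical, and this is not the actual Quasit/LAAR logic.

```python
# Toy sketch of QoS-differentiated replication: flows marked "strict"
# always get a backup replica (no tuple loss on a single operator failure),
# while "relaxed" flows accept possible loss to stay within the budget.
# Flow names, replica counts, and the budget are illustrative assumptions.

def plan_replicas(qos, budget):
    """Assign operator replica counts per flow under a resource budget."""
    plan = {}
    for flow, cls in qos.items():            # serve strict flows first
        if cls == "strict":
            plan[flow] = 2                   # primary + backup
            budget -= 2
    for flow, cls in qos.items():            # relaxed flows share the rest
        if cls == "relaxed":
            plan[flow] = 1 if budget > 0 else 0   # shed load when out of budget
            budget -= 1
    return plan

qos = {"health": "strict", "energy": "relaxed", "entertainment": "relaxed"}
plan = plan_replicas(qos, budget=3)
```

With a budget of 3 replica slots, the strict health flow keeps its backup while one relaxed flow runs unreplicated and the other is shed, which is the scalability-versus-guarantee trade-off the thesis studies.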

Relevance:

10.00%

Publisher:

Abstract:

In the era of the Internet of Everything, a user with a handheld or wearable device equipped with sensing capability has become a producer as well as a consumer of information and services. The more powerful these devices get, the more likely it is that they will generate and share content locally, leading to the presence of distributed information sources and a diminishing role for centralized servers. In current practice, we rely on infrastructure acting as an intermediary, providing access to the data. However, infrastructure-based connectivity might not always be available, or might not be the best alternative. Moreover, it is often the case that the data, and the processes acting upon them, are of local scope. Queries about a nearby object, an information source, a process, an experience, an ability, etc. could be answered locally, without reliance on infrastructure-based platforms. The data might have limited temporal validity and/or be bound to a geographical area and to the social context in which the user is immersed. In this envisioned scenario, users could interact locally without the need for a central authority; hence the claim of an infrastructure-less, provider-less platform. The data are owned by the users and consulted locally, as opposed to the current approach of making them available globally and keeping them forever. From a technical viewpoint, this network resembles a Delay/Disruption Tolerant Network, where consumers and producers may be spatially and temporally decoupled, exchanging information with each other in an ad-hoc fashion. To this end, we propose some novel data gathering and dissemination strategies for use in urban-wide environments which do not rely on strict infrastructure mediation. While preserving the general aspects of our study and without loss of generality, we focus our attention on practical application scenarios which help us capture the characteristics of opportunistic communication networks.
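The store-carry-forward behaviour of such a Delay/Disruption Tolerant Network can be illustrated with a minimal epidemic-dissemination simulation; the node count, contact rate, and purely random encounters are assumptions of this toy model, not the dissemination strategies proposed in the thesis.

```python
import random

random.seed(7)

def simulate(n_nodes=30, steps=200, contacts_per_step=5):
    """Epidemic dissemination: on each opportunistic contact, a node
    carrying the message hands a copy to the node it meets."""
    carrying = {0}                            # node 0 generates the message
    coverage = []
    for _ in range(steps):
        for _ in range(contacts_per_step):
            a, b = random.sample(range(n_nodes), 2)   # random pairwise encounter
            if a in carrying or b in carrying:
                carrying |= {a, b}            # store-carry-forward exchange
        coverage.append(len(carrying) / n_nodes)
    return coverage

cov = simulate()
```

Even with producer and consumers never online at the same time, coverage grows monotonically through intermediate carriers, which is exactly the spatial and temporal decoupling described above.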

Relevance:

10.00%

Publisher:

Abstract:

Market manipulation is an illegal practice that enables a person to profit from practices that artificially raise or lower the prices of an instrument in the financial markets. Its prohibition in the EU is based on the 2003 Market Abuse Directive. The current market manipulation regime was broadly considered a success, apart from enforcement and supervisory inconsistencies among Member States in the initial phase. A review of the market manipulation regime began at the end of 2007 and was quickly incorporated into the wider EU crisis-era reform programme. A number of weaknesses of the current regime have been identified: regulatory gaps caused by the development of trading venues and financial products, regulatory gaps concerning cross-border and cross-market manipulation (particularly in commodity markets), legal uncertainty resulting from divergent implementation, and inefficient supervision and enforcement. On 12 June 2014, a new market abuse regulatory package, comprising the Market Abuse Regulation and the Directive on criminal sanctions for market abuse, was adopted, and several changes will be made to the EU market manipulation regime. A wider scope for the regime and a new prohibition of attempted market manipulation will ensure the prevention of market manipulation at large. Accepted market practices (AMPs) will be subject to strict scrutiny by ESMA to reduce divergences in implementation. In order to enhance the efficiency of supervision and enforcement, the powers of national competent authorities will be strengthened, ESMA is given more power to settle disagreements between national regulators, and the administrative and criminal sanctioning regimes are both further harmonized. In addition, the protection of fundamental rights is stressed by the new market manipulation regime, and measures are provided to guarantee its realization. Finally, the successful EU market manipulation regime could serve as a significant reference for China, helping China to refine its immature regime.

Relevance:

10.00%

Publisher:

Abstract:

This work presents results from experimental investigations of several different atmospheric-pressure plasma applications, such as Metal Inert Gas (MIG) welding, Plasma Arc Cutting (PAC) and Plasma Arc Welding (PAW) sources, as well as Inductively Coupled Plasma (ICP) torches. The main diagnostic tool used is High Speed Imaging (HSI), often assisted by Schlieren imaging to analyse non-visible phenomena. Furthermore, starting from thermo-fluid-dynamic models developed by the University of Bologna group, such plasma processes have also been studied with new advanced models, focusing for instance on the interaction between a melting metal wire and a plasma, or considering non-equilibrium phenomena for the diagnostics of plasma arcs. Additionally, the experimental diagnostic tools developed for industrial thermal plasmas have also been used for the characterization of innovative low-temperature atmospheric-pressure non-equilibrium plasmas, such as dielectric barrier discharges (DBD) and plasma jets. These sources are driven by voltage pulses of a few kV with rise times of a few nanoseconds, to avoid the formation of a plasma arc, and have interesting applications in the surface functionalization of thermosensitive materials. In order to investigate bio-medical applications of thermal plasma as well, a self-developed quenching device has been connected to an ICP torch. This device has allowed the inactivation of several kinds of bacteria spread on Petri dishes while keeping the substrate temperature below 40 °C, a strict requirement for the treatment of living tissues.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this research study is to explore the opportunity to set up Performance Objective (PO) parameters for specific risks in Ready-to-Eat (RTE) products, to be proposed to food industries and food authorities. In fact, even though microbiological criteria for Salmonella and Listeria monocytogenes in RTE products are included in the European Regulation, these parameters are not risk based, and no microbiological criterion for Bacillus cereus in RTE products exists. For these reasons, the behaviour of Salmonella enterica in RTE mixed salad, the microbiological characteristics of RTE spelt salad, and the definition of POs for Bacillus cereus and Listeria monocytogenes in RTE spelt salad have been assessed. Based on the data produced, the following conclusions can be drawn: 1. Rapid growth of Salmonella enterica may occur in mixed-ingredient salads, so strict temperature control during the production chain of the product is critical. 2. Spelt salad is characterized by the presence of high numbers of lactic acid bacteria (LAB). Listeria spp. and Enterobacteriaceae, on the contrary, did not grow during the shelf life, probably due to the relevant metabolic activity of the LAB. 3. The use of spelt and cheese compliant with the suggested POs might significantly reduce the incidence of foodborne intoxications due to Bacillus cereus and Listeria monocytogenes, as well as the proportion of recalls, which cause huge economic losses for food companies commercializing RTE products. 4. The approach to calculating PO values reported in this work can easily be adapted to different food/risk combinations, as well as to any changes in the formulation of the same food products. 5. Optimized sampling plans, in terms of the number of samples to collect, can be derived in order to verify compliance with the selected PO values.
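Conclusion 5, deriving the number of samples needed to verify compliance with a PO, can be sketched with a standard presence/absence sampling calculation; the defect proportion and confidence level below are illustrative values, not figures from the study.

```python
import math

def samples_needed(p_defective, confidence):
    """Smallest n such that a lot in which a fraction p_defective of
    units exceeds the PO yields at least one positive sample with the
    required confidence: solve 1 - (1 - p)^n >= confidence for n."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_defective))

# e.g. detect a lot with 5% non-compliant units at 95% confidence
n = samples_needed(p_defective=0.05, confidence=0.95)   # 59 samples
```

The formula assumes independent sample units and a qualitative (detected / not detected) test; quantitative criteria or within-lot clustering would require a more elaborate plan.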