8 results for Prepackaged commodities, Checking of.

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

30.00%

Publisher:

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm, which is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing typically distributed software systems at runtime. However, the engineer today often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating methodologies: a meta-model specifies the concepts, rules and relationships used to define a methodology. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e.
the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects of multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; it is clear, at least, that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent systems community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some function) and topology abstractions (entities of the environment that represent its spatial structure, either logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, where environment abstractions and layering principles are exploited for engineering multi-agent systems.
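The distinction between environment abstractions and topology abstractions can be sketched in a few lines of code. All class and field names below are invented for illustration; this is not SODA's own notation, merely a minimal reading of "entities encapsulating a function" situated in "entities representing a (logical) spatial structure":

```python
from dataclasses import dataclass, field

class EnvironmentAbstraction:
    """An environment entity encapsulating a function (here: a shared log)."""
    def __init__(self):
        self.records = []
    def append(self, entry):
        self.records.append(entry)

@dataclass
class TopologyAbstraction:
    """A logical place hosting environment abstractions and agents."""
    name: str
    resources: dict = field(default_factory=dict)
    neighbours: list = field(default_factory=list)

@dataclass
class Agent:
    """An autonomous entity situated in a place of the topology."""
    name: str
    place: TopologyAbstraction
    def use(self, resource_name, entry):
        # Agents interact with the environment through the abstractions
        # available in the place where they are situated.
        self.place.resources[resource_name].append(entry)

# A tiny multi-agent system: two connected places, one shared log, one agent.
room_a = TopologyAbstraction("room-a", resources={"log": EnvironmentAbstraction()})
room_b = TopologyAbstraction("room-b", neighbours=[room_a])
worker = Agent("worker-1", place=room_a)
worker.use("log", "task done")
print(room_a.resources["log"].records)  # ['task done']
```

The layering principle mentioned above would then correspond to describing such a system at several granularities, e.g. collapsing a whole place and its agents into a single agent at a coarser layer.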

Relevance:

30.00%

Publisher:

Abstract:

Maintaining the postharvest quality of whole and fresh-cut fruit during storage and distribution is the major challenge facing the fruit industry. For this purpose, the industry adopts a wide range of technologies to extend shelf-life. Many factors can lead to loss of quality in fresh products, hence the common description of these products as 'perishable'. Normal processes such as transpiration and respiration ultimately lead to water loss and senescence of the product. Fruits and vegetables are living commodities, and their rate of respiration is of key importance to the maintenance of quality: it has commonly been observed that the greater the respiration rate of a product, the shorter its shelf-life. The principal problem for fresh-cut fruit industries is the shorter shelf-life of minimally processed fruit (MPF) compared to the intact product. This is strictly connected with the higher ethylene production of fruit tissue stimulated during fresh-cut processing (peeling, cutting, dipping). 1-Methylcyclopropene (1-MCP) is an inhibitor of ethylene action, and several studies have shown its effectiveness in inhibiting ripening and the incidence of senescence in intact fruit, and consequently in extending their shelf-life. More recently, 1-MCP treatment has also been tested for shelf-life extension of MPF, but discordant results have been obtained. Considering that in some countries 1-MCP is already a commercial product registered for use on a number of horticultural products, the main aim of this study was to enhance our understanding of the effects of 1-MCP treatment on the quality maintenance of whole and fresh-cut climacteric and non-climacteric fruit (apple, kiwifruit and pineapple).
Concerning the effects of 1-MCP on whole fruit, we investigated the effects of a semi-commercial postharvest treatment with 1-MCP on the quality of Pink Lady apples as a function of fruit ripening stage, 1-MCP dose and storage time, and also in combination with controlled-atmosphere (CA) storage, in order to better understand the relationship among these parameters and whether it is possible to tune the 1-MCP treatment to meet market and consumer needs and thus bring excellent fruit to the market. To achieve this purpose, an incomplete three-level three-factor design was adopted. Several quality parameters were monitored during storage: firmness, ripening index, and ethylene and carbon dioxide production; a sensory evaluation was also performed after 6 months of storage. In this study the highest retention of firmness (at the end of storage) was achieved by applying the greatest 1-MCP concentration to fruits at the lowest maturity stage; under these semi-commercial conditions, fruit softening can be considered completely blocked. 1-MCP was also able to delay ethylene and CO2 production and the maturity parameters (soluble solids content and total acidity). Only in some cases did 1-MCP generate a synergistic effect with CA storage. The results of the sensory analyses indicated that the 1-MCP treatment did not affect sweetness or whole-fruit flavour, while it slightly decreased cut-fruit flavour; on the other hand, the treated apples were more sour, crisp, firm and juicy. The effects of some treatments (dipping and MAP) on nutrient stability were also investigated, showing that in this case study the adopted treatments did not have drastic effects on the antioxidant compounds; on the contrary, dipping may enhance the total antioxidant activity through the accumulation of ascorbic acid on the apple cut surface.
Results concerning the effects of 1-MCP in combination with MAP on the quality parameters of kiwifruit were not always consistent and clear: in terms of colour maintenance, 1-MCP seemed to have a synergistic effect with N2O MAP; as far as the ripening index is concerned, 1-MCP had a preservative effect, but only for samples packed in air.

Relevance:

30.00%

Publisher:

Abstract:

The research is part of a survey for the detection of the hydraulic and geotechnical conditions of river embankments funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is indeed of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundred kilometres) has generated great interest in non-destructive geophysical methods, which, compared to other methods such as drilling, allow for the faster and often less expensive acquisition of high-resolution data. The present work aims to evaluate Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, such as electrical resistivity tomography (ERT), Multichannel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods into the regular maintenance and checking of embankment conditions. The first part of this thesis is dedicated to the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the embankments of the Reno River and its tributaries, as well as to the description of some geophysical applications performed on embankments of European and North-American rivers, which were used as the bibliographic basis for this thesis.
The second part is an overview of the geophysical methods employed for this research (with particular attention to GPR), reporting their theoretical basis and examining in depth some techniques for the analysis and representation of geophysical data when applied to river embankments. The subsequent chapters, following the main scope of this research, which is to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, show the results obtained by analysing different cases that could lead to the formation of weakness zones and, subsequently, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the obtained data unmatched by the other methodologies were recorded. With regard to the drawbacks, some factors related to the attenuation of wave propagation, due to the different content of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore compromised the embankment safety assessment. In summary, Ground Penetrating Radar could represent a suitable tool for checking river dike conditions, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment was investigated, and the information obtained related only to changes in electrical properties, without any numerical measurement. Consequently, GPR application is ineffective for a preliminary assessment of embankment safety conditions, while for detailed campaigns at shallow depth, which aim to achieve immediate results with optimal precision, its use is highly recommended.
The cases where the multidisciplinary approach was tested revealed an optimal interconnection of the various geophysical methodologies employed: qualitative results in the preliminary phase (FDEM), a reliable quantitative description of the subsoil (ERT) and, finally, fast and highly detailed analysis (GPR). As a recommendation for future research, the simultaneous exploitation of several geophysical devices to assess the safety conditions of river embankments is strongly suggested, especially in view of a possible flood event, when the entire extent of the embankments must be investigated.

Relevance:

30.00%

Publisher:

Abstract:

This thesis investigates Decomposition and Reformulation as a means to solve Integer Linear Programming problems. This method is often a very successful approach computationally, producing high-quality solutions for well-structured combinatorial optimization problems such as vehicle routing, cutting stock, p-median and generalized assignment. However, until now the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is a new framework able to apply this concept to a generic MIP problem. The new approach is thus capable of auto-decomposition and auto-reformulation of the input problem, applicable as a black-box solution algorithm, and works as a complement and alternative to standard solution techniques. The idea of decomposing and reformulating (usually called Dantzig-Wolfe Decomposition, DWD, in the literature) is, given a MIP, to convexify one or more subsets of constraints (the slaves) and to work on the partially convexified polyhedra obtained. For a given MIP, several decompositions can be defined depending on which sets of constraints we want to convexify. In this thesis we mainly reformulate MIPs using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (the master) and checking (pricing) whether a variable with negative reduced cost exists: if so, it is added to the master, which is solved again (column generation); otherwise the procedure stops.
The advantage of using DWD is that the relaxation of the reformulation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated in a Branch-and-Bound scheme (Branch-and-Price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a considerable speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
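The pricing loop described above can be sketched on a tiny cutting-stock instance, a classic case where DWD works well. All data, tolerances and helper names below are invented for illustration, and `scipy.optimize.linprog` stands in for a generic LP solver; this is a sketch of the column-generation idea, not the framework developed in the thesis:

```python
import numpy as np
from scipy.optimize import linprog

W = 10                       # roll length
sizes = np.array([3, 4, 5])  # piece lengths
d = np.array([4, 3, 2])      # demands per piece type

# Initial columns (cutting patterns): one pattern per piece type,
# with as many copies of that piece as fit in a roll.
columns = [np.eye(1, 3, i).flatten() * (W // s) for i, s in enumerate(sizes)]

def solve_duals(A):
    # Dual of the restricted master LP: max y.d  s.t.  A^T y <= 1,  y >= 0.
    res = linprog(-d, A_ub=A.T, b_ub=np.ones(A.shape[1]), bounds=[(0, None)] * 3)
    return res.x

def price(y):
    # Pricing slave = unbounded knapsack: max sum(y_i * a_i), sizes.a <= W.
    best = np.zeros(W + 1)       # best dual value achievable per capacity
    choice = [None] * (W + 1)
    for cap in range(1, W + 1):
        best[cap], choice[cap] = best[cap - 1], choice[cap - 1]
        for i, s in enumerate(sizes):
            if s <= cap and best[cap - s] + y[i] > best[cap]:
                best[cap], choice[cap] = best[cap - s] + y[i], (i, cap - s)
    a, cap = np.zeros(3), W      # backtrack to recover the pattern
    while choice[cap] is not None:
        i, cap = choice[cap]
        a[i] += 1
    return best[W], a

while True:
    A = np.column_stack(columns)
    value, pattern = price(solve_duals(A))
    if value <= 1 + 1e-6:        # no column with negative reduced cost
        break
    columns.append(pattern)      # column generation step

# Solve the final restricted master (LP relaxation) for pattern counts.
A = np.column_stack(columns)
res = linprog(np.ones(A.shape[1]), A_ub=-A, b_ub=-d,
              bounds=[(0, None)] * A.shape[1])
print(f"LP bound: {res.fun:.2f} rolls, {A.shape[1]} patterns generated")
```

Here the convex hull of the slave is easy to price over (a knapsack solvable by dynamic programming), which is exactly the situation in which the abstract notes the reformulation pays off.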

Relevance:

30.00%

Publisher:

Abstract:

Years of excessive use of thiabendazole (TBZ) to control Penicillium expansum have induced the development of resistance. The sensitivity of forty-eight strains, collected from orchards and packinghouses in Emilia-Romagna, to pure and commercial TBZ was determined in vitro on TBZ-amended medium (400 μg/mL). Out of the 48 strains, 35 were thiabendazole-sensitive (S) and 13 were thiabendazole-resistant (R). A microtiter assay adapted to P. expansum showed EC50 values ranging from 54 to 320 μg/mL for ten TBZ-resistant strains. At the highest dose (50 μg/mL), the growth of resistant strains was not inhibited and the reported MIC values were >1000 μg/mL. Therefore, preliminary screening combined with a microtiter assay can be a good strategy to test susceptibility to TBZ. Mutations in the β-tubulin gene were studied on the amino acid sequences from residue 167 to residue 357 of 10 P. expansum strains. A mutation at codon 198 was associated with TBZ resistance; however, its absence in 3 resistant strains can be explained by the involvement of other mechanisms. Moreover, the P. expansum strain LB8/99 showed a good antifungal effect against some fungal pathogens in a double-Petri-dish assay. It inhibited both the mycelium growth and the conidia germination of B. cinerea, C. acutatum and M. laxa, and significantly reduced those of P. expansum by 53% and 18%, respectively. Three major VOCs were identified by GC-MS analysis: geosmin, phenethyl alcohol (PEA) and an unknown substance. Fumigation of the fungal pathogens with PEA (1230 mg/mL) inhibited both conidia germination and mycelium growth of all pathogens, except the conidia germination of P. expansum, which was reduced by 90% with respect to the control. In contrast, the concentration of PEA produced naturally by LB8/99 was ineffective in controlling the pathogens and seemed to have a synergistic or additive effect with the other VOCs. Investigations into the biofumigant effect of LB8/99 on other commodities such as seeds and seedlings are in progress.

Relevance:

30.00%

Publisher:

Abstract:

The central topic of this thesis is the study of algorithms for type checking, both from the programming-language and from the proof-theoretic point of view. A type checking algorithm takes a program or a proof, represented as a syntactical object, and checks its validity with respect to a specification or a statement. It is a central component of compilers and proof assistants. We postulate that, since type checkers sit at the interface between proof theory and program theory, their study can let these two fields mutually enrich each other. We argue this through two main instances: first, starting from the problem of proof reuse, we develop an incremental type checker; secondly, starting from a type checking program, we exhibit a novel correspondence between natural deduction and the sequent calculus.
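The kind of algorithm meant here can be illustrated with a minimal type checker for the simply typed lambda calculus. This is a standard textbook sketch, not the thesis's incremental checker; all names are illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Arrow:          # function type  a -> b
    arg: object
    res: object

@dataclass(frozen=True)
class Var:            # variable reference
    name: str

@dataclass(frozen=True)
class Lam:            # \x : t. body  (argument type annotated)
    var: str
    ty: object
    body: object

@dataclass(frozen=True)
class App:            # application  f e
    fun: object
    arg: object

def check(ctx, term):
    """Return the type of `term` in context `ctx`, or raise TypeError."""
    if isinstance(term, Var):
        if term.name not in ctx:
            raise TypeError(f"unbound variable {term.name}")
        return ctx[term.name]
    if isinstance(term, Lam):
        # Extend the context with the bound variable, check the body.
        body_ty = check({**ctx, term.var: term.ty}, term.body)
        return Arrow(term.ty, body_ty)
    if isinstance(term, App):
        fun_ty = check(ctx, term.fun)
        arg_ty = check(ctx, term.arg)
        if not (isinstance(fun_ty, Arrow) and fun_ty.arg == arg_ty):
            raise TypeError(f"cannot apply {fun_ty} to {arg_ty}")
        return fun_ty.res
    raise TypeError(f"unknown term {term!r}")

# (\x:int. x) y   in a context where y : int
identity = Lam("x", "int", Var("x"))
print(check({"y": "int"}, App(identity, Var("y"))))  # int
```

Each syntactic form corresponds to one typing rule, which is exactly what makes such checkers a natural meeting point of proof theory (rules as inference rules) and programming (rules as code).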

Relevance:

30.00%

Publisher:

Abstract:

Increasing the efficiency of agricultural machinery is a theme that has attracted the attention and investments of industry and the research community. In addition, in a global market where the prices of agricultural commodities are volatile and the prices of inputs increase, farmers and agricultural contractors struggle to obtain a consolidated profit at the end of the agricultural season. For these reasons, it is important to carefully plan the usage of combine harvesters, to reduce unproductive time and the usage of inputs such as fuel, which over the harvesting season can increase costs. This study aims to develop an algorithm able to automatically identify and evaluate the time spent by the combines in each of the identified activities, identify the boundaries of the harvested fields, and perform a performance evaluation. To develop the algorithm, two combine harvesters operating in real-world conditions in the Province of Bologna were monitored during the harvesting seasons of 2020 and 2022. The data necessary for the analysis were acquired over the CAN bus and processed using the MATLAB® suite. The results of this analysis show that the monitored combines spent over 60% of the time performing harvesting activities, 13% of the time idling in the field, 10% performing headland turns, 3% and 4% of the time in field and road transport respectively, and 2% of the time unloading. In addition, the performance of the monitored combines was similar to the performance reported in other studies.
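A rule-based sketch of this kind of activity classification is shown below. The thresholds, signal names and rules are invented for illustration (the thesis's actual algorithm and CAN signals are not specified in the abstract):

```python
def classify(sample):
    """Label one telemetry sample with an activity (hypothetical rules)."""
    if sample["unloading_auger_on"]:
        return "unloading"
    if not sample["in_field"]:
        return "road transport"
    if sample["speed_kmh"] < 0.5:
        return "idle"
    if sample["header_engaged"]:
        return "harvesting"
    if sample["heading_rate_deg_s"] > 10:
        return "headland turn"
    return "field transport"

def time_shares(samples):
    """Percentage of (equally spaced) samples spent in each activity."""
    counts = {}
    for s in samples:
        label = classify(s)
        counts[label] = counts.get(label, 0) + 1
    return {k: round(100 * v / len(samples), 1) for k, v in counts.items()}

# Four invented 1 Hz telemetry samples standing in for a CAN bus log.
log = [
    {"speed_kmh": 5.1, "header_engaged": True, "in_field": True,
     "heading_rate_deg_s": 1, "unloading_auger_on": False},
    {"speed_kmh": 0.0, "header_engaged": False, "in_field": True,
     "heading_rate_deg_s": 0, "unloading_auger_on": False},
    {"speed_kmh": 4.0, "header_engaged": False, "in_field": True,
     "heading_rate_deg_s": 25, "unloading_auger_on": False},
    {"speed_kmh": 6.0, "header_engaged": False, "in_field": True,
     "heading_rate_deg_s": 25, "unloading_auger_on": True},
]
print(time_shares(log))
```

Aggregating such labels over a season yields exactly the kind of time-share breakdown the abstract reports (harvesting vs. idling vs. turning vs. transport vs. unloading).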

Relevance:

30.00%

Publisher:

Abstract:

The notion of commodification is a fascinating one. It has many facets, ranging from subjective debates on the desirability of commodification to in-depth economic analyses of objects of value and their corresponding markets. Commodity theory is therefore not defined by a single debate, but spans a plethora of different discussions. This thesis maps and situates those theories and debates and selects one specific strain to investigate further. It argues that commodity theory in its optima forma deals with the investigation of what sets commodities apart from non-commodities, and proceeds to examine the many answers given to this question by scholars from the mid-1800s to the late 2000s. Ultimately, commodification is defined as a process in which an object becomes an element of the total wealth of societies in which the capitalist mode of production prevails. In doing so, objects must meet observables, or indicia, of commodification provided by commodity theories. Problems arise when objects are clearly part of the total wealth of societies without meeting the established commodity indicia: in such cases, objects are part of the total wealth of a society without counting as commodities. This thesis examines this phenomenon in relation to the novel commodities of audiences and data, explaining how these non-commodities (according to classical theories) are nonetheless essential elements of industry. The thesis then takes a deep dive into commodity theory using John Searle's theory of the construction of social reality.