930 results for Integrated farming systems


Relevance: 30.00%

Abstract:

Coffea sp. is cultivated in large areas under both conventional and organic management. However, information about the sustainability of these two management systems is still scarce. The objective of the present study was to evaluate the physical properties of soil cultivated with Conilon coffee (C. canephora) under organic and conventional management. Two areas cultivated with Conilon coffee (one under organic and one under conventional management) and a fragment of Atlantic forest, used as a reference, were selected for the experiment. Soil granulometry, hydraulic conductivity, water retention curve, resistance to penetration, porosity, optimal hydric interval, and other physical characteristics were measured at depths of 0 to 10 and 10 to 20 cm. The data were subjected to multivariate and descriptive statistical analyses. Higher similarity was observed between the soil cultivated with Conilon coffee under organic management and the Atlantic forest soil. Soil resistance to penetration at 10, 30, 100, 500, and 1500 kPa, macroporosity, density, and total porosity were the main physical properties that differentiated the two management systems. The non-use of agricultural machinery and the addition of organic matter may be the main reasons for the higher soil sustainability observed under organic management compared with the conventional system.
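
A minimal sketch of the kind of multivariate comparison described above: principal component analysis applied to a table of soil physical properties, so that the proximity of group centroids in PC space reflects the similarity between management systems. All values, sample sizes, and property names below are illustrative assumptions, not data from the study.

```python
# Illustrative only: PCA on hypothetical soil physical properties to compare
# organic, conventional, and forest-reference areas. Not the study's data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical means per group: penetration resistance, macroporosity,
# bulk density, total porosity.
samples = {
    "organic":      rng.normal([1.2, 0.15, 1.10, 0.58], 0.05, size=(10, 4)),
    "conventional": rng.normal([2.0, 0.08, 1.35, 0.48], 0.05, size=(10, 4)),
    "forest":       rng.normal([1.1, 0.17, 1.05, 0.60], 0.05, size=(10, 4)),
}

X = np.vstack(list(samples.values()))
# Standardize so each property contributes equally to the analysis.
X = (X - X.mean(axis=0)) / X.std(axis=0)

pca = PCA(n_components=2)
scores = pca.fit_transform(X)

# Group centroids in PC space: nearby centroids indicate similar soil condition.
for i, name in enumerate(samples):
    centroid = scores[i * 10:(i + 1) * 10].mean(axis=0)
    print(f"{name:>12}: PC1={centroid[0]:+.2f}, PC2={centroid[1]:+.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```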

Relevance: 30.00%

Abstract:

The Juruá valley mesoregion is recognized for its diversity of common bean and cowpea cultivars and is an important center for on-farm bean conservation in Brazil. However, there is little information about the production systems of Creole cultivars, so this study aimed to identify the production centers and to gather information about bean production systems. Thirty-eight farmers and five merchants were interviewed using semi-structured questionnaires. Juruá valley farmers use three bean production systems: "beach farming", the "slash-and-burn system", and "stuffy farming". The systems rely on family labor with low dependence on external inputs, and two of them are classified as itinerant. The study identified two bean production centers: the Alto Juruá extractive reserve and the Santa Luzia directed settlement project.

Relevance: 30.00%

Abstract:

Internet of Things systems are pervasive systems that have evolved from cyber-physical systems into large-scale distributed systems. Due to the number of technologies involved, their software development poses several integration challenges. Among them, the ones that most hinder proper integration are those related to system heterogeneity, and thus concern interoperability. From a software engineering perspective, developers mostly experience this lack of interoperability in two phases of software development: programming and deployment. On the one hand, modern software tends to be distributed across several components, each adopting its most appropriate technology stack, pushing programmers to code in a protocol- and data-agnostic way. On the other hand, each software component should run in the most appropriate execution environment and, as a result, system architects strive to automate deployment over distributed infrastructures. This dissertation aims to improve the development process by introducing proper tools to handle certain aspects of system heterogeneity. Our effort focuses on three of these aspects and, for each of them, we propose a tool addressing the underlying challenge. The first tool handles heterogeneity at the transport and application protocol level, the second manages different data formats, and the third derives optimal deployments. To realize the tools, we adopted a linguistic approach, i.e., we provided specific linguistic abstractions that increase the expressive power of the programming language developers use, letting them write better solutions in more straightforward ways. To validate the approach, we implemented use cases showing that the tools can be used in practice and that they help achieve the expected level of interoperability. In conclusion, to move a step towards the realization of an integrated Internet of Things ecosystem, we target programmers and architects and propose that they use the presented tools to ease the software development process.

Relevance: 30.00%

Abstract:

Smart Farming Technologies (SFT) is a term used to define the set of digital technologies able not only to control and manage the farm system, but also to connect it to the many disruptive digital applications deployed at multiple links along the value chain. The adoption of SFT has so far been limited, with significant differences at the country level and among different types of farms and farmers. The objective of this thesis is to analyze which factors contribute to shaping the agricultural digital transition and to assess its potential impacts on the Italian agri-food system. Specifically, this overall research objective is approached from three different perspectives. Firstly, we carry out a review of the literature on the determinants of adoption of farm-level Management Information Systems (MIS), namely the most widely adopted smart farming solutions in Italy. Secondly, we run an empirical analysis of the factors currently shaping the adoption of SFT in Italy. In doing so, we focus on the multi-process and multi-faceted aspects of adoption, overcoming the one-off binary approach often used to study adoption decisions. Finally, we adopt a forward-looking perspective to investigate what the socio-ethical implications of a widespread use of SFT might be. On the one hand, our results indicate that larger, more structured farms with higher levels of commercial integration along the agri-food supply chain are the most likely early adopters. On the other hand, they highlight the need for the institutional and organizational environment around farms to support farmers more effectively in the digital transition. Moreover, the roles of several other actors and actions are discussed and analyzed, highlighting the key contribution of specific agri-food stakeholders and ad-hoc policies, with the aim of proposing a clearer path towards an efficient, fair, and inclusive digitalization of the agri-food sector.

Relevance: 30.00%

Abstract:

The fourth industrial revolution, also known as Industry 4.0, has rapidly gained traction in businesses across Europe and the world, becoming a central theme in small, medium, and large enterprises alike. This new paradigm shifts the focus from locally based, barely automated firms to a globally interconnected industrial sector, stimulating economic growth and productivity and supporting the upskilling and reskilling of employees. However, despite the maturity and scalability of information and cloud technologies, the support systems already deployed at the machine and field level are often outdated and lack the necessary security, access control, and advanced communication capabilities. This dissertation proposes architectures and technologies designed to bridge the gap between Operational and Information Technology in a manner that is non-disruptive, efficient, and scalable. The proposal presents cloud-enabled data-gathering architectures that make use of the newest IT and networking technologies to achieve the desired quality of service and non-functional properties. By harnessing industrial and business data, processes can be optimized even before product sale, while the integrated environment enhances data exchange for post-sale support. The architectures have been tested and have shown encouraging performance results, providing a promising solution for companies looking to embrace Industry 4.0, enhance their operational capabilities, and prepare for the upcoming, human-centric fifth industrial revolution.
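
A minimal sketch of the edge-to-cloud data-gathering pattern that such architectures typically rely on, using only the Python standard library: a collector buffers machine readings and forwards them as a JSON batch. The endpoint URL, payload schema, and batching policy are illustrative assumptions, not the dissertation's design.

```python
# Illustrative edge-to-cloud data gathering: buffer machine readings and
# forward them as a JSON batch. Endpoint and payload schema are hypothetical.
import json
import queue
import time
import urllib.error
import urllib.request

CLOUD_ENDPOINT = "https://example.com/ingest"   # placeholder, not a real service
BATCH_SIZE = 10

readings: "queue.Queue[dict]" = queue.Queue()

def collect(sample: dict) -> None:
    """Called by the OT-side driver whenever a new machine reading arrives."""
    sample["timestamp"] = time.time()
    readings.put(sample)

def flush_batch() -> None:
    """Drain up to BATCH_SIZE readings and POST them as one JSON document."""
    batch = []
    while not readings.empty() and len(batch) < BATCH_SIZE:
        batch.append(readings.get())
    if not batch:
        return
    body = json.dumps({"readings": batch}).encode("utf-8")
    request = urllib.request.Request(
        CLOUD_ENDPOINT, data=body,
        headers={"Content-Type": "application/json"}, method="POST",
    )
    try:
        with urllib.request.urlopen(request, timeout=5) as response:
            print("ingest status:", response.status)
    except urllib.error.URLError as exc:          # placeholder endpoint: just report
        print("upload skipped:", exc)

# Example usage (normally driven by the gateway's acquisition loop):
collect({"machine_id": "press-01", "spindle_temp_c": 61.4})
collect({"machine_id": "press-01", "vibration_rms": 0.12})
flush_batch()
```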

Relevance: 30.00%

Abstract:

This thesis investigates the mechanisms and boundary conditions that steer the early localisation of deformation and strain in carbonate multilayers involved in thrust systems under shallow and mid-crustal conditions. Much is already understood about deformation localisation, but some key points remain loosely constrained. They encompass i) which structural domains can preserve evidence of the early stages of tectonic shortening, ii) which mechanisms assist deformation during these stages, and iii) which parameters actually steer the onset of localisation. To clarify these points, the thesis presents the results of an integrated, multiscale, multi-technique structural study that relied on field and laboratory data to analyse the structural, architectural, mineralogical, and geochemical features that govern deformation during compressional tectonics. By focusing on two case studies, the Eastern Southern Alps (northern Italy), where deformation is mainly brittle, and the Oman Mountains (northeastern Oman), where ductile deformation dominates, the thesis shows that deformation localisation is steered by several mechanisms that mutually interact at different stages during compression. At shallow crustal conditions, the derived conceptual and numerical models show that both inherited (e.g., stratigraphic) and acquired (e.g., structural) features play a key role in steering deformation and differentiating the seismic behaviour of the multilayer succession. At deeper crustal conditions, strain localises in narrow domains in which fluids, temperature, shear strain, and pressure act together to shape the internal fabric and chemical composition of mylonitic shear zones, where localisation took place under high-pressure (HP), low-temperature (LT) conditions. In particular, the results indicate that those shear zones acted as “sheltering structural capsules” in which peculiar processes can occur and where the products of these processes can be subsequently preserved even over hundreds of millions of years.

Relevance: 30.00%

Abstract:

Water Distribution Networks (WDNs) play a vital role in communities, ensuring well-being and supporting economic growth and productivity. The need for greater investment requires design choices that will impact the efficiency of management in the coming decades. This thesis proposes an algorithmic approach to address two related problems: (i) identifying the fundamental asset of large WDNs in terms of their main infrastructure; and (ii) sectorizing large WDNs into isolated sectors while respecting the minimum service to be guaranteed to users. Two methodologies were developed to meet these objectives and were subsequently integrated into an overall process that optimizes the sectorized configuration of a WDN while addressing problems (i) and (ii) within a single global vision. With regard to problem (i), the methodology introduces the concept of a primary network and answers with a dual approach: connecting the main nodes of the WDN in terms of hydraulic infrastructure (reservoirs, tanks, pump stations) and identifying hypothetical paths with minimal energy losses. The primary network thus identified can be used as an initial basis for designing the sectors. The sectorization problem (ii) was addressed with optimization techniques, through the development of a new dedicated Tabu Search algorithm able to deal with real WDN case studies. For this purpose, three new large WDN models were developed to test the capabilities of the algorithm on different, complex real cases. The methodology also automatically identifies deficient parts of the primary network and dynamically includes new edges to support a sectorized configuration of the WDN. Applying the overall algorithm to the new real case studies and to others from the literature yielded applicable solutions even in particularly complex situations.
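
A minimal sketch of the primary-network idea, under one plausible reading of the description above: treat the WDN as a graph whose edges are weighted by an estimate of head (energy) loss, and take the union of minimum-loss paths between the main hydraulic nodes as the primary network. The graph, node names, and head-loss values are hypothetical; this illustrates the shortest-path building block, not the thesis methodology.

```python
# Illustrative primary-network extraction: union of minimum head-loss paths
# between main hydraulic nodes (reservoirs, tanks, pump stations).
# All pipe data are hypothetical.
import itertools
import networkx as nx

G = nx.Graph()
# (node_a, node_b, head_loss) -- head loss per pipe, e.g. from a hydraulic model.
pipes = [
    ("reservoir", "j1", 0.8), ("j1", "j2", 0.5), ("j2", "tank", 0.6),
    ("j1", "j3", 1.5), ("j3", "tank", 0.4), ("j2", "j4", 0.3),
    ("j4", "pump_station", 0.7), ("j3", "pump_station", 2.0),
]
G.add_weighted_edges_from(pipes, weight="head_loss")

main_nodes = ["reservoir", "tank", "pump_station"]

primary = set()
for a, b in itertools.combinations(main_nodes, 2):
    # Minimum cumulative head-loss path between two main nodes.
    path = nx.shortest_path(G, a, b, weight="head_loss")
    primary.update(zip(path, path[1:]))

print("primary network edges:", sorted(tuple(sorted(e)) for e in primary))
```

The resulting edge set could then serve as the fixed skeleton around which candidate sectors are evaluated by a metaheuristic such as Tabu Search.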

Relevance: 30.00%

Abstract:

Protected crop production is a modern and innovative approach to cultivating plants in a controlled environment to optimize growth, yield, and quality. This method uses structures such as greenhouses or tunnels to create a sheltered environment. These production systems are characterized by careful regulation of variables such as temperature, humidity, light, and ventilation, which collectively create an optimal microclimate for plant growth. Heating, cooling, and ventilation systems are used to maintain optimal growing conditions regardless of external weather fluctuations. Protected crop production plays a crucial role in addressing the challenges posed by climate variability, population growth, and food security. Similarly, animal husbandry involves providing adequate nutrition, housing, medical care, and environmental conditions to ensure animal welfare. Sustainability is a critical consideration in all forms of agriculture, including protected crop and animal production; in animal production it refers to producing animal products in a way that minimizes negative impacts on the environment, promotes animal welfare, and ensures the long-term viability of the industry. The research activities performed during the PhD therefore fall within the field of Precision Agriculture and Livestock Farming. The focus is on the computational fluid dynamics (CFD) approach and on environmental assessment, applied to improve yield, resource efficiency, environmental sustainability, and cost savings. This represents a significant shift from traditional farming methods to a more technology-driven, data-driven, and environmentally conscious approach to crop and animal production. On one side, CFD is a powerful and precise computer modeling and simulation technique for airflows and thermo-hygrometric parameters, which has been applied to optimize the growth environment of crops and the efficiency of ventilation in pig barns. On the other side, the sustainability aspect has been investigated through Life Cycle Assessment analyses.

Relevance: 30.00%

Abstract:

The integration of distributed and ubiquitous intelligence has emerged over recent years as the mainspring of transformative advancements in mobile radio networks. As we approach the era of “mobile for intelligence”, next-generation wireless networks are poised to undergo significant and profound changes. Notably, the overarching challenge that lies ahead is the development and implementation of integrated communication and learning mechanisms that will enable the realization of autonomous mobile radio networks. The ultimate pursuit of eliminating the human-in-the-loop constitutes an ambitious challenge, necessitating a meticulous delineation of the fundamental characteristics that artificial intelligence (AI) should possess to effectively achieve this objective. This challenge represents a paradigm shift in the design, deployment, and operation of wireless networks, where conventional, static configurations give way to dynamic, adaptive, and AI-native systems capable of self-optimization, self-sustainment, and learning. This thesis aims to provide a comprehensive exploration of the fundamental principles and practical approaches required to create autonomous mobile radio networks that seamlessly integrate communication and learning components. The first chapter introduces the notion of Predictive Quality of Service (PQoS) and adaptive optimization and expands on the challenge of achieving adaptable, reliable, and robust network performance in dynamic and ever-changing environments. The subsequent chapter delves into the revolutionary role of generative AI in shaping next-generation autonomous networks. This chapter emphasizes achieving trustworthy, uncertainty-aware generation processes with the use of approximate Bayesian methods and aims to show how generative AI can improve generalization while reducing data communication costs. Finally, the thesis addresses the topic of distributed learning over wireless networks. Distributed learning and its variants, including multi-agent reinforcement learning and federated learning, have the potential to meet the scalability demands of modern data-driven applications, enabling efficient and collaborative model training across dynamic scenarios while ensuring data privacy and reducing communication overhead.
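
A minimal sketch of the federated learning idea mentioned above: standard federated averaging (FedAvg) on a linear model with synthetic data, where clients train locally and only model weights are exchanged. The client split, model, and learning rate are illustrative assumptions, not anything proposed in the thesis.

```python
# Illustrative federated averaging (FedAvg) on a linear regression model.
# Clients train locally on private data; only model weights are exchanged.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0, 0.5])

def make_client(n):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    return X, y

clients = [make_client(50) for _ in range(5)]
global_w = np.zeros(3)

def local_update(w, X, y, lr=0.05, epochs=5):
    """A few steps of local gradient descent on the client's own data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

for round_ in range(20):
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    # Server aggregation: weighted average of client models, no raw data shared.
    global_w = np.average(local_ws, axis=0, weights=sizes)

print("estimated weights:", np.round(global_w, 3))
```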

Relevance: 30.00%

Abstract:

In highly urbanized coastal lowlands, effective site characterization is crucial for assessing seismic risk. It requires a comprehensive stratigraphic analysis of the shallow subsurface, coupled with precise assessment of the geophysical properties of buried deposits. In this context, late Quaternary paleovalley systems, shallowly buried fluvial incisions formed during the Late Pleistocene sea-level fall and filled during the Holocene sea-level rise, are particularly important for understanding seismic amplification because of their soft sediment infill and sharp lithologic contrasts. In this research, we conducted high-resolution stratigraphic analyses of two regions, the Pescara and Manfredonia areas along the Adriatic coastline of Italy, to delineate the geometries and facies architecture of two paleovalley systems. Furthermore, we carried out geophysical investigations to characterize the study areas and perform seismic response analyses. We tested the microtremor-based horizontal-to-vertical spectral ratio as a mapping tool to reconstruct the buried paleovalley geometries. We evaluated the relationship between geological and geophysical data and identified the stratigraphic surfaces responsible for the observed resonances. To perform the seismic response analysis of the Pescara paleovalley system, we integrated the stratigraphic framework with microtremor and shear wave velocity measurements. The seismic response analysis highlights strong seismic amplification in frequency ranges that can interact with a wide variety of building types. Additionally, we explored the applicability of artificial intelligence to facies analysis from borehole images. We used a robust dataset of high-resolution digital images from continuous sediment cores of Holocene age to outline a novel, deep-learning-based approach for performing automatic semantic segmentation directly on core images, leveraging the power of convolutional neural networks. We propose an automated model to rapidly characterize sediment cores, reproducing the sedimentologist's interpretation and providing guidance for stratigraphic correlation and subsurface reconstructions.
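
A minimal sketch of the convolutional semantic-segmentation idea applied to core images: a tiny encoder-decoder network that assigns one of a few facies classes to every pixel of an image strip. The architecture, class count, and random input are illustrative assumptions; the thesis model is not reproduced here.

```python
# Illustrative per-pixel facies classification with a tiny encoder-decoder CNN.
# Architecture and class labels are hypothetical placeholders.
import torch
import torch.nn as nn

NUM_FACIES = 4  # e.g. clay, silt, sand, organic-rich (illustrative labels)

class TinySegNet(nn.Module):
    def __init__(self, num_classes=NUM_FACIES):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                 # halve resolution
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),  # upsample back
            nn.Conv2d(16, num_classes, 1),                   # per-pixel class scores
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinySegNet()
core_image = torch.rand(1, 3, 64, 256)   # fake RGB strip standing in for a core photo
logits = model(core_image)               # shape: (1, NUM_FACIES, 64, 256)
facies_map = logits.argmax(dim=1)        # predicted facies class per pixel
print(facies_map.shape, facies_map.unique())
```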

Relevance: 30.00%

Abstract:

Modern High-Performance Computing (HPC) systems are gradually increasing in size and complexity due to the corresponding demand for larger simulations requiring more complex tasks and higher accuracy. Moreover, as a side effect of Dennard scaling approaching its ultimate power limit, software efficiency also plays an important role in increasing the overall performance of a computation. Tools that measure application performance in these increasingly complex environments provide insight into the intricate ways in which software and hardware interact. Monitoring power consumption in order to save energy is possible through processor interfaces such as Intel Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool like the Performance Application Programming Interface (PAPI). Since several problems in many heterogeneous fields can be represented as a complex linear system, an optimized and scalable linear system solver can significantly decrease the time spent computing its solution. One of the most widely used algorithms for the resolution of large simulations is Gaussian Elimination, whose most popular HPC implementation is found in the Scalable Linear Algebra PACKage (ScaLAPACK) library. Another relevant algorithm, which is gaining popularity in the academic field, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of Gaussian Elimination from ScaLAPACK, profiling their execution during the resolution of linear systems on the HPC architecture offered by CINECA. Moreover, it collates the energy and power values for different rank, node, and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, integrated with the parallel execution of the algorithms managed with the Message Passing Interface (MPI).
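
A minimal sketch of the measurement idea: read a RAPL package-energy counter before and after a workload and derive energy and average power. The thesis instruments ScaLAPACK and Inhibition Method codes through PAPI over MPI; the stand-in below only shows the underlying counter logic via the Linux powercap sysfs interface, with a dense solve from numpy/LAPACK as a placeholder workload.

```python
# Illustrative package-energy measurement around a dense linear solve, using
# the Linux powercap (RAPL) sysfs counters. Reading the counter may require
# elevated privileges on recent kernels; paths refer to CPU package 0.
import time
import numpy as np

RAPL_DIR = "/sys/class/powercap/intel-rapl:0"
ENERGY_FILE = RAPL_DIR + "/energy_uj"
MAX_RANGE_FILE = RAPL_DIR + "/max_energy_range_uj"

def read_energy_uj() -> int:
    with open(ENERGY_FILE) as f:
        return int(f.read())

def workload(n: int = 2000) -> np.ndarray:
    """Stand-in dense linear system solved via LAPACK (LU factorization)."""
    rng = np.random.default_rng(0)
    A = rng.normal(size=(n, n))
    b = rng.normal(size=n)
    return np.linalg.solve(A, b)

e0, t0 = read_energy_uj(), time.perf_counter()
workload()
e1, t1 = read_energy_uj(), time.perf_counter()

delta_uj = e1 - e0
if delta_uj < 0:                                  # counter wrapped around its range
    with open(MAX_RANGE_FILE) as f:
        delta_uj += int(f.read())

elapsed = t1 - t0
print(f"energy: {delta_uj / 1e6:.2f} J  time: {elapsed:.2f} s  "
      f"avg power: {delta_uj / 1e6 / elapsed:.1f} W")
```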

Relevance: 20.00%

Abstract:

Understanding the molecular mechanisms of oral carcinogenesis will yield important advances in the diagnostics, prognostics, effective treatment, and outcome of oral cancer. Hence, in this study we investigated proteomic and peptidomic profiles by combining an orthotopic murine model of oral squamous cell carcinoma (OSCC), mass spectrometry-based proteomics, and biological network analysis. Our results indicated the up-regulation of proteins involved in actin cytoskeleton organization and cell-cell junction assembly, and their expression was validated in human OSCC tissues. In addition, the functional relevance of talin-1 in OSCC adhesion, migration, and invasion was demonstrated. Taken together, this study identified specific processes deregulated in oral cancer and provided novel refined OSCC-targeting molecules.

Relevance: 20.00%

Abstract:

A miniaturised gas analyser is described and evaluated, based on the use of a substrate-integrated hollow waveguide (iHWG) coupled to a microsized near-infrared spectrophotometer comprising a linear variable filter and an array of InGaAs detectors. This gas sensing system was applied to analyse surrogate samples of natural fuel gas containing methane, ethane, propane, and butane, quantified by using multivariate regression models based on partial least squares (PLS) algorithms and Savitzky-Golay first-derivative data preprocessing. External validation of the obtained models reveals root mean square errors of prediction of 0.37, 0.36, 0.67, and 0.37% (v/v) for methane, ethane, propane, and butane, respectively. The developed sensing system provides particularly rapid response times upon composition changes of the gaseous sample (approximately 2 s) due to the minute volume of the iHWG-based measurement cell. The sensing system is fully portable with a hand-held-sized analyser footprint, and thus ideally suited for field analysis. Last but not least, the obtained results corroborate the potential of NIR-iHWG analysers for monitoring the quality of natural gas and petrochemical gaseous products.
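
A minimal sketch of the chemometric pipeline described above, Savitzky-Golay first-derivative preprocessing followed by PLS regression, applied to synthetic spectra. The window size, polynomial order, number of latent variables, and the spectra themselves are illustrative assumptions, not the paper's calibration.

```python
# Illustrative NIR calibration: Savitzky-Golay 1st derivative + PLS regression.
# Synthetic spectra stand in for the iHWG measurements; all settings are guesses.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 80, 200
concentrations = rng.uniform(0, 100, size=(n_samples, 1))   # e.g. % (v/v) of one analyte

# Fake absorbance spectra: a concentration-dependent band plus baseline offset.
wl = np.linspace(0, 1, n_wavelengths)
band = np.exp(-((wl - 0.5) ** 2) / 0.002)
spectra = (concentrations * band
           + rng.normal(0, 0.5, (n_samples, n_wavelengths))
           + rng.uniform(0, 5, (n_samples, 1)))             # random additive baseline

# First-derivative preprocessing removes the additive baseline.
X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)

X_cal, X_val, y_cal, y_val = train_test_split(X, concentrations, random_state=0)
pls = PLSRegression(n_components=4).fit(X_cal, y_cal)
rmsep = mean_squared_error(y_val, pls.predict(X_val)) ** 0.5
print(f"RMSEP on the synthetic validation set: {rmsep:.2f} % (v/v)")
```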

Relevance: 20.00%

Abstract:

The development and maintenance of the seal of the root canal system is key to the success of root canal treatment. Resin-based adhesive materials have the potential to reduce root canal microleakage because of their adhesive properties and penetration into the dentinal walls. Moreover, irrigation protocols may influence the adhesiveness of resin-based sealers to root dentin. The objective of the present study was to evaluate the effect of different irrigation protocols on coronal bacterial microleakage of gutta-percha/AH Plus and Resilon/Real Seal Self-etch systems. One hundred ninety premolars were used. The teeth were divided into 18 experimental groups according to the irrigation protocols and filling materials used. The protocols were: distilled water; sodium hypochlorite (NaOCl)+EDTA; NaOCl+H3PO4; NaOCl+EDTA+chlorhexidine (CHX); NaOCl+H3PO4+CHX; CHX+EDTA; CHX+H3PO4; CHX+EDTA+CHX; and CHX+H3PO4+CHX. Gutta-percha/AH Plus or Resilon/Real Seal SE was used as the root-filling material. Coronal microleakage was evaluated for 90 days against Enterococcus faecalis. Data were statistically analyzed using the Kaplan-Meier survival test and the Kruskal-Wallis and Mann-Whitney tests. No significant difference was found between groups using chlorhexidine or sodium hypochlorite during chemo-mechanical preparation followed by EDTA or phosphoric acid for smear layer removal. The same held for the filling materials. However, the statistical analyses revealed that a final flush with 2% chlorhexidine significantly reduced coronal microleakage. A final flush with 2% chlorhexidine after smear layer removal reduces coronal microleakage of teeth filled with gutta-percha/AH Plus or Resilon/Real Seal SE.
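
A minimal sketch of the nonparametric comparisons named above (Kruskal-Wallis across groups, Mann-Whitney between two groups) on time-to-leakage data; the study's primary Kaplan-Meier survival analysis is not reproduced here. All values are hypothetical placeholders, and canals that never leaked are simply recorded at the 90-day cap.

```python
# Illustrative nonparametric comparison of time-to-leakage data (days until
# E. faecalis is detected coronally, observation capped at 90 days).
# Every value below is a hypothetical placeholder, not data from the study.
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

group_a = np.array([12, 25, 33, 90, 47, 90, 60, 21, 90, 38])   # protocol A
group_b = np.array([10, 15, 22, 29, 90, 41, 18, 26, 35, 90])   # protocol B
group_c = np.array([55, 90, 90, 73, 90, 62, 90, 81, 90, 90])   # e.g. final CHX flush

# Global comparison across all groups.
h_stat, p_kruskal = kruskal(group_a, group_b, group_c)
print(f"Kruskal-Wallis: H={h_stat:.2f}, p={p_kruskal:.4f}")

# Pairwise comparison between two groups of interest.
u_stat, p_mwu = mannwhitneyu(group_a, group_c)
print(f"Mann-Whitney (A vs C): U={u_stat:.1f}, p={p_mwu:.4f}")
```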

Relevance: 20.00%

Abstract:

The aim of this study was to evaluate the effectiveness of Reciproc for the removal of cultivable bacteria and endotoxins from root canals in comparison with multifile rotary systems. The root canals of forty human single-rooted mandibular premolars were contaminated with an Escherichia coli suspension for 21 days and randomly assigned to four groups according to the instrumentation system: GI - Reciproc (VDW); GII - Mtwo (VDW); GIII - ProTaper Universal (Dentsply Maillefer); and GIV - FKG Race™ (FKG Dentaire) (n = 10 per group). Bacterial and endotoxin samples were taken with a sterile/apyrogenic paper point before (s1) and after instrumentation (s2). Culture techniques determined the colony-forming units (CFU), and the Limulus Amebocyte Lysate assay was used for endotoxin quantification. Results were submitted to paired t-tests and ANOVA. At s1, bacteria and endotoxins were recovered from 100% of the root canals investigated (40/40). After instrumentation, all systems were associated with a highly significant reduction of the bacterial load and endotoxin levels, respectively: GI - Reciproc (99.34% and 91.69%); GII - Mtwo (99.86% and 83.11%); GIII - ProTaper (99.93% and 78.56%); and GIV - FKG Race™ (99.99% and 82.52%) (P < 0.001). No statistical differences were found amongst the instrumentation systems regarding bacteria and endotoxin removal (P > 0.01). The reciprocating single file, Reciproc, was as effective as the multifile rotary systems for the removal of bacteria and endotoxins from root canals.
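
A minimal sketch of the before/after and between-system comparisons described above: a paired t-test on log-transformed CFU counts (s1 vs s2) for one system, and a one-way ANOVA on per-canal percentage reductions across systems. All numbers are invented placeholders, not the study's data.

```python
# Illustrative before/after (s1 vs s2) comparison of bacterial load and a
# between-system comparison; every value below is an invented placeholder.
import numpy as np
from scipy.stats import ttest_rel, f_oneway

rng = np.random.default_rng(2)

# Hypothetical CFU counts for one system, 10 canals before and after shaping.
cfu_s1 = rng.uniform(1e5, 1e7, size=10)
cfu_s2 = cfu_s1 * rng.uniform(1e-4, 1e-2, size=10)    # large reduction after shaping

# Paired t-test on log10-transformed counts (CFU data are heavily skewed).
t_stat, p_paired = ttest_rel(np.log10(cfu_s1), np.log10(cfu_s2))
print(f"paired t-test (log10 CFU, s1 vs s2): t={t_stat:.2f}, p={p_paired:.2e}")

# Hypothetical percentage reductions per canal for four instrumentation systems.
reductions = [100 * (1 - rng.uniform(1e-4, 1e-2, size=10)) for _ in range(4)]
f_stat, p_anova = f_oneway(*reductions)
print(f"one-way ANOVA across systems: F={f_stat:.2f}, p={p_anova:.3f}")
```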