810 results for Technology acceptance model


Relevance: 30.00%

Abstract:

In the field of vehicle dynamics, commercial software can aid the designer during the conceptual and detailed design phases. Simulations using these tools can quickly provide specific design metrics, such as yaw and lateral velocity, for standard maneuvers. However, it remains challenging to correlate these metrics with empirical quantities that depend on many external parameters and design specifications. This is the case for tire wear, which depends on the frictional work developed by the tire-road contact. In this study, an approach is proposed to estimate the tire-road friction during steady-state longitudinal and cornering maneuvers. Using this approach, a qualitative formula for tire wear evaluation is developed, and conceptual design analyses of cornering maneuvers are performed using simplified vehicle models. The influence of design parameters such as cornering stiffness, the distance between the axles, and, for vehicles with two steering axles, the steer angle ratio between the steering axles is evaluated. The proposed methodology allows the designer to predict tire wear with simplified vehicle models during the conceptual design phase.
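
As a rough illustration of the kind of estimate involved, the sketch below computes steady-state axle lateral forces with a linear bicycle model and uses frictional work per unit distance (lateral force times slip angle) as a qualitative wear proxy. All names, parameter values and the wear proxy itself are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def steady_state_axle_forces(m, a, b, Cf, Cr, vx, delta):
    """Front/rear axle lateral forces in a steady-state corner (linear bicycle model)."""
    L = a + b
    K = m / L * (b / Cf - a / Cr)        # understeer gradient
    R = (L + K * vx**2) / delta          # turn radius from delta = L/R + K*vx^2/R
    ay = vx**2 / R                       # lateral acceleration
    Fyf = m * ay * b / L                 # front axle lateral force (moment balance)
    Fyr = m * ay * a / L                 # rear axle lateral force
    return Fyf, Fyr

def wear_proxy(Fy, C):
    """Frictional work per metre ~ Fy * slip angle = Fy**2 / C for a linear tire."""
    return Fy**2 / C

m, a, b = 1500.0, 1.2, 1.5               # mass [kg], CG-to-axle distances [m] (assumed)
Cf = Cr = 8.0e4                          # cornering stiffness [N/rad] (assumed)
Fyf, Fyr = steady_state_axle_forces(m, a, b, Cf, Cr, vx=20.0, delta=0.03)
print(f"front wear proxy: {wear_proxy(Fyf, Cf):.1f} J/m, rear: {wear_proxy(Fyr, Cr):.1f} J/m")
```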

Relevance: 30.00%

Abstract:

Borges JB, Suarez-Sipmann F, Bohm SH, Tusman G, Melo A, Maripuu E, Sandstrom M, Park M, Costa EL, Hedenstierna G, Amato M. Regional lung perfusion estimated by electrical impedance tomography in a piglet model of lung collapse. J Appl Physiol 112: 225-236, 2012. First published September 29, 2011; doi: 10.1152/japplphysiol.01090.2010.-The assessment of the regional match between alveolar ventilation and perfusion in critically ill patients requires simultaneous measurements of both parameters. Ideally, assessment of lung perfusion should be performed in real-time with an imaging technology that provides, through fast acquisition of sequential images, information about the regional dynamics or regional kinetics of an appropriate tracer. We present a novel electrical impedance tomography (EIT)-based method that quantitatively estimates regional lung perfusion based on first-pass kinetics of a bolus of hypertonic saline contrast. Pulmonary blood flow was measured in six piglets during control and unilateral or bilateral lung collapse conditions. The first-pass kinetics method showed good agreement with the estimates obtained by single-photon-emission computerized tomography (SPECT). The mean difference (SPECT minus EIT) between fractional blood flow to lung areas suffering atelectasis was -0.6%, with a SD of 2.9%. This method outperformed the estimates of lung perfusion based on impedance pulsatility. In conclusion, we describe a novel method based on EIT for estimating regional lung perfusion at the bedside. In both healthy and injured lung conditions, the distribution of pulmonary blood flow as assessed by EIT agreed well with the one obtained by SPECT. The method proposed in this study has the potential to contribute to a better understanding of the behavior of regional perfusion under different lung and therapeutic conditions.
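
The agreement figures quoted above (bias of -0.6%, SD of 2.9%) are Bland-Altman style statistics; below is a minimal sketch of that computation, using made-up placeholder arrays rather than the study's data.

```python
import numpy as np

spect = np.array([0.12, 0.08, 0.15, 0.10, 0.09, 0.11])  # fractional flow, SPECT
eit   = np.array([0.13, 0.09, 0.14, 0.11, 0.10, 0.11])  # fractional flow, EIT

diff = (spect - eit) * 100            # SPECT minus EIT, in percentage points
bias, sd = diff.mean(), diff.std(ddof=1)
print(f"bias = {bias:.2f}%, SD = {sd:.2f}%")
# 95% limits of agreement: bias +/- 1.96 * SD
print(f"limits of agreement: [{bias - 1.96 * sd:.2f}%, {bias + 1.96 * sd:.2f}%]")
```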

Relevance: 30.00%

Abstract:

Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness and farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the shape of the above-ground dry matter curve over time as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution of the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity: 10,604 kg ha^-1 was found to be the most likely grain productivity, very similar to the productivity simulated by the deterministic model and to the real conditions observed in a field experiment.
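
A minimal Monte Carlo sketch of the stochastic structure described above (triangular harvest index, bivariate normal radiation/temperature); the biomass function and all parameter values are placeholders, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Harvest index ~ Triangular(min, mode, max) (values assumed)
hi = rng.triangular(0.40, 0.48, 0.55, size=n)

# (radiation [MJ m-2 d-1], temperature [C]) ~ bivariate normal (values assumed)
mean = [18.0, 24.0]
cov = [[4.0, 1.2],
       [1.2, 2.5]]
rad, temp = rng.multivariate_normal(mean, cov, size=n).T

# Placeholder biomass model: radiation-use efficiency with a temperature factor
rue = 1.2                                        # g MJ-1, assumed
season_days = 120
f_temp = np.clip(1 - np.abs(temp - 25.0) / 15.0, 0, 1)
biomass = rue * rad * f_temp * season_days * 10  # g m-2 -> kg ha-1

yield_kg_ha = biomass * hi
print(f"median simulated yield ~ {np.median(yield_kg_ha):.0f} kg ha^-1")
```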

Relevance: 30.00%

Abstract:

Ni catalysts supported on gamma-Al2O3 modified by Rh and La were prepared and evaluated in the reforming of a model biogas. The catalysts were characterized by EDS, XRD, TPR, XANES and surface area estimation (BET). The results showed that in the original Ni catalyst, the Ni interacted strongly with the alumina support, exhibiting high reduction temperatures in TPR tests. In the catalytic tests, the addition of Rh to Ni catalysts improved CH4 conversion but also increased carbon deposition, possibly by causing the segregation of Ni species under the reaction conditions. The presence of La on Ni catalysts reduced carbon deposition by favoring the gasification of carbon species. Addition of synthetic air to the process improved CH4 conversion and also decreased carbon formation. The Ni, Rh-NiLa, and Rh catalysts showed good results in the conversion of model sulfur-free biogas, which suggests that they are promising catalysts to be tested in the conversion of real biogas. (C) 2012 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

Background: Magnetic hyperthermia is currently a clinical therapy approved in the European Union for treatment of tumor cells, and uses magnetic nanoparticles (MNPs) under time-varying magnetic fields (TVMFs). The same basic principle seems promising against trypanosomatids causing Chagas disease and sleeping sickness, given that the therapeutic drugs available have severe side effects and that there are drug-resistant strains. However, no applications of this strategy against protozoan-induced diseases have been reported so far. In the present study, Crithidia fasciculata, a widely used model for therapeutic strategies against pathogenic trypanosomatids, was targeted with Fe3O4 MNPs in order to provoke cell death remotely using TVMFs. Methods: Iron oxide MNPs with average diameters of approximately 30 nm were synthesized by precipitation of FeSO4 in basic medium. The MNPs were added to C. fasciculata choanomastigotes in the exponential phase and incubated overnight, removing excess MNPs using a DEAE-cellulose resin column. The amount of MNPs taken up per cell was determined by magnetic measurement. The cells bearing MNPs were subjected to TVMFs using a homemade AC field applicator (f = 249 kHz, H = 13 kA/m), and the temperature variation during the experiments was measured. Scanning electron microscopy was used to assess morphological changes after the TVMF experiments. Cell viability was analyzed using an MTT colorimetric assay and flow cytometry. Results: MNPs were incorporated into the cells, with no noticeable cytotoxicity. When a TVMF was applied to cells bearing MNPs, massive cell death was induced via a nonapoptotic mechanism. No effects were observed by applying TVMF to control cells not loaded with MNPs. No macroscopic rise in temperature was observed in the extracellular medium during the experiments. Conclusion: As a proof of principle, these data indicate that intracellular hyperthermia is a suitable technology to induce death of protozoan parasites bearing MNPs. These findings expand the possibilities for new therapeutic strategies combating parasitic infection.

Relevance: 30.00%

Abstract:

We discuss a new interacting model for the cosmological dark sector in which the attenuated dilution of cold dark matter scales as a^-3 f(a), where f(a) is an arbitrary function of the cosmic scale factor a. From thermodynamic arguments, we show that f(a) is proportional to the entropy source of the particle creation process. In order to investigate the cosmological consequences of this kind of interacting model, we expand f(a) in a power series, and viable cosmological solutions are obtained. Finally, we use current observational data to place constraints on the interacting function f(a).
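
A hedged sketch of the setup, with notation assumed rather than taken from the paper: the dilution law, the creation source term it implies in the continuity equation, and an illustrative power-series form for f(a).

```latex
% Dilution law (notation assumed): standard a^{-3} scaling attenuated by f(a)
\rho_{\rm dm}(a) = \rho_{\rm dm,0}\, a^{-3} f(a), \qquad f(1) = 1 .

% Differentiating and using H = \dot a / a gives a continuity equation with a
% particle-creation source term \Gamma:
\dot{\rho}_{\rm dm} + 3 H \rho_{\rm dm}
   = \rho_{\rm dm}\, H\, \frac{a f'(a)}{f(a)} \equiv \Gamma\, \rho_{\rm dm} .

% An illustrative power-series ansatz of the kind mentioned above:
f(a) = 1 + \sum_{n \ge 1} c_n\,(1 - a)^n .
```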

Relevance: 30.00%

Abstract:

Multivariate analyses of UV-Vis spectral data from cachaça wood extracts provide a simple and robust model to classify aged Brazilian cachaças according to the wood species used in the maturation barrels. The model is based on the inspection of 93 extracts of oak and different Brazilian wood species, prepared using a non-aged cachaça as the extraction solvent. Application of PCA (Principal Component Analysis) and HCA (Hierarchical Cluster Analysis) leads to the identification of 6 clusters of cachaça wood extracts (amburana, amendoim, balsamo, castanheira, jatoba, and oak). LDA (Linear Discriminant Analysis) affords classification of 10 different wood species used in the cachaça extracts (amburana, amendoim, balsamo, cabreuva-parda, canela-sassafras, castanheira, jatoba, jequitiba-rosa, louro-canela, and oak) with an accuracy ranging from 80% (amendoim and castanheira) to 100% (balsamo and jequitiba-rosa). The methodology provides a low-cost alternative to methods based on liquid chromatography and mass spectrometry for classifying cachaças aged in barrels composed of different wood species.
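
A minimal sketch of this chemometric pipeline (PCA for exploration, LDA with cross-validated accuracy for classification); the spectra and labels below are synthetic placeholders, not the 93 real extracts.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths, n_species = 93, 120, 10

# Synthetic "spectra": one mean spectrum per wood species plus noise
y = np.tile(np.arange(n_species), 10)[:n_samples]
species_means = rng.normal(size=(n_species, n_wavelengths))
X = species_means[y] + rng.normal(scale=0.5, size=(n_samples, n_wavelengths))

# Exploratory step: project the spectra onto the first principal components
scores = PCA(n_components=2).fit_transform(X)

# Supervised step: LDA with cross-validated classification accuracy
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"LDA accuracy: {acc.mean():.2f} +/- {acc.std():.2f}")
```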

Relevance: 30.00%

Abstract:

The uplift capacity of helical anchors normally increases with the number of helical plates. The rate of capacity gain is variable, considering that the disturbance caused by the anchor installation is generally more pronounced in the soil mass above the upper plates than above the lower plates, because the upper soil layers are penetrated more times. The present investigation examines the effect of the number of helices on the performance of helical anchors in sand, based on the results of centrifuge model tests. Uplift loading tests were performed on 12 different types of piles installed in two containers of dry sand prepared with different densities. The measured fractions of the uplift capacity related to each individual helical plate of multi-helix anchors were compared with the fractions predicted by the individual bearing method. The results of this investigation indicate that in double- and triple-helix anchors, the contributions of the second and third plate to the total anchor uplift capacity decreased with the increase of sand relative density and plate diameter. In addition, these experiments demonstrated that the variation of the anchor load-displacement behavior with the number of helices also depends on these parameters.
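
For reference, a hedged sketch of the individual bearing method mentioned above, with symbols assumed rather than taken from the paper.

```latex
% Individual bearing method (notation assumed): total uplift capacity as the
% sum of the bearing contributions of the individual helical plates,
Q_u \;=\; \sum_{i=1}^{n} A_i \, q_{u,i},
\qquad
q_{u,i} \;=\; F_q \, \gamma' \, H_i ,
% where A_i is the projected area of helix i, H_i its embedment depth,
% \gamma' the effective unit weight of the sand, and F_q a breakout factor
% depending on relative density and relative embedment H_i/D_i.
```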

Relevance: 30.00%

Abstract:

Aspects related to users' cooperative work are not considered in the traditional approach of software engineering, since the user is viewed independently of his/her workplace environment or group, with the individual model generalized to the study of the collective behavior of all users. This work proposes a software requirements process to address issues involving cooperative work in information systems that provide distributed coordination of users' actions, where communication among users occurs indirectly through the data entered while using the software. To achieve this goal, this research uses concepts from ergonomics, the 3C cooperation model, awareness and software engineering. Action-research is used as the research methodology, applied in three cycles during the development of a corporate workflow system in a technological research company. This article discusses the third cycle, which corresponds to the process of refining the cooperative work requirements with the software in actual use in the workplace, where the introduction of a computer system changes the users' workplace from face-to-face interaction to interaction mediated by the software. The results showed that a higher degree of user awareness of their own activities and of other system users contributes to a decrease in errors and in inappropriate use of the system.

Relevance: 30.00%

Abstract:

Cutting tools with the highest wear resistance are those manufactured by the powder metallurgy process, which combines the development of materials and design properties with the features of shape-making technology and sintering. The annual global market for cutting tools consumes about US$ 12 billion; therefore, any research that improves tool designs and machining process techniques adds value or reduces costs. The aim is to describe the Spark Plasma Sintering (SPS) of cutting tools in functionally graded materials, to show the suitability of this structure design through a thermal residual stress model and, lastly, to present two kinds of inserts. For this, three cutting tool materials were used (Al2O3-ZrO2, Al2O3-TiC and WC-Co). The samples were sintered by SPS at 1300 °C and 70 MPa. The results showed that mechanical and thermal displacements may be separated during thermal treatment for analysis. Moreover, the absence of cracks indicated coherence between the experimental results and the predicted residual stresses.
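
As a first-order illustration of the residual-stress reasoning (not the paper's model), the sketch below applies the standard biaxial thermal-mismatch estimate, sigma = E * delta_alpha * delta_T / (1 - nu), between two adjacent layers; the material values are rough literature figures assumed for illustration only.

```python
# Hedged first-order estimate of thermal residual stress between bonded layers
# on cooling from the sintering temperature. All values are assumptions.
E = 390e9            # Pa, Young's modulus of the alumina-based layer (assumed)
nu = 0.24            # Poisson's ratio (assumed)
alpha_1 = 8.0e-6     # 1/K, CTE of an Al2O3-ZrO2 layer (assumed)
alpha_2 = 5.5e-6     # 1/K, CTE of a WC-Co layer (assumed)
delta_T = 1300 - 25  # K, cooling from sintering to room temperature

sigma = E * (alpha_1 - alpha_2) * delta_T / (1 - nu)
print(f"residual stress estimate: {sigma / 1e6:.0f} MPa")
```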

Relevance: 30.00%

Abstract:

Introduction

1.1 Occurrence of polycyclic aromatic hydrocarbons (PAHs) in the environment

Worldwide industrial and agricultural developments have released a large number of natural and synthetic hazardous compounds into the environment due to careless waste disposal, illegal waste dumping and accidental spills. As a result, there are numerous sites in the world that require cleanup of soils and groundwater. Polycyclic aromatic hydrocarbons (PAHs) are one of the major groups of these contaminants (Da Silva et al., 2003). PAHs constitute a diverse class of organic compounds consisting of two or more aromatic rings with various structural configurations (Prabhu and Phale, 2003). Being derivatives of benzene, PAHs are thermodynamically stable. In addition, these chemicals tend to adhere to particle surfaces, such as soils, because of their low water solubility and strong hydrophobicity, which results in greater persistence under natural conditions. This persistence, coupled with their potential carcinogenicity, makes PAHs problematic environmental contaminants (Cerniglia, 1992; Sutherland, 1992). PAHs are widely found in high concentrations at many industrial sites, particularly those associated with the petroleum, gas production and wood preserving industries (Wilson and Jones, 1993).

1.2 Remediation technologies

Conventional techniques used for the remediation of soil polluted with organic contaminants include excavation of the contaminated soil and disposal to a landfill, or capping (containment) of the contaminated areas of a site. These methods have some drawbacks. The first method simply moves the contamination elsewhere and may create significant risks in the excavation, handling and transport of hazardous material. Additionally, it is very difficult and increasingly expensive to find new landfill sites for the final disposal of the material. The cap and containment method is only an interim solution, since the contamination remains on site, requiring monitoring and maintenance of the isolation barriers long into the future, with all the associated costs and potential liability. A better approach than these traditional methods is to completely destroy the pollutants, if possible, or to transform them into harmless substances. Some technologies that have been used are high-temperature incineration and various types of chemical decomposition (for example, base-catalyzed dechlorination and UV oxidation). However, these methods have significant disadvantages, principally their technological complexity, high cost, and lack of public acceptance. Bioremediation, in contrast, is a promising option for the complete removal and destruction of contaminants.

1.3 Bioremediation of PAH-contaminated soil and groundwater

Bioremediation is the use of living organisms, primarily microorganisms, to degrade or detoxify hazardous wastes into harmless substances such as carbon dioxide, water and cell biomass. Most PAHs are biodegradable under natural conditions (Da Silva et al., 2003; Meysami and Baheri, 2003), and bioremediation for cleanup of PAH wastes has been extensively studied at both laboratory and commercial levels. It has been implemented at a number of contaminated sites, including the cleanup of the Exxon Valdez oil spill in Prince William Sound, Alaska in 1989, the Mega Borg spill off the Texas coast in 1990 and the Burgan Oil Field, Kuwait in 1994 (Purwaningsih, 2002). Different strategies for PAH bioremediation, such as in situ, ex situ or on-site bioremediation, were developed in recent years.
In situ bioremediation is a technique applied to soil and groundwater at the site, without removing the contaminated soil or groundwater, based on providing optimum conditions for microbiological contaminant breakdown. Ex situ bioremediation of PAHs, on the other hand, is applied to soil and groundwater that have been removed from the site via excavation (soil) or pumping (water); hazardous contaminants are converted in controlled bioreactors into harmless compounds in an efficient manner.

1.4 Bioavailability of PAHs in the subsurface

Frequently, PAH contamination in the environment occurs as contaminants sorbed onto soil particles rather than as a free phase (NAPL, non-aqueous phase liquids). It is known that the biodegradation rate of most PAHs sorbed onto soil is far lower than the rates measured in solution cultures of microorganisms with pure solid pollutants (Alexander and Scow, 1989; Hamaker, 1972). It is generally believed that only the fraction of PAHs dissolved in solution can be metabolized by microorganisms in soil. The amount of contaminant that can be readily taken up and degraded by microorganisms is defined as bioavailability (Bosma et al., 1997; Maier, 2000). Two phenomena have been suggested to cause the low bioavailability of PAHs in soil (Danielsson, 2000). The first is strong adsorption of the contaminants to the soil constituents, which leads to very slow release rates of contaminants into the aqueous phase. Sorption is often well correlated with soil organic matter content (Means, 1980) and significantly reduces biodegradation (Manilal and Alexander, 1991). The second phenomenon is slow mass transfer of pollutants, such as pore diffusion within soil aggregates or diffusion in the soil organic matter. The complex set of these physical, chemical and biological processes is schematically illustrated in Figure 1: biodegradation takes place in the soil solution, while diffusion occurs in the narrow pores in and between soil aggregates (Danielsson, 2000). Seemingly contradictory studies can be found in the literature indicating that the rate and final extent of metabolism may be either lower or higher for PAHs sorbed onto soil than for pure PAHs (Van Loosdrecht et al., 1990). These contrasting results demonstrate that the bioavailability of organic contaminants sorbed onto soil is far from well understood. Besides bioavailability, several other factors influence the rate and extent of biodegradation of PAHs in soil, including microbial population characteristics, physical and chemical properties of PAHs, and environmental factors (temperature, moisture, pH, degree of contamination).

Figure 1: Schematic diagram showing possible rate-limiting processes during bioremediation of hydrophobic organic contaminants in a contaminated soil-water system (not to scale) (Danielsson, 2000).

1.5 Increasing the bioavailability of PAHs in soil

Attempts to improve the biodegradation of PAHs in soil by increasing their bioavailability include the use of surfactants, solvents or solubility enhancers. However, the introduction of a synthetic surfactant may add one more pollutant (Wang and Brusseau, 1993). A study conducted by Mulder et al. showed that the introduction of hydroxypropyl-β-cyclodextrin (HPCD), a well-known PAH solubility enhancer, significantly increased the solubilization of PAHs although it did not improve their biodegradation rate (Mulder et al., 1998), indicating that further research is required to develop a feasible and efficient remediation method. Enhancing the extent of PAH mass transfer from the soil phase to the liquid phase might prove an efficient and environmentally low-risk way of addressing the problem of slow PAH biodegradation in soil.
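
A minimal sketch of the rate-limitation picture described in section 1.4: a two-compartment model in which first-order desorption feeds a dissolved pool that is degraded by first-order biodegradation, so slow mass transfer bounds the overall removal rate. Rate constants are illustrative assumptions, not measured values.

```python
import numpy as np
from scipy.integrate import solve_ivp

k_des = 0.05   # 1/day, desorption (mass transfer) rate, assumed
k_bio = 1.0    # 1/day, biodegradation rate of dissolved PAH, assumed

def rhs(t, y):
    sorbed, dissolved = y
    desorb = k_des * sorbed
    degrade = k_bio * dissolved
    return [-desorb, desorb - degrade]

sol = solve_ivp(rhs, (0, 100), [1.0, 0.0], dense_output=True)
t = np.linspace(0, 100, 5)
sorbed, dissolved = sol.sol(t)
removed = 1.0 - sorbed - dissolved
for ti, ri in zip(t, removed):
    print(f"day {ti:5.1f}: {ri * 100:5.1f}% degraded")
# With k_des << k_bio, overall removal is controlled by desorption: the
# bioavailability limitation discussed in the text.
```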

Relevance: 30.00%

Abstract:

Nowadays licensing practices have increased in importance and relevance, driving the widespread diffusion of markets for technologies. Firms are shifting from a tactical to a strategic attitude towards licensing, addressing both business- and corporate-level objectives. The Open Innovation Paradigm has been embraced. Firms rely more and more on collaboration and external sourcing of knowledge. This new model of innovation requires firms to leverage external technologies to unlock the potential of their internal innovative efforts. In this context, firms' competitive advantage depends both on their ability to recognize available opportunities inside and outside their boundaries and on their readiness to exploit them in order to fuel their innovation process dynamically. Licensing is one of the ways available to firms to reap the advantages associated with an open attitude in technology strategy. From the licensee's point of view this implies challenging the so-called not-invented-here syndrome, affecting the more traditional firms that emphasize the myth of internal research and development supremacy. This also entails understanding the so-called cognitive constraints affecting the perfect functioning of markets for technologies, which are associated with the costs of assimilation, integration and exploitation of external knowledge by recipient firms. My thesis aims at shedding light on new, interesting issues associated with in-licensing activities that have been neglected by the literature on licensing and markets for technologies. The reason for this gap is the “perspective bias” affecting the works within this stream of research. With very few notable exceptions, they have been generally concerned with the investigation of the so-called licensing dilemma of the licensor (whether to license out or to internally exploit the in-house developed technologies) while neglecting the licensee's perspective. In my opinion, this has left room for improving the understanding of the determinants and conditions affecting licensing-in practices. From the licensee's viewpoint, the licensing strategy deals with the search, integration, assimilation and exploitation of external technologies. As such it lies at the very heart of a firm's technology strategy. Improving our understanding of this strategy is thus required to assess the full implications of in-licensing decisions as they shape firms' innovation patterns and the evolution of their technological capabilities. It also allows for understanding the so-called cognitive constraints associated with the not-invented-here syndrome. In recognition of that, the aim of my work is to contribute to the theoretical and empirical literature explaining the determinants of the licensee's behavior, by providing a comprehensive theoretical framework as well as ad-hoc conceptual tools to understand and overcome frictions and to ease the achievement of satisfactory technology transfer agreements in the marketplace. Aiming at this, I investigate licensing-in in three different fashions, developed in three research papers. In the first work, I investigate the links between licensing and the patterns of firms' technological search diversification according to the frameworks of the Search literature, Resource-Based Theory and the theory of general purpose technologies. In the second paper, which continues where the first one left off, I analyze the new concept of learning-by-licensing, in terms of the development of new knowledge inside the licensee firms (e.g. new patents) some years after the acquisition of the license, according to the Dynamic Capabilities perspective. Finally, in the third study, I deal with the determinants of the remuneration structure of patent licenses (form and amount), and in particular with the role of the upfront fee from the licensee's perspective. Aiming at this, I combine the insights of two theoretical approaches: agency theory and real options theory.

Relevance: 30.00%

Abstract:

The presented study carried out an analysis of rural landscape changes. In particular, the study focuses on understanding the driving forces acting on the rural built environment using a statistical spatial model implemented through GIS techniques. It is well known that the study of landscape changes is essential for conscious decision making in land planning. A bibliographic review revealed a general lack of studies dealing with the modeling of the rural built environment, and hence a theoretical modelling approach for such a purpose is needed. Advances in technology and modernity in building construction and agriculture have gradually changed the rural built environment. In addition, the phenomenon of urbanization determined the construction of new volumes beside abandoned or derelict rural buildings. Consequently, two types of transformation dynamics mainly affecting the rural built environment can be observed: the conversion of rural buildings and the increase in building numbers. The specific aim of the presented study is to propose a methodology for the development of a spatial model that allows the identification of driving forces that acted on the behaviour of building allocation. In fact, one of the most concerning dynamics nowadays is related to an irrational expansion of building sprawl across the landscape. The proposed methodology is composed of conceptual steps that cover different aspects related to the development of a spatial model: the selection of a response variable that best describes the phenomenon under study, the identification of possible driving forces, the sampling methodology concerning the collection of data, the most suitable algorithm to be adopted in relation to the statistical theory and method used, and the calibration process and evaluation of the model. A different combination of factors in various parts of the territory generated favourable or less favourable conditions for building allocation, and the existence of buildings represents the evidence of such an optimum. Conversely, the absence of buildings expresses a combination of agents which is not suitable for building allocation. Presence or absence of buildings can be adopted as indicators of such driving conditions, since they represent the expression of the action of driving forces in the land suitability sorting process. The existence of a correlation between site selection and hypothetical driving forces, evaluated by means of modeling techniques, provides evidence of which driving forces are involved in the allocation dynamic and an insight into their level of influence on the process. GIS software, by means of spatial analysis tools, allows the concept of presence and absence to be associated with point features, generating a point process. Presence or absence of buildings at site locations represents the expression of these driving factors' interaction. In the case of presences, points represent locations of real existing buildings; conversely, absences represent locations where buildings are not present, and so they are generated by a stochastic mechanism. Possible driving forces are selected and the existence of a causal relationship with building allocation is assessed through a spatial model. The adoption of empirical statistical models provides a mechanism for the explanatory variable analysis and for the identification of key driving variables behind the site selection process for new building allocation.
The model developed by following the methodology is applied to a case study to test the validity of the methodology. In particular, the study area for testing the methodology is the New District of Imola, characterized by a prevailing agricultural production vocation and where transformation dynamics intensively occurred. The development of the model involved the identification of predictive variables (related to the geomorphologic, socio-economic, structural and infrastructural systems of the landscape) capable of representing the driving forces responsible for landscape changes. The calibration of the model is carried out on spatial data regarding the periurban and rural area of the study area within the 1975-2005 time period by means of a generalised linear model. The resulting output of the model fit is a continuous grid surface whose cells assume probability values, ranging from 0 to 1, of building occurrence across the rural and periurban area of the study area. Hence the response variable assesses the changes in the rural built environment that occurred in this time interval and is correlated to the selected explanatory variables by means of a generalized linear model using logistic regression. By comparing the probability map obtained from the model with the actual rural building distribution in 2005, the interpretation capability of the model can be evaluated. The proposed model can also be applied to the interpretation of trends which occurred in other study areas, and also to different time intervals, depending on the availability of data. The use of suitable data in terms of time, information and spatial resolution, and the costs related to data acquisition, pre-processing and survey, are among the most critical aspects of model implementation. Future in-depth studies can focus on using the proposed model to predict short/medium-range future scenarios for the rural built environment distribution in the study area. In order to predict future scenarios it is necessary to assume that the driving forces do not change and that their levels of influence within the model are not far from those assessed for the time interval used for the calibration.
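
A minimal sketch of the presence/absence modelling step (a binomial GLM, i.e. logistic regression, linking building occurrence to candidate driving forces); the data below are synthetic placeholders, not the Imola case-study dataset.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Candidate driving-force variables at sampled locations (assumed examples)
slope = rng.uniform(0, 30, n)           # terrain slope [%]
dist_road = rng.uniform(0, 5000, n)     # distance to nearest road [m]
X = sm.add_constant(np.column_stack([slope, dist_road]))

# Synthetic presence/absence: occurrence favoured by gentle slopes, near roads
logit_p = 1.0 - 0.08 * slope - 0.0008 * dist_road
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(float)

fit = sm.GLM(y, X, family=sm.families.Binomial()).fit()
print(fit.params)   # fitted coefficients reflect the assumed driving forces
# fit.predict(X_grid) would then give the 0-1 probability surface that is
# compared with the observed building distribution.
```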

Relevance: 30.00%

Abstract:

In the last few years the resolution of numerical weather prediction (NWP) models has become higher and higher with the progress of technology and knowledge. As a consequence, a great amount of initial data has become fundamental for a correct initialization of the models. The potential of radar observations for improving the initial conditions of high-resolution NWP models has long been recognized, and operational application is becoming more frequent. The fact that many NWP centres have recently put convection-permitting forecast models into operation, many of which assimilate radar data, emphasizes the need for an approach to providing the quality information required to prevent radar errors from degrading the model's initial conditions and, therefore, its forecasts. Environmental risks can be related to various causes: meteorological, seismic, hydrological/hydraulic. Flash floods have a horizontal dimension of 1-20 km and belong to the meso-gamma subscale; this scale can be modelled only with NWP models of the highest resolution, such as the COSMO-2 model. One of the problems of modelling extreme convective events relates to the atmospheric initial conditions: the scale at which atmospheric conditions are assimilated into a high-resolution model is about 10 km, a value too high for a correct representation of the initial conditions of convection. Assimilation of radar data, with its resolution on the order of a kilometre every 5 or 10 minutes, can be a solution to this problem. In this contribution a pragmatic and empirical approach to deriving a radar data quality description is proposed, to be used in radar data assimilation and more specifically for the latent heat nudging (LHN) scheme. The convective capabilities of the COSMO-2 model are then investigated through some case studies. Finally, this work shows some preliminary experiments on coupling a high-resolution meteorological model with a hydrological one.
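
A hedged sketch of the latent heat nudging idea: model latent-heating profiles in precipitating columns are scaled by the ratio of radar-observed to modelled precipitation rate, with the scaling bounded to avoid extreme increments. Details such as the limits are assumptions, not COSMO's exact implementation.

```python
import numpy as np

def lhn_increment(lh_profile, rr_model, rr_obs, limits=(0.33, 3.0)):
    """Latent-heating increment for one column: scale the model heating by the
    (bounded) ratio of observed to modelled precipitation rate."""
    eps = 1e-6
    alpha = float(np.clip(rr_obs / max(rr_model, eps), *limits))
    return (alpha - 1.0) * lh_profile

lh = np.array([0.0, 0.5e-3, 1.2e-3, 0.8e-3, 0.1e-3])   # model heating [K/s], assumed
incr = lhn_increment(lh, rr_model=1.0, rr_obs=2.5)     # rain rates in mm/h
print(incr)   # positive increments nudge the model toward the radar-observed rain
```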

Relevance: 30.00%

Abstract:

During the last decade peach and nectarine fruit have lost considerable market share, due to increased consumer dissatisfaction with quality at retail markets. This is mainly due to the harvesting of too-immature fruit and to high ripening heterogeneity. The main problem is that the traditionally used maturity indices are not able to objectively detect the fruit maturity stage, nor the variability present in the field, leading to difficult post-harvest management of the product and to high fruit losses. To assess fruit ripening more precisely, other techniques and devices can be used. Recently, a new non-destructive maturity index based on vis-NIR technology, the Index of Absorbance Difference (IAD), which correlates with fruit degreening and ethylene production, was introduced, and the IAD was used to study peach and nectarine fruit ripening from the “field to the fork”. In order to choose the best techniques to improve fruit quality, a detailed description of the tree structure, of fruit distribution and of ripening evolution on the tree was undertaken. More in detail, an architectural model (PlantToon®) was used to describe the tree structure and the IAD was applied to characterize the maturity stage of each fruit. Their combined use provided an objective and precise evaluation of fruit ripening variability, related to different training systems, crop load, fruit exposure and internal temperature. Based on simple field assessments of fruit maturity (as IAD) and growth, a model for early prediction of harvest date and yield was developed and validated. The relationship between the non-destructive maturity index IAD and fruit shelf-life was also confirmed. Finally, the obtained results were validated by consumer tests: fruit sorted into different maturity classes obtained different consumer acceptance. The improved knowledge led to an innovative management of peach and nectarine fruit, from “field to market”.
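
For reference, a minimal sketch of the IAD computation: the index is commonly defined as the difference between the absorbance near the chlorophyll-a peak (~670 nm) and a reference wavelength (~720 nm), and it decreases as chlorophyll degrades during ripening. The exact wavelengths, readings and class thresholds below are assumptions for illustration; real thresholds are cultivar-specific.

```python
import numpy as np

def iad(a670, a720):
    """Index of Absorbance Difference: IAD = A(670 nm) - A(720 nm)."""
    return a670 - a720

# Illustrative readings for three fruit at increasing ripeness (made-up values)
a670 = np.array([1.85, 1.20, 0.55])
a720 = np.array([0.65, 0.60, 0.50])

# Hypothetical maturity classes, ordered from high to low IAD
classes = [(1.0, "immature"), (0.4, "ready to harvest"), (-np.inf, "ripe")]
for v in iad(a670, a720):
    label = next(name for threshold, name in classes if v >= threshold)
    print(f"IAD = {v:.2f} -> {label}")
```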