891 results for individual zones of optimal functioning model


Edible flowers are used in culinary preparations to improve the sensory and nutritional qualities of food, besides benefiting human health due to their profusion of bioactive compounds [1]. Nevertheless, edible flowers are highly perishable and must be free of insects, which is difficult because they are usually cultivated without pesticides [2]. Food irradiation is an economically viable technology to extend the shelf life of foods, improving their hygiene and quality while disinfesting insects [3]. The efficiency and safety of radiation processing (using Co-60 or electron accelerators) have been approved by legal authorities (FDA, USDA, WHO, FAO), as well as by the scientific community, based on extensive research [4]. Viola tricolor L. (heartsease), from the Violaceae family, is one of the most popular edible flowers. Apart from being used as food, it has also been applied for its medicinal properties, mainly due to its biological activity and phenolic composition [5]. Herein, the phenolic compounds were analyzed by HPLC-DAD-ESI/MS, and linear discriminant analysis (LDA) was performed to compare the results from flowers submitted to different irradiation doses and technologies (Co-60 and electron beam). Quercetin-3-O-(6-O-rhamnosylglucoside)-7-O-rhamnoside (Figure 1) was the most abundant compound, followed by quercetin-3-O-rutinoside and acetyl-quercetin-3-O-(6-O-rhamnosylglucoside)-7-O-rhamnoside. In general, irradiated samples (mostly those treated with 1 kGy) showed the highest content of phenolic compounds. The LDA outcomes indicated that differences among phenolic compounds effectively discriminate the assayed doses and technologies, defining which variables contributed most to that separation. This information might be useful to define which dose and/or technology optimizes the content of a specific phenolic compound. Overall, irradiation did not negatively affect the levels of phenolic compounds, supporting its application to extend the shelf life of V. tricolor and highlighting new commercial solutions for this functional food.
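The multivariate step described above maps onto standard LDA tooling. The sketch below is a minimal illustration with invented phenolic profiles and treatment labels; the compound values, sample counts, and class names are placeholders, not data from the study.

```python
# Minimal LDA sketch: rows are flower samples, columns are contents of three
# phenolic compounds, labels encode the irradiation treatment. All values
# below are invented placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([
    [12.1, 4.3, 1.8],   # non-irradiated
    [11.9, 4.1, 1.7],
    [14.8, 5.0, 2.2],   # 1 kGy, Co-60
    [15.1, 5.2, 2.3],
    [13.9, 4.8, 2.0],   # 1 kGy, electron beam
    [14.2, 4.7, 2.1],
])
y = ["control", "control", "co60_1kGy", "co60_1kGy", "ebeam_1kGy", "ebeam_1kGy"]

lda = LinearDiscriminantAnalysis().fit(X, y)

print(lda.transform(X))                   # discriminant scores: how treatments separate
print(lda.scalings_)                      # which compounds drive the separation
print(lda.predict([[14.0, 4.9, 2.1]]))    # classify a new sample
```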

The complex singlet extension of the Standard Model (CxSM) is the simplest extension that provides scenarios for Higgs pair production with different masses. The model has two interesting phases: the dark matter phase, with a Standard Model-like Higgs boson, a new scalar, and a dark matter candidate; and the broken phase, in which all three neutral scalars mix. In the latter phase, decays of a Higgs boson into a pair of different Higgs bosons are possible. In this study we analyse Higgs-to-Higgs decays in the framework of singlet extensions of the Standard Model (SM), with a focus on the CxSM. After demonstrating that scenarios with large rates for such chain decays are possible, we perform a comparison between the NMSSM and the CxSM. We find that, based on Higgs-to-Higgs decays, the only possibility to distinguish the two models at LHC Run 2 is through final states with two different scalars. This conclusion builds a strong case for searches for final states with two different scalars at LHC Run 2. Finally, we propose a set of benchmark points for the real and complex singlet extensions to be tested at LHC Run 2. They have been chosen such that the discovery prospects of the involved scalars are maximised and the dark matter constraints are fulfilled. Furthermore, for some of the points the theory is stable up to high energy scales. For the computation of the decay widths and branching ratios we developed the Fortran code sHDECAY, which is based on the implementation of the real and complex singlet extensions of the SM in HDECAY.
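Whatever the model, the decay widths and branching ratios produced by a code such as sHDECAY obey the generic relation BR_i = Γ_i / Σ_j Γ_j. The sketch below only illustrates that bookkeeping with invented partial widths; it is not the sHDECAY interface, which is a Fortran code.

```python
# Generic branching-ratio bookkeeping: BR_i = Gamma_i / sum_j Gamma_j.
# The partial widths are invented placeholders, not CxSM predictions.
partial_widths = {                 # in GeV
    "H3 -> H1 H1": 0.8e-3,
    "H3 -> H1 H2": 1.5e-3,         # the "two different scalars" channel discussed above
    "H3 -> b bbar": 2.0e-3,
    "H3 -> W+ W-": 3.1e-3,
}

total_width = sum(partial_widths.values())
branching_ratios = {ch: g / total_width for ch, g in partial_widths.items()}

for channel, br in sorted(branching_ratios.items(), key=lambda kv: -kv[1]):
    print(f"{channel:14s}  BR = {br:.3f}")
print(f"total width = {total_width:.2e} GeV")
```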

The main aim of this study was to determine the impact of innovation on productivity in service sector companies, especially those in the hospitality sector, that value the reduction of environmental impact as relevant to the innovation process. We used a structural analysis model based on the one developed by Crépon, Duguet, and Mairesse (1998), known as the CDM model (an acronym of the authors' surnames). These authors produced seminal studies of the relationship between innovation and productivity (see Griliches 1979; Pakes and Griliches 1980). The main advantage of the CDM model is its ability to integrate the process of innovation and business productivity from an empirical perspective.
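As a rough illustration of what integrating innovation and productivity empirically means, the sketch below runs a heavily simplified two-step, CDM-style estimation on simulated firm data; it omits the selection and endogeneity corrections of the full CDM system, and every variable and coefficient is invented.

```python
# Simplified two-step illustration of the CDM idea on simulated data:
# step 1 models the firm's innovation output, step 2 regresses productivity
# on the predicted innovation. The real CDM system also corrects for
# selection and endogeneity, which this sketch deliberately omits.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
rnd_intensity = rng.normal(size=n)        # R&D intensity (log scale)
env_focus = rng.integers(0, 2, size=n)    # 1 if the firm values reducing environmental impact

innovation = 0.6 * rnd_intensity + 0.3 * env_focus + rng.normal(scale=0.5, size=n)
productivity = 0.4 * innovation + rng.normal(scale=0.5, size=n)

# Step 1: innovation equation.
X1 = sm.add_constant(np.column_stack([rnd_intensity, env_focus]))
innov_hat = sm.OLS(innovation, X1).fit().predict(X1)

# Step 2: productivity equation using predicted innovation.
X2 = sm.add_constant(innov_hat)
print(sm.OLS(productivity, X2).fit().params)   # intercept and innovation elasticity
```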

The selection of optimal operating conditions for an industrial acrylonitrile recovery unit was conducted through the systematic application of response surface methodology, with minimum energy consumption as the objective and product specifications as process constraints. Unit models and the plant simulation were validated against plant operating data. A sensitivity analysis was carried out to identify the set of parameters that most strongly affect the trajectories of the system while keeping product specifications. The results suggest that energy savings of up to 10% are possible by systematically adjusting operating conditions.
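A minimal sketch of that workflow is shown below: fit a second-order response surface for energy consumption in two scaled operating variables and minimize it under a purity constraint. The data, variable names, and thresholds are placeholders, not values from the unit studied.

```python
# Response-surface sketch: fit a quadratic model of energy consumption and
# minimize it subject to a product-purity constraint. All numbers are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Hypothetical designed experiments: x1 = reflux ratio, x2 = feed temperature (scaled to [-1, 1]).
X = rng.uniform(-1, 1, size=(30, 2))
energy = (5.0 + 1.2 * X[:, 0] - 0.8 * X[:, 1] + 2.0 * X[:, 0] ** 2
          + 1.5 * X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 1]
          + rng.normal(scale=0.05, size=30))

def design(x1, x2):
    # Full second-order model in two variables.
    return np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])

beta, *_ = np.linalg.lstsq(design(X[:, 0], X[:, 1]), energy, rcond=None)

def predicted_energy(x):
    return (design(np.atleast_1d(x[0]), np.atleast_1d(x[1])) @ beta)[0]

def purity(x):
    # Hypothetical purity surface acting as the specification constraint.
    return 0.95 + 0.02 * x[0] - 0.01 * x[1]

res = minimize(predicted_energy, x0=[0.0, 0.0],
               bounds=[(-1, 1), (-1, 1)],
               constraints=[{"type": "ineq", "fun": lambda x: purity(x) - 0.96}])
print(res.x, predicted_energy(res.x))
```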

The Ocean Model Intercomparison Project (OMIP) is an endorsed project in the Coupled Model Intercomparison Project Phase 6 (CMIP6). OMIP addresses CMIP6 science questions, investigating the origins and consequences of systematic model biases. It does so by providing a framework for evaluating (including assessment of systematic biases), understanding, and improving ocean, sea-ice, tracer, and biogeochemical components of climate and earth system models contributing to CMIP6. Among the WCRP Grand Challenges in climate science (GCs), OMIP primarily contributes to the regional sea level change and near-term (climate/decadal) prediction GCs. OMIP provides (a) an experimental protocol for global ocean/sea-ice models run with a prescribed atmospheric forcing; and (b) a protocol for ocean diagnostics to be saved as part of CMIP6. We focus here on the physical component of OMIP, with a companion paper (Orr et al., 2016) detailing methods for the inert chemistry and interactive biogeochemistry. The physical portion of the OMIP experimental protocol follows the interannual Coordinated Ocean-ice Reference Experiments (CORE-II). Since 2009, CORE-I (Normal Year Forcing) and CORE-II (Interannual Forcing) have become the standard methods to evaluate global ocean/sea-ice simulations and to examine mechanisms for forced ocean climate variability. The OMIP diagnostic protocol is relevant for any ocean model component of CMIP6, including the DECK (Diagnostic, Evaluation and Characterization of Klima experiments), historical simulations, FAFMIP (Flux Anomaly Forced MIP), C4MIP (Coupled Carbon Cycle Climate MIP), DAMIP (Detection and Attribution MIP), DCPP (Decadal Climate Prediction Project), ScenarioMIP, HighResMIP (High Resolution MIP), as well as the ocean/sea-ice OMIP simulations.
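As one concrete example of the kind of diagnostic the protocol standardises, the sketch below computes an area-weighted global-mean sea surface temperature from CF/CMIP-style output with xarray. The file names are placeholders; 'tos' and 'areacello' are the usual CMIP variable names for SST and ocean cell area, and the horizontal dimension names depend on the model grid.

```python
# Sketch of a routine OMIP/CMIP6-style ocean diagnostic: area-weighted
# global-mean sea surface temperature. Paths and dimension names are
# placeholders and depend on the model and grid.
import xarray as xr

ds = xr.open_dataset("tos_Omon_MODEL_omip1_r1i1p1f1_gn_194801-200912.nc")       # placeholder path
area = xr.open_dataset("areacello_Ofx_MODEL_omip1_r1i1p1f1_gn.nc")["areacello"]

tos = ds["tos"]                          # sea surface temperature
weights = area.fillna(0.0)

# One value per month; ("j", "i") stands in for the model's horizontal dimensions.
global_mean_sst = tos.weighted(weights).mean(dim=("j", "i"))
print(global_mean_sst.isel(time=0).values)
```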

Part 6: Engineering and Implementation of Collaborative Networks

This paper presents a harmonised framework for sediment quality assessment and dredging material characterisation for estuaries and port zones of the North and South Atlantic. The framework, based on the weight-of-evidence approach, provides a structure and a process for conducting sediment/dredging material assessment that leads to a decision. The main structure consists of step 1 (examination of available data); step 2 (chemical characterisation and toxicity assessment); decision 1 (is any chemical level higher than reference values? are sediments toxic?); step 3 (assessment of benthic community structure); step 4 (integration of the results); decision 2 (are sediments toxic or is the benthic community impaired?); step 5 (construction of the decision matrix); and decision 3 (is there environmental risk?). The sequence of assessments may be interrupted when the information obtained is judged sufficient for a correct characterisation of the risk posed by the sediments/dredging material. The framework brings novel features compared with other sediment/dredging material risk assessment frameworks: data integration through multivariate analysis allows the identification of which samples are toxic and/or related to impaired benthic communities; it also discriminates the chemicals responsible for negative biological effects; and it dispenses with the use of a reference area. We demonstrated the successful application of this framework in different port and estuarine zones of the North (Gulf of Cadiz) and South Atlantic (Santos and Paranagua Estuarine Systems).
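Read as pseudocode, the stepwise structure above is a short decision procedure. The toy encoding below reduces the lines of evidence to booleans purely for illustration; the published framework integrates them through multivariate analysis rather than simple flags.

```python
# Toy encoding of the framework's decision flow. The three lines of evidence
# (chemistry, toxicity, benthic community) are collapsed to booleans here.
from dataclasses import dataclass

@dataclass
class SedimentSample:
    exceeds_reference_chemistry: bool   # step 2 / decision 1
    toxic: bool                         # step 2 / decision 1
    benthos_impaired: bool              # step 3 / decision 2

def assess(sample: SedimentSample) -> str:
    # Decision 1: chemistry below reference values and no toxicity -> assessment can stop.
    if not sample.exceeds_reference_chemistry and not sample.toxic:
        return "no further assessment required"
    # Decision 2: toxicity or an impaired benthic community feeds the decision matrix (step 5).
    if sample.toxic or sample.benthos_impaired:
        return "build decision matrix: environmental risk likely (decision 3)"
    return "build decision matrix: integrate results before deciding on risk"

print(assess(SedimentSample(False, False, False)))
print(assess(SedimentSample(True, True, False)))
```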

In this thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The work involved the development and implementation of numerical algorithms, data structures, and software. The numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated. The convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, as both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows exact satisfaction of boundary conditions, making it possible to capture exactly the effects of the fluid field on the solid structure.
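The last point, exact boundary conditions via distance fields, is easy to illustrate: if the approximation is built as the product of a distance-like function and a free field, it vanishes on the boundary by construction. The geometry and basis below are illustrative only, not the solution structure used in the thesis.

```python
# Why a distance field gives exact homogeneous Dirichlet conditions in a
# meshfree solution structure: u(x) = d(x) * phi(x) vanishes wherever
# d(x) = 0, for ANY choice of the free field phi.
import numpy as np

def distance_field(x, y, radius=1.0):
    # Distance-like function for the boundary of a disk of given radius.
    return radius - np.sqrt(x**2 + y**2)

def trial_solution(x, y, coeffs):
    # Free field phi expanded in a few polynomial basis functions (illustrative).
    phi = coeffs[0] + coeffs[1] * x + coeffs[2] * y + coeffs[3] * x * y
    return distance_field(x, y) * phi

# Points lying exactly on the boundary of the disk.
theta = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
xb, yb = np.cos(theta), np.sin(theta)

# Regardless of the coefficients, the structure gives u = 0 on the boundary.
print(trial_solution(xb, yb, coeffs=np.array([1.3, -0.7, 2.1, 0.4])))   # ~zeros
```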

Existing parsers for textual model representation formats such as XMI and HUTN are unforgiving and fail upon even the smallest inconsistency between the structure and naming of metamodel elements and the contents of serialised models. In this paper, we demonstrate how a fuzzy parsing approach can transparently and automatically resolve a number of these inconsistencies, and how it can eventually turn XML into a human-readable and editable textual model representation format for particular classes of models.
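One plausible ingredient of such an approach is tolerant name resolution: matching element names in the serialised model against metamodel names even when they differ slightly. The sketch below illustrates that idea with a hypothetical metamodel and a closest-match fallback; it is not the parser described in the paper.

```python
# Tolerant resolution of serialised element names against metamodel class
# names, using closest-match lookup as a fallback. Metamodel and XML content
# are hypothetical.
import difflib
import xml.etree.ElementTree as ET

METAMODEL_CLASSES = ["Library", "Book", "Author"]

def resolve(tag):
    """Return the metamodel class a tag refers to, tolerating small typos."""
    if tag in METAMODEL_CLASSES:
        return tag
    close = difflib.get_close_matches(tag, METAMODEL_CLASSES, n=1, cutoff=0.7)
    return close[0] if close else None

xml_doc = """<model>
    <Libary name="Central"/>            <!-- misspelled element name -->
    <Book title="Fuzzy Parsing"/>
</model>"""

for element in ET.fromstring(xml_doc):
    print(element.tag, "->", resolve(element.tag))
```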

The study of acoustic communication in animals often requires not only the recognition of species-specific acoustic signals but also the identification of individual subjects, all against a complex acoustic background. Moreover, when very long recordings are to be analyzed, automatic recognition and identification processes are invaluable tools to extract the relevant biological information. A pattern recognition methodology based on hidden Markov models is presented, inspired by the successful results obtained with the most widely known and complex acoustic communication signal: human speech. This methodology was applied here for the first time to the detection and recognition of fish acoustic signals, specifically in a stream of round-the-clock recordings of Lusitanian toadfish (Halobatrachus didactylus) in their natural estuarine habitat. The results show that this methodology is able not only to detect the mating sounds (boatwhistles) but also to identify individual male toadfish, reaching an identification rate of ca. 95%. Moreover, this method also proved to be a powerful tool to assess signal durations in large data sets. However, the system failed to recognize other sound types.
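A minimal sketch of this classification scheme, training one Gaussian HMM per individual and assigning a new call to the model with the highest log-likelihood, is given below using hmmlearn. The feature sequences are random placeholders standing in for acoustic features extracted from the recordings.

```python
# One Gaussian HMM per male, identification by maximum log-likelihood.
# Feature sequences are random placeholders, not real boatwhistle features.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(42)

def fake_calls(offset, n_calls=20, length=50, dim=4):
    """Random stand-in for per-call acoustic feature sequences of one fish."""
    return [offset + rng.normal(size=(length, dim)) for _ in range(n_calls)]

training = {"male_A": fake_calls(0.0), "male_B": fake_calls(1.5)}

models = {}
for fish, calls in training.items():
    X = np.concatenate(calls)
    lengths = [len(c) for c in calls]
    models[fish] = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                                   n_iter=50).fit(X, lengths)

# Identify an unseen call by the model with the highest log-likelihood.
test_call = 1.5 + rng.normal(size=(50, 4))
scores = {fish: m.score(test_call) for fish, m in models.items()}
print(max(scores, key=scores.get))   # expected: male_B
```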

Assessment processes are essential to guarantee the quality and continuous improvement of software in healthcare, as they measure software attributes over the lifecycle, verify the degree of alignment between the software and its objectives, and identify unpredicted events. This article analyses the use of an assessment model based on software metrics for three healthcare information systems from a public hospital that provides secondary and tertiary care in the region of Ribeirão Preto. Compliance with the metrics was investigated using questionnaires in guided interviews with the system analysts responsible for the applications. The outcomes indicate that most of the procedures specified in the model can be adopted to assess the systems that serve the organization, particularly regarding the attributes of compatibility, reliability, safety, portability, and usability.

As in Human Medicine, the prevalence of oncological diseases in Veterinary Medicine has increased significantly. The evolution of Veterinary Medicine in recent decades has brought changes in clinical paradigms, particularly concerning the relationship with the animal and also with the owner. More than in any other specialty, members of the Veterinary Medical Team who work in the oncology field are unavoidably required to break bad news. This paper proposes the adaptation of the ABCDE model from Human Medicine to Veterinary Medicine. The adaptation of the ABCDE model for Veterinary Medicine improves communication with the owner and offers all members of the Veterinary Medical Team better communication skills.

Introduction: Food availability and access are strongly affected by seasonality in Ethiopia. However, there are few data on seasonal variation in Infant and Young Child Feeding (IYCF) practices and malnutrition among 6–23-month-old children in different agro-ecological zones of rural Ethiopia. Methods: Socio-demographic, anthropometric and IYCF indicators were assessed in post- and pre-harvest seasons among children aged 6–23 months randomly selected from rural villages of the lowland and midland agro-ecological zones. Results: The prevalence of child stunting and underweight increased from 39.8% and 26.9% post-harvest to 46.0% and 31.8% pre-harvest, respectively. The biggest increase in the prevalence of stunting and underweight between post- and pre-harvest seasons was noted in the midland zone. Wasting decreased from 11.6% post-harvest to 8.5% pre-harvest, with the biggest decline recorded in the lowland zone. Minimum meal frequency, minimum acceptable diet and poor dietary diversity increased considerably in the pre-harvest compared with the post-harvest season in the lowland zone. Feeding practices and maternal age were predictors of wasting, while women's dietary diversity and child age were predictors of child dietary diversity in both seasons. Conclusion: There is seasonal variation in malnutrition and IYCF practices among children 6–23 months of age, with a more pronounced effect in the midland agro-ecological zone. A major contributing factor to child malnutrition may be poor feeding practices. Health information strategies focused on both IYCF practices and the dietary diversity of mothers could be a sensible approach to reduce the burden of child malnutrition in rural Ethiopia.

On most if not all evaluatively relevant dimensions such as the temperature level, taste intensity, and nutritional value of a meal, one range of adequate, positive states is framed by two ranges of inadequate, negative states, namely too much and too little. This distribution of positive and negative states in the information ecology results in a higher similarity of positive objects, people, and events to other positive stimuli as compared to the similarity of negative stimuli to other negative stimuli. In other words, there are fewer ways in which an object, a person, or an event can be positive as compared to negative. Oftentimes, there is only one way in which a stimulus can be positive (e.g., a good meal has to have an adequate temperature level, taste intensity, and nutritional value). In contrast, there are many different ways in which a stimulus can be negative (e.g., a bad meal can be too hot or too cold, too spicy or too bland, or too fat or too lean). This higher similarity of positive as compared to negative stimuli is important, as similarity greatly impacts speed and accuracy on virtually all levels of information processing, including attention, classification, categorization, judgment and decision making, and recognition and recall memory. Thus, if the difference in similarity between positive and negative stimuli is a general phenomenon, it predicts and may explain a variety of valence asymmetries in cognitive processing (e.g., positive as compared to negative stimuli are processed faster but less accurately). In my dissertation, I show that the similarity asymmetry is indeed a general phenomenon that is observed in thousands of words and pictures. Further, I show that the similarity asymmetry applies to social groups. Groups stereotyped as average on the two dimensions agency / socio-economic success (A) and conservative-progressive beliefs (B) are stereotyped as positive or high on communion (C), while groups stereotyped as extreme on A and B (e.g., managers, homeless people, punks, and religious people) are stereotyped as negative or low on C. As average groups are more similar to one another than extreme groups, according to this ABC model of group stereotypes, positive groups are mentally represented as more similar to one another than negative groups. Finally, I discuss implications of the ABC model of group stereotypes, pointing to avenues for future research on how stereotype content shapes social perception, cognition, and behavior.
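The core claim, that one adequate range flanked by two inadequate ranges yields higher within-valence similarity for positive stimuli, can be illustrated numerically. In the sketch below the ranges, dimensions, and sample sizes are arbitrary choices, not values from the dissertation.

```python
# On each dimension an adequate (positive) middle band is flanked by two
# inadequate (negative) bands ("too little", "too much"). Positive stimuli
# therefore cluster; negative stimuli spread over both tails.
import numpy as np

rng = np.random.default_rng(7)
n, dims = 200, 3        # e.g., temperature, taste intensity, nutritional value

def sample(valence):
    if valence == "positive":
        return rng.uniform(0.45, 0.55, size=(n, dims))        # narrow adequate band
    low = rng.uniform(0.00, 0.35, size=(n, dims))              # "too little"
    high = rng.uniform(0.65, 1.00, size=(n, dims))             # "too much"
    return np.where(rng.integers(0, 2, size=(n, dims)).astype(bool), high, low)

def mean_pairwise_distance(X):
    diffs = X[:, None, :] - X[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(axis=-1))
    return dist[np.triu_indices(len(X), k=1)].mean()

for valence in ("positive", "negative"):
    print(valence, round(mean_pairwise_distance(sample(valence)), 3))
# Positive stimuli show a much smaller mean pairwise distance (higher similarity).
```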