20 results for Controllability of systems
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The demand for "welfare friendly" products increases as public awareness and perception of livestock production systems grow. The public and policy-makers demand scientific information for education and to guide decision processes. This paper describes some of the last decade's contributions made by scientists in the technical, economic, and market areas of farm animal welfare. Articles on animal welfare were compiled under the following themes: 1) consumer behavior, 2) technical and economic viability, 3) public regulation, and 4) private certification policies. Most studies on the economic evaluation of systems that promote animal welfare involved species destined for export products, such as eggs, beef and pork. Few studies were found on broilers, dairy cows and fish, and no data were found regarding other species, such as horses, sheep and goats. Scientists understand that farm animal welfare is not only a matter of ethics but also an essential tool to gain and maintain markets. However, it is unfortunate that little attention is paid to species that are not economically important for export. Studies that emphasize more humane ways to raise animals and that provide economic incentives to the producer are needed. An integrated multidisciplinary approach is necessary to highlight the benefits of introducing animal welfare techniques into existing production systems.
Abstract:
The liquid-liquid equilibria of systems composed of rice bran oil, free fatty acids, ethanol and water were investigated at temperatures ranging from 10 to 60 °C. The results of the present study indicated that the mutual solubility of the compounds decreased with an increase in the water content of the solvent and a decrease in the temperature of the solution. The experimental data set was correlated by applying the UNIQUAC model. The average variance between the experimental and calculated compositions was 0.35%, indicating that the model can accurately predict the behavior of the compounds at different temperatures and degrees of hydration. The adjusted interaction parameters enable both the simulation of liquid-liquid extractors for the deacidification of vegetable oil and the prediction of phase compositions for the oil-rich and alcohol-rich phases generated during cooling of the stream exiting the extractor (when ethanol is used as the solvent).
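For readers unfamiliar with the model, the UNIQUAC activity coefficients used in the correlation above can be sketched in a few lines. The sketch below is a generic implementation of the standard UNIQUAC equations (coordination number z = 10); the r, q and tau values passed in are illustrative placeholders, not the parameters fitted in this work.

```python
import math

def uniquac_gamma(x, r, q, tau, z=10.0):
    """Activity coefficients from the standard UNIQUAC model.
    x: mole fractions; r, q: volume and surface parameters;
    tau[i][j]: interaction terms (tau[i][i] = 1)."""
    n = len(x)
    sx_r = sum(xj * rj for xj, rj in zip(x, r))
    sx_q = sum(xj * qj for xj, qj in zip(x, q))
    phi = [ri * xi / sx_r for xi, ri in zip(x, r)]    # segment fractions
    theta = [qi * xi / sx_q for xi, qi in zip(x, q)]  # area fractions
    l = [z / 2 * (ri - qi) - (ri - 1) for ri, qi in zip(r, q)]
    sum_xl = sum(xj * lj for xj, lj in zip(x, l))
    gammas = []
    for i in range(n):
        # combinatorial (size/shape) contribution
        comb = (math.log(phi[i] / x[i])
                + z / 2 * q[i] * math.log(theta[i] / phi[i])
                + l[i] - phi[i] / x[i] * sum_xl)
        # residual (energetic) contribution
        s_i = sum(theta[j] * tau[j][i] for j in range(n))
        resid = q[i] * (1 - math.log(s_i)
                        - sum(theta[j] * tau[i][j]
                              / sum(theta[k] * tau[k][j] for k in range(n))
                              for j in range(n)))
        gammas.append(math.exp(comb + resid))
    return gammas
```

In practice, tau is made temperature-dependent (e.g. tau_ij = exp(-a_ij/T)), which is what allows a single fitted parameter set to describe the whole 10-60 °C range.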
Abstract:
Silver/alanine nanocomposites with varying mass percentages of silver have been produced. The size of the silver nanoparticles seems to drive the formation of the nanocomposite, yielding either a homogeneous dispersion of silver nanoparticles in the alanine matrix or flocs of silver nanoparticles segregated from the alanine crystals. The alanine crystalline orientation is modified according to the particle size of the silver nanoparticles. For silver mass percentages below 0.1%, the nanocomposites are homogeneous and there is no particle aggregation. As the mass percentage of silver is increased, the system becomes unstable and there is particle flocculation with subsequent segregation of the alanine crystals. The nanocomposites were analyzed by transmission electron microscopy (TEM), UV-Vis absorption spectroscopy, X-ray diffraction (XRD), and Fourier transform infrared (FTIR) spectroscopy, and they were tested as radiation detectors by means of electron spin resonance (ESR) spectroscopy in order to detect the paramagnetic centers created by the radiation. In fact, the sensitivity of the radiation detectors is optimized in systems containing small particles (30 nm) that are well dispersed in the alanine matrix. As agglomeration increases, particle growth (up to 1.5 μm) and segregation diminish the sensitivity. In conclusion, nanostructured materials can be used to optimize alanine sensitivity by taking into account the influence of the particle size of the silver nanoparticles on the detection properties of alanine radiation detectors, thus contributing to the construction of small-sized detectors.
Abstract:
The class of electrochemical oscillators characterized by a partially hidden negative differential resistance in an N-shaped current-potential curve encompasses a myriad of experimental examples. We present a comprehensive methodological analysis of the oscillation frequency of this class of systems and discuss its dependence on electrical and kinetic parameters. The analysis is developed from a skeleton ordinary differential equation model, and an equation for the oscillation frequency is obtained. Simulations are carried out for a model system, namely nickel electrodissolution, and the numerical results are confirmed by experimental data on this system. In addition, the treatment is further applied to the electro-oxidation of ethylene glycol, where unusually large oscillation frequencies have been reported. Despite the distinct chemistry underlying the oscillatory dynamics of these systems, very good agreement between experiments and theoretical predictions is observed. The application of the developed theory is suggested as an important step for primary kinetic characterization.
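The general workflow — integrate a skeleton two-variable ODE model and extract the oscillation frequency from the time series — can be illustrated with a generic relaxation oscillator. The sketch below uses a van der Pol oscillator as a stand-in for the electrochemical model (which is not specified in the abstract) and estimates the frequency from upward zero crossings:

```python
def simulate_frequency(mu=0.5, dt=0.001, t_end=200.0):
    """RK4-integrate a van der Pol oscillator (a generic two-variable
    'skeleton' model, not the paper's electrochemical one) and estimate
    the oscillation frequency from upward zero crossings of x(t)."""
    def f(x, y):
        return y, mu * (1 - x * x) * y - x
    x, y, t = 2.0, 0.0, 0.0
    crossings = []
    prev_x, prev_t = x, t
    while t < t_end:
        k1x, k1y = f(x, y)
        k2x, k2y = f(x + dt / 2 * k1x, y + dt / 2 * k1y)
        k3x, k3y = f(x + dt / 2 * k2x, y + dt / 2 * k2y)
        k4x, k4y = f(x + dt * k3x, y + dt * k3y)
        x += dt / 6 * (k1x + 2 * k2x + 2 * k3x + k4x)
        y += dt / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
        t += dt
        if prev_x < 0 <= x:  # upward zero crossing
            # linear interpolation of the crossing time
            crossings.append(prev_t + dt * (-prev_x) / (x - prev_x))
        prev_x, prev_t = x, t
    # discard the initial transient, then average the remaining periods
    periods = [b - a for a, b in zip(crossings[5:-1], crossings[6:])]
    return 1.0 / (sum(periods) / len(periods))
```

The same crossing-counting step applied to experimental current or potential traces is how simulated and measured frequencies can be compared on a common footing.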
Abstract:
Metadata is data that describes other data and the domains it represents, allowing users to make the best possible decisions about its use. Metadata also makes it possible to report on the existence of data sets linked to specific needs. The purpose of using metadata is to document and organize structured organizational data, minimizing duplicated effort in locating it and facilitating its maintenance. Metadata also supports the administration of large amounts of data, along with discovery, retrieval and editing features. The use of metadata worldwide is regulated by technical groups or task forces composed of several segments, such as industry, universities and research firms. Agriculture in particular is a good example of the development of typical applications using metadata: the integration of systems and equipment enables the implementation of techniques used in precision agriculture, and the integration of different computer systems, via web services or other solutions, requires the integration of structured data. The purpose of this paper is to present an overview of consolidated metadata standards in agricultural areas.
Abstract:
This work addresses the solution of the problem of robust model predictive control (MPC) of systems with model uncertainty. The case of zone control of multivariable stable systems with multiple time delays is considered. The usual approach to this kind of problem is the inclusion of a non-linear cost constraint in the control problem. The control action is then obtained at each sampling time as the solution of a non-linear programming (NLP) problem, which for high-order systems can be computationally expensive. Here, the robust MPC problem is formulated as a linear matrix inequality (LMI) problem that can be solved in real time with a fraction of the computational effort. The proposed approach is compared with the conventional robust MPC and tested through the simulation of a reactor system from the process industry.
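Two ideas from the abstract — receding-horizon control and zone (range) targets — can be illustrated on a toy scalar plant. The sketch below is not the paper's LMI formulation: it brute-forces a coarse input grid at each sampling time, penalizes only predicted excursions outside the target zone, and applies the first move, which is the essence of zone MPC. The plant x+ = a·x + b·u and all numbers are invented for illustration.

```python
def zone_mpc_step(x, a, b, zone, horizon=10, u_grid=None):
    """One receding-horizon step for a scalar plant x+ = a*x + b*u.
    Picks the input (from a coarse grid) whose constant application
    over the horizon minimizes squared distance to the target zone;
    only the first move is applied, as in MPC."""
    if u_grid is None:
        u_grid = [i / 10.0 for i in range(-20, 21)]  # u in [-2, 2]
    lo, hi = zone
    def zone_dist_sq(v):
        # zero inside [lo, hi], squared distance outside
        return max(lo - v, 0.0, v - hi) ** 2
    best_u, best_cost = 0.0, float("inf")
    for u in u_grid:
        xp, cost = x, 0.0
        for _ in range(horizon):
            xp = a * xp + b * u
            cost += zone_dist_sq(xp)
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

# closed-loop simulation: drive x from 5.0 into the zone [0.9, 1.1]
x = 5.0
for _ in range(30):
    x = 0.9 * x + 0.5 * zone_mpc_step(x, 0.9, 0.5, (0.9, 1.1))
```

Replacing the grid search with a structured optimization (NLP in the conventional approach, an LMI problem in the paper's approach) is what makes the method tractable for high-order multivariable systems.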
Abstract:
Rainfall intensity-duration relationships are extremely important in the design of systems for mitigating runoff losses. The objective of this work was to compare rainfall depths generated by the PLUVIO 2.1 software with depths from the standard intensity-duration curves developed by MARTINEZ & MAGNI (1999). Rainfall intensities for durations of 10, 20, 30, 60, 120 and 1440 minutes and return periods of 2, 5, 10, 50 and 100 years were compared for 30 sites in the state of São Paulo. The results showed that PLUVIO was effective, except in predicting the 24-hour rainfall for 100-year return period events at four locations in the central and eastern regions of the state.
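Intensity-duration-frequency (IDF) curves of the kind compared above are commonly expressed in the form i = K·T^m / (t + t0)^n. The sketch below uses that generic form with made-up coefficients — not the MARTINEZ & MAGNI (1999) parameters — to show how intensities and depths for the listed durations and return periods would be computed:

```python
def rainfall_intensity(t_min, T_years, K=1000.0, m=0.15, t0=15.0, n=0.8):
    """Illustrative IDF relation i = K * T^m / (t + t0)^n in mm/h.
    K, m, t0, n are placeholder values; real studies fit them to
    local rain-gauge records for each site."""
    return K * T_years ** m / (t_min + t0) ** n

def rainfall_depth(t_min, T_years, **kw):
    """Rainfall depth in mm = intensity [mm/h] * duration [h]."""
    return rainfall_intensity(t_min, T_years, **kw) * t_min / 60.0
```

For example, `rainfall_depth(1440, 100)` would be the 24-hour, 100-year depth — exactly the quantity on which PLUVIO diverged from the standard curves at four sites.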
Abstract:
Surveillance Levels (SLs) are categories for medical patients (used in Brazil) that represent different types of medical recommendations. SLs are defined according to risk factors and the medical and developmental history of patients, and each SL is associated with specific educational and clinical measures. The present paper proposes and verifies a computer-aided approach for the automatic recommendation of SLs, based on the classification of information from patient electronic records. For this purpose, a software architecture composed of three layers was developed. The architecture includes a classification layer formed by a linguistic module and machine learning classification modules; this layer allows different classification methods to be used, including methods operating on preprocessed, normalized language data drawn from the linguistic module. We report the verification and validation of the software architecture in a Brazilian pediatric healthcare institution. The results indicate that the selection of attributes can have a great effect on the performance of the system. Nonetheless, our automatic recommendation of surveillance levels can still benefit from improvements in processing procedures when the linguistic module is applied prior to classification. Results from our efforts can be applied to different types of medical systems, and the results of systems supported by the framework presented in this paper may be used by healthcare and governmental institutions to improve healthcare services in terms of establishing preventive measures and alerting authorities to the possibility of an epidemic.
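The core of such a classification layer — mapping free-text record content to a category — can be sketched with a minimal multinomial naive Bayes over whitespace tokens. Everything here is a toy stand-in: the labels, the "records" and the classifier choice are invented for illustration; a real system would add the linguistic preprocessing and attribute selection discussed above.

```python
import math
from collections import Counter, defaultdict

class TinyNB:
    """Minimal multinomial naive Bayes with Laplace smoothing --
    a toy stand-in for the machine learning modules of the
    classification layer described above."""
    def fit(self, docs, labels):
        self.counts = defaultdict(Counter)  # token counts per label
        self.label_n = Counter(labels)      # documents per label
        self.vocab = set()
        for doc, lab in zip(docs, labels):
            toks = doc.lower().split()
            self.counts[lab].update(toks)
            self.vocab.update(toks)
        return self

    def predict(self, doc):
        best, best_lp = None, -math.inf
        V = len(self.vocab)
        n_docs = sum(self.label_n.values())
        for lab, n in self.label_n.items():
            lp = math.log(n / n_docs)       # log prior
            total = sum(self.counts[lab].values())
            for tok in doc.lower().split():
                # Laplace-smoothed log likelihood of each token
                lp += math.log((self.counts[lab][tok] + 1) / (total + V))
            if lp > best_lp:
                best, best_lp = lab, lp
        return best

# toy "records" with invented risk vs routine keywords and labels
clf = TinyNB().fit(
    ["fever seizure underweight", "premature birth fever",
     "routine checkup normal", "normal growth vaccine"],
    ["high", "high", "low", "low"])
```

The attribute-selection effect reported in the paper corresponds, in this sketch, to the choice of which tokens from the record are fed to `predict`.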
Abstract:
Organizational intelligence can be seen as a function of the viable structure of an organization. By integrating the Viable System Model and Soft Systems Methodology (systemic approaches to organizational management) with a focus on the role of the intelligence function, it is possible to elaborate a model of action with a structured methodology to prospect, select, treat and distribute information to the entire organization, improving the efficacy and efficiency of all processes. This combination of methodologies is called the Intelligence Systems Methodology (ISM), whose assumptions and dynamics are delimited in this paper. The ISM is composed of two simultaneous activities: the Active Environmental Mapping and the Stimulated Action Cycle. The elaboration of the formal ISM description opens opportunities for applications of the methodology in real situations, offering a new path for this specific issue of systems thinking: intelligence systems. Knowledge Management Research & Practice (2012) 10, 141-152. doi:10.1057/kmrp.2011.44
Abstract:
We investigate the canonical equilibrium of systems with long-range forces in competition. These forces create a modulation in the interaction potential, and modulated phases appear at the system scale. The structure of these phases differentiates this system from those with monotonic potentials, where only the mean-field and disordered phases exist. With increasing temperature, the system switches from one ordered phase to another through a first-order phase transition. Both mean-field and modulated phases may be stable, even at zero temperature, and the long-range nature of the interaction leads to metastability characterized by extremely long time scales.
Abstract:
Purpose: Implant-abutment connections still present failures in the oral cavity due to the loosening of mechanical integrity caused by detorque and corrosion of the abutment screws. The objective of this study was to evaluate the detorque of dental abutment screws before and after immersion in fluoridated solutions. Materials and Methods: Five commercial implant-abutment assemblies were assessed in this investigation: (C) Conexão®, (E) Emfils®, (I) INP®, (S) SIN®, and (T) Titanium Fix®. The implants were embedded in acrylic resin and then placed in a holding device. The abutments were first connected to the implants and torqued to 20 Ncm using a handheld torque meter. The detorque values of the abutments were evaluated after 10 minutes. After applying a second torque of 20 Ncm, the implant-abutment assemblies were immersed in a fluoridated solution, being withdrawn every 3 hours for 12 hours, over a period of 90 days. After that period, the detorque of the abutments was examined. Scanning electron microscopy (SEM) associated with energy dispersive spectroscopy (EDS) was applied to inspect the surfaces of the abutments. Results: Detorque values of systems C, E, and I immersed in the fluoridated solution were significantly higher than the initial detorque values. ANOVA demonstrated no significant differences in detorque values between designs S and T. Signs of localized corrosion could not be detected by SEM, although chemical analysis by EDS showed the presence of elements involved in corrosive processes. Conclusion: An increase in the detorque values recorded on abutments after immersion in fluoridated artificial saliva solutions was noticed in this study. Regarding the chemical analysis, such an increase in detorque can result from a corrosion layer formed between metallic surfaces in static contact in the implant-abutment joint during immersion in the fluoridated solutions.
Abstract:
Remanufacturing is the process of rebuilding used products so that the quality of remanufactured products is equivalent to that of new ones. Although the theme is gaining ground, it is still little explored due to lack of knowledge, the difficulty of visualizing it systemically, and the difficulty of implementing it effectively. Few models treat remanufacturing as a system; most studies still treat remanufacturing as an isolated process, preventing it from being seen in an integrated manner. Therefore, the aim of this work is to organize the knowledge about remanufacturing, offering a vision of the remanufacturing system and contributing to an integrated view of the theme. The methodology employed was a literature review, adopting the General Theory of Systems to characterize the remanufacturing system. This work consolidates and organizes the elements of this system, enabling a better understanding of remanufacturing and assisting companies in adopting the concept.
Abstract:
Parallel kinematic structures are considered very adequate architectures for positioning and orienting the tools of robotic mechanisms. However, developing dynamic models for this kind of system is sometimes a difficult task. In fact, the direct application of traditional methods of robotics to the modelling and analysis of such systems usually does not lead to efficient and systematic algorithms. This work addresses this issue: it presents a modular approach to generating the dynamic model and shows how, through some convenient modifications, these methods can be made more applicable to parallel structures as well. Kane's formulation for obtaining the dynamic equations is shown to be one of the easiest ways to deal with redundant coordinates and kinematic constraints, so that a suitable choice of a set of coordinates allows the remainder of the modelling procedure to be computer-aided. The advantages of this approach are discussed in the modelling of a 3-DOF asymmetric parallel mechanism.
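For reference, Kane's formulation mentioned above reduces, for a system of N particles described by n independent generalized speeds u_r, to the standard textbook form below (the notation is the usual one, not taken from this paper):

```latex
% Kane's equations: generalized active forces F_r plus
% generalized inertia forces F_r^* vanish for each generalized speed
F_r + F_r^{*} = 0, \qquad r = 1,\dots,n,
\quad\text{where}\quad
F_r = \sum_{i=1}^{N} \frac{\partial \mathbf{v}_i}{\partial u_r}\cdot\mathbf{R}_i,
\qquad
F_r^{*} = -\sum_{i=1}^{N} m_i\,\frac{\partial \mathbf{v}_i}{\partial u_r}\cdot\mathbf{a}_i .
```

Here v_i, a_i and R_i are the velocity, acceleration and resultant applied force on particle i; because the partial velocities ∂v_i/∂u_r project the dynamics onto the admissible motions, constraint forces drop out — which is why the method handles redundant coordinates and kinematic constraints so conveniently.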
Abstract:
The University of São Paulo has been experiencing an increase in content in electronic and digital formats, distributed by different suppliers and hosted remotely or in clouds, and faces increasing difficulties in facilitating access to this digital collection by its users while coexisting with the traditional world of physical collections. A possible solution was identified in the new generation of systems called Web Scale Discovery, which allow better management, data integration and search agility. Aiming to identify whether and how such a system would meet USP's demands and expectations and, if so, what the analysis criteria for such a tool would be, an analytical study with an essentially documental basis was structured, based on a review of the literature and on data available on official websites and from libraries using this kind of resource. The conceptual basis of the study was defined after the identification of software assessment methods already available, generating a standard with 40 analysis criteria, ranging from details of the single access interface to information contents, web 2.0 characteristics, intuitive interface and faceted navigation, among others. The details of the studies conducted on four of the major systems currently available in this software category are presented, providing subsidies for the decision-making of other libraries interested in such systems.
Abstract:
We present a new catalogue of galaxy triplets derived from the Sloan Digital Sky Survey (SDSS) Data Release 7. The identification of systems was performed considering galaxies brighter than Mr = -20.5 and imposing constraints on the projected distances, the radial velocity differences of neighbouring galaxies and isolation. To improve the identification of triplets, we employed a data pixelization scheme, which allows us to handle large amounts of data, as in the SDSS photometric survey. Using spectroscopic and photometric data in the redshift range 0.01 ≤ z ≤ 0.40, we obtain 5901 triplet candidates. We have used a mock catalogue to analyse the completeness and contamination of our methods. The results show a high level of completeness (about 80 per cent) and low contamination (about 5 per cent). By using photometric and spectroscopic data, we have also addressed the effects of fibre collisions in the spectroscopic sample. We have defined an isolation criterion considering the distance of the triplet's brightest galaxy to the closest neighbouring cluster, to describe the global environment, as well as the galaxies within a fixed aperture around the triplet's brightest galaxy, to measure the local environment. The final catalogue comprises 1092 isolated triplets of galaxies in the redshift range 0.01 ≤ z ≤ 0.40. Our results show that photometric redshifts provide very useful information, allowing us to complete the sample of nearby systems whose detection is affected by fibre collisions, as well as to extend the detection of triplets to larger distances, where spectroscopic redshifts are not available.
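The pairwise constraints used in such identifications — a maximum projected separation and a maximum line-of-sight velocity difference — can be sketched as follows. The thresholds, the low-redshift Hubble-law distance and the small-angle approximation below are illustrative placeholders, not the paper's actual criteria or cosmology:

```python
import math

C_KMS = 299792.458   # speed of light [km/s]
H0 = 70.0            # Hubble constant [km/s/Mpc] (assumed value)

def neighbours(z1, z2, theta_deg, rp_max_mpc=0.5, dv_max_kms=700.0):
    """Illustrative pair criterion for triplet finding: projected
    separation (small-angle, low-z Hubble-law distance) and
    line-of-sight velocity difference, each below a threshold."""
    zbar = 0.5 * (z1 + z2)
    d_mpc = C_KMS * zbar / H0               # low-z comoving distance
    rp = d_mpc * math.radians(theta_deg)    # projected separation [Mpc]
    dv = C_KMS * abs(z1 - z2) / (1 + zbar)  # velocity difference [km/s]
    return rp < rp_max_mpc and dv < dv_max_kms
```

A triplet candidate is then three galaxies that mutually satisfy such a criterion and additionally pass the isolation tests described above; the pixelization scheme serves to restrict the pairwise search to nearby sky pixels rather than the full photometric catalogue.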