911 results for Technicolor and Composite Models
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The aim of the present study was to evaluate heterosis effects on weaning weight at 205 days (WW, N = 146,464), yearling weight at 390 days (YW, N = 69,315) and weight gain from weaning to yearling (WG, N = 59,307) in composite beef cattle. The fixed models were: RM, which included contemporary groups, age-of-dam class, outcrossing percentages for direct and maternal effects, and additive direct and maternal (AM) breed effects; R, the RM model minus the AM breed effects; and H, the RM model minus all additive breed effects. The estimates for WW were generally positive (P < 0.01). The R and H models produced similar estimates, but these differed markedly from those of the RM model. For YW, the R and H models produced generally positive estimates (P < 0.05). For WG, the RM model yielded generally significant heterosis effects (P < 0.05). It can be concluded that the RM model appears to supply estimates of better quality (P < 0.01).
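As a rough illustration of the nested-model comparison described above (not the authors' implementation; all column names and data below are hypothetical placeholders), the three fixed models can be fitted and compared with an ordinary least-squares formula interface:

# A minimal sketch of the RM / R / H fixed-model comparison. Column names
# (cg, dam_age, het_d, het_m, add_d, add_m, WW) are invented placeholders;
# a real analysis would use the actual field records.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "cg": rng.integers(0, 10, n),       # contemporary group
    "dam_age": rng.integers(2, 8, n),   # age-of-dam class
    "het_d": rng.random(n),             # direct heterozygosity fraction
    "het_m": rng.random(n),             # maternal heterozygosity fraction
    "add_d": rng.random(n),             # additive direct breed fraction
    "add_m": rng.random(n),             # additive maternal breed fraction
})
df["WW"] = 180 + 15 * df["het_d"] + 8 * df["het_m"] + rng.normal(0, 10, n)

rm = smf.ols("WW ~ C(cg) + C(dam_age) + het_d + het_m + add_d + add_m", df).fit()  # full RM model
r = smf.ols("WW ~ C(cg) + C(dam_age) + het_d + het_m + add_d", df).fit()           # RM minus additive maternal
h = smf.ols("WW ~ C(cg) + C(dam_age) + het_d + het_m", df).fit()                   # RM minus all additive breed effects
for name, res in [("RM", rm), ("R", r), ("H", h)]:
    print(name, "direct heterosis estimate:", round(res.params["het_d"], 2))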
Abstract:
The dynamic behavior of composite laminates is very complex because many concurrent phenomena occur during composite laminate failure under impact load. Fiber breakage, delamination, matrix cracking, plastic deformation due to contact and large displacements are some of the effects that should be considered when a structure made from composite material is impacted by a foreign object. Thus, an investigation of low-velocity impact on thin laminated composite disks of epoxy resin reinforced by carbon fiber is presented. The influence of stacking sequence and impact energy was investigated using load-time, displacement-time and energy-time histories, as well as images from NDE. Indentation test results were compared to the dynamic results in order to verify the inertia effects when a thin composite laminate is impacted by a foreign object at low velocity. A finite element analysis (FEA) was developed, using Hill's model and material models implemented through a UMAT (User Material Subroutine) in the software ABAQUS(TM), in order to simulate the failure mechanisms under indentation tests. (C) 2007 Elsevier Ltd. All rights reserved.
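For context, Hill's quadratic yield criterion referenced above has a standard closed form that is easy to evaluate; the sketch below uses placeholder anisotropy coefficients (F, G, H, L, M, N are material-dependent and would be calibrated in a real UMAT):

# Hill's (1948) quadratic yield criterion for an anisotropic material;
# yielding is predicted when the function value reaches 1. The default
# coefficients reduce to the isotropic (von Mises) case and are placeholders.
def hill_yield(s11, s22, s33, s23, s31, s12,
               F=0.5, G=0.5, H=0.5, L=1.5, M=1.5, N=1.5):
    return (F * (s22 - s33) ** 2 + G * (s33 - s11) ** 2
            + H * (s11 - s22) ** 2
            + 2 * L * s23 ** 2 + 2 * M * s31 ** 2 + 2 * N * s12 ** 2)

# Uniaxial stress along direction 1, normalized by the yield stress:
print(hill_yield(1.0, 0.0, 0.0, 0.0, 0.0, 0.0))  # G + H = 1.0 -> at yield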
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and postconditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces will be REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, which lets us see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and postconditions. The preconditions constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
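As a flavor of the generated skeletons (the resource, states and methods below are invented for illustration and are not the thesis tool's actual output), a stateful REST method can be guarded by its precondition and checked against its postcondition:

# A minimal, hypothetical sketch of a stateful REST resource whose methods
# enforce preconditions before executing and assert postconditions afterwards.
class BookingResource:
    def __init__(self):
        self.state = "available"

    def put_reservation(self):
        # Precondition: the room can only be reserved while it is available.
        assert self.state == "available", "412 Precondition Failed"
        self.state = "reserved"
        # Postcondition: the resource must now be in the reserved state.
        assert self.state == "reserved"
        return {"status": 200, "state": self.state}

    def delete_reservation(self):
        # Precondition: only an existing reservation can be cancelled.
        assert self.state == "reserved", "412 Precondition Failed"
        self.state = "available"
        # Postcondition: the room is available again.
        assert self.state == "available"
        return {"status": 200, "state": self.state}

booking = BookingResource()
print(booking.put_reservation())
print(booking.delete_reservation())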
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The contributions of the concrete slab and composite action to the vertical shear strength of continuous steel-concrete composite beams are ignored in current design codes, which results in conservative designs. This paper investigates the ultimate strength of continuous composite beams in combined bending and shear using the finite element analysis method. A three-dimensional finite element model has been developed to account for the geometric and material nonlinear behaviour of continuous composite beams. The finite element model is verified against experimental results and then used to study the effects of the concrete slab and shear connection on the vertical shear strength. The moment-shear interaction strength of continuous composite beams is also investigated by varying the moment/shear ratio. It is shown that the concrete slab and composite action significantly increase the ultimate strength of continuous composite beams. Based on the numerical results, design models are proposed for the vertical shear strength and moment-shear interaction of continuous composite beams. The proposed design models, which incorporate the effects of the concrete slab, composite action, stud pullout failure and web shear buckling, are compared with experimental results with good agreement. (C) 2003 Elsevier Ltd. All rights reserved.
Abstract:
Despite experimental evidence, the contributions of the concrete slab and composite action to the vertical shear strength of simply supported steel-concrete composite beams are not considered in current design codes, which leads to conservative designs. In this paper, the finite element method is used to investigate the flexural and shear strengths of simply supported composite beams under combined bending and shear. A three-dimensional finite element model has been developed to account for the geometric and material nonlinear behavior of composite beams, and verified against experimental results. The verified finite element model is then employed to quantify the contributions of the concrete slab and composite action to the moment and shear capacities of composite beams. The effect of the degree of shear connection on the vertical shear strength of deep composite beams loaded in shear is studied. Design models for the vertical shear strength, including contributions from the concrete slab and composite action, and for the ultimate moment-shear interaction are proposed for the design of simply supported composite beams in combined bending and shear. The proposed design models provide a consistent and economical design procedure for simply supported composite beams.
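The interaction equations proposed in these beam studies are not reproduced here; purely to illustrate how a moment-shear interaction design model is applied in practice, the sketch below checks a demand pair against a generic interaction curve of the form (M/Mu)^a + (V/Vu)^b <= 1, with placeholder exponents and capacities:

# Illustrative moment-shear interaction check. The exponents a, b and the
# capacities Mu, Vu are placeholders, not the calibrated design models
# proposed in the papers above.
def interaction_ok(M, V, Mu, Vu, a=1.0, b=2.0):
    """Return True if the (M, V) demand pair lies inside the interaction curve."""
    return (M / Mu) ** a + (V / Vu) ** b <= 1.0

# Demand at 60% of moment capacity and 50% of shear capacity:
print(interaction_ok(M=300.0, V=150.0, Mu=500.0, Vu=300.0))  # 0.6 + 0.25 = 0.85 -> True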
Abstract:
Investigations of the optical response of subwavelength-structure arrays milled into thin metal films have revealed surprising phenomena, including reports of unexpectedly high transmission of light. Many studies have interpreted the optical coupling to the surface in terms of the resonant excitation of surface plasmon polaritons (SPPs), but other approaches involving composite diffracted evanescent waves (CDEW) have also been proposed. Here we present a series of measurements on very simple one-dimensional subwavelength structures to test the key properties of the surface waves, and compare them to the CDEW and SPP models. We find that the optical response of the silver metal surface proceeds in two steps: a diffractive perturbation in the immediate vicinity (2–3 μm) of the structure, followed by excitation of a persistent surface wave that propagates over tens of micrometres. The measured wavelength and phase of this persistent wave are significantly shifted from those expected for resonant excitation of a conventional SPP on a pure silver surface.
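For reference, the wavelength and wavevector expected for a conventional SPP on a flat metal-dielectric interface, against which the persistent wave is compared, follow from the standard dispersion relation

k_{\mathrm{SPP}} = k_0 \sqrt{\frac{\varepsilon_m \varepsilon_d}{\varepsilon_m + \varepsilon_d}}, \qquad \lambda_{\mathrm{SPP}} = \frac{2\pi}{\operatorname{Re} k_{\mathrm{SPP}}},

where \varepsilon_m and \varepsilon_d are the permittivities of the metal and the dielectric and k_0 = 2\pi/\lambda_0 is the free-space wavenumber; a measured wavelength or phase deviating from this prediction is what distinguishes the observed wave from a conventional SPP.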
Abstract:
Strawberries represent the main source of ellagic acid derivatives in the Brazilian diet, corresponding to more than 50% of all phenolic compounds found in the fruit. There is particular interest in determining the ellagic acid content of fruits because of its possible chemopreventive benefits. In the present study, the potential health benefits of purified ellagitannins from strawberries were evaluated in relation to antiproliferative activity and in vitro inhibition of alpha-amylase, alpha-glucosidase, and angiotensin I-converting enzyme (ACE), relevant for the potential management of hyperglycemia and hypertension. Therefore, a comparison among ellagic acid, purified ellagitannins, and a strawberry extract was made to evaluate possible synergistic effects of phenolics. With respect to antiproliferative activity, ellagic acid showed the highest percentage inhibition of cell proliferation. The strawberry extract was less effective in inhibiting cell proliferation, indicating that for this fruit there is no synergism. Purified ellagitannins had high alpha-amylase and ACE inhibitory activities. However, these compounds had low alpha-glucosidase inhibitory activity. These results suggest that ellagitannins and ellagic acid have good potential for the management of hyperglycemia and hypertension linked to type 2 diabetes. However, further studies with animal and human models are needed to advance the in vitro assay-based biochemical rationale from this study.
Abstract:
Smooth copper coatings containing well-distributed silicon nitride particles were obtained by co-electrodeposition in an acidic sulfate bath. The cathodic current density did not significantly influence the incorporated particle volume fraction, whereas increasing the particle concentration in the bath decreased it. Increasing the stirring rate increased the amount of embedded particles. The microhardness of the composite layers was higher than that of pure copper deposits obtained under the same conditions, owing to dispersion strengthening and copper matrix grain refinement, and increased with the incorporated particle volume fraction. The microhardness of the composites also increased with current density, owing to copper matrix grain refining. The composite coatings presented higher strength but lower ductility than pure copper layers. Pure copper and composite coatings showed the same corrosion resistance in 0.5 wt.% H₂SO₄ solution at room temperature. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Knowledge of thermochemical parameters such as the enthalpy of formation, gas-phase basicity, and proton affinity may be the key to understanding molecular reactivity. Obtaining these thermochemical parameters from theoretical chemical models may be advantageous when experimental measurements are difficult to accomplish. The development of ab initio composite models represents a major advance in obtaining these thermochemical parameters, but these methods do not always lead to accurate values. Aiming at a comparison between the ab initio models and hybrid models based on density functional theory (DFT), we have studied gamma-butyrolactone and 2-pyrrolidinone with the goal of obtaining high-quality thermochemical parameters using the composite chemical models G2, G2MP2, MP2, G3, CBS-Q, CBS-4, and CBS-QB3; the DFT methods B3LYP, B3P86, PW91PW91, mPW1PW, and B98; and the basis sets 6-31G(d), 6-31+G(d), 6-31G(d,p), 6-31+G(d,p), 6-31++G(d,p), 6-311G(d), 6-311+G(d), 6-311G(d,p), 6-311+G(d,p), 6-311++G(d,p), aug-cc-pVDZ, and aug-cc-pVTZ. Values obtained for the enthalpies of formation, proton affinity, and gas-phase basicity of the two target molecules were compared to the experimental data reported in the literature. The best results were achieved with the DFT models, and the B3LYP method led to the most accurate data.
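For clarity, the two protonation-related quantities evaluated in the study are defined with respect to the gas-phase protonation reaction

\mathrm{B} + \mathrm{H}^+ \rightarrow \mathrm{BH}^+, \qquad \mathrm{PA(B)} = -\Delta_r H^\circ, \qquad \mathrm{GB(B)} = -\Delta_r G^\circ,

so the proton affinity is the negative of the reaction enthalpy and the gas-phase basicity is the negative of the reaction Gibbs energy; the composite and DFT models are judged by how closely they reproduce the experimental values of these quantities.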
Abstract:
An important consideration in the development of mathematical models for dynamic simulation is the identification of the appropriate mathematical structure. By building models with an efficient structure devoid of redundancy, it is possible to create simple, accurate and functional models. This leads not only to efficient simulation, but to a deeper understanding of the important dynamic relationships within the process. In this paper, a method is proposed for systematic model development for startup and shutdown simulation which is based on the identification of the essential process structure. The key tool in this analysis is the method of nonlinear perturbations for structural identification and model reduction. Starting from a detailed mathematical process description, both singular and regular structural perturbations are detected. These techniques are then used to give insight into the system structure and, where appropriate, to eliminate superfluous model equations or reduce them to other forms. This process retains the ability to interpret the reduced-order model in terms of the physico-chemical phenomena. Using this model reduction technique it is possible to attribute observable dynamics to particular unit operations within the process. This relationship then highlights the unit operations which must be accurately modelled in order to develop a robust plant model. The technique generates detailed insight into the dynamic structure of the models, providing a basis for system re-design and dynamic analysis. The technique is illustrated on the modelling of an evaporator startup. Copyright (C) 1996 Elsevier Science Ltd
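The singular-perturbation reduction referred to above follows the standard two-time-scale form: with slow states x, fast states z and a small parameter \varepsilon,

\dot{x} = f(x, z), \qquad \varepsilon \dot{z} = g(x, z),

the limit \varepsilon \to 0 replaces the fast differential equations by the algebraic constraint 0 = g(x, z), which is how superfluous model equations are eliminated or reduced to other forms while the physico-chemical interpretation is retained.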
Abstract:
Figures on the relative frequency of synthetic and composite future forms in Ouest-France are presented and compared with those of earlier studies on the passé simple and passé composé. The synthetic future is found to be dominant. Possible formal explanations for the distribution prove inconclusive. Distribution across different text types is more promising, since contrastive functions of the two forms can be identified in texts where they co-occur. The composite future typically reports new proposals or plans as current news, while the synthetic future outlines details that will be realised at the time of implementation. Both functions are important in dailies, but current news is more often expressed in the present tense at the expense of the composite future.
Abstract:
A growing number of corporate failure prediction models have emerged since the 1960s. The economic and social consequences of business failure can be dramatic, so it is no surprise that the issue has attracted growing interest in academic research as well as in the business context. The main purpose of this study is to compare the predictive ability of five models: three based on statistical techniques (Discriminant Analysis, Logit and Probit) and two based on Artificial Intelligence (Neural Networks and Rough Sets). The five models were applied to a dataset of 420 non-bankrupt firms and 125 bankrupt firms belonging to the textile and clothing industry over the period 2003–09. Results show that all the models performed well, with an overall correct classification level higher than 90% and a type II error always below 2%. The type I error increases as we move away from the year prior to failure. Our models contribute to the discussion of the causes of corporate financial distress. Moreover, they can be used to assist the decisions of creditors, investors and auditors. Additionally, this research can be of great value to devisers of national economic policies that aim to reduce industrial unemployment.
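As a rough sketch of how one of the statistical techniques (the logit model) and the two error types would be evaluated (the data and features below are synthetic stand-ins, since the study's dataset of financial ratios is not reproduced here):

# Minimal sketch of a logit bankruptcy classifier with type I / type II
# error rates, under the convention common in this literature:
# type I = a failed firm classified as healthy, type II = the reverse.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_healthy, n_failed = 420, 125
X = np.vstack([rng.normal(0.0, 1.0, (n_healthy, 3)),   # healthy firms' ratios
               rng.normal(1.5, 1.0, (n_failed, 3))])   # failed firms' ratios
y = np.concatenate([np.zeros(n_healthy), np.ones(n_failed)])  # 1 = bankrupt

model = LogisticRegression().fit(X, y)
pred = model.predict(X)

type1 = np.mean(pred[y == 1] == 0)  # missed failures
type2 = np.mean(pred[y == 0] == 1)  # false alarms
print(f"type I error: {type1:.1%}, type II error: {type2:.1%}")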