987 results for Input Technology
Abstract:
Many authors point out that the front-end of new product development (NPD) is a critical success factor in the NPD process and that numerous companies face difficulties in carrying it out appropriately. It is therefore important to develop new theories and proposals that support the effective implementation of this earliest phase of NPD. This paper presents a new method to support the development of front-end activities based on integrating technology roadmapping (TRM) and project portfolio management (PPM). This new method, called the ITP Method, was implemented at a small Brazilian high-tech company in the nanotechnology industry to explore the integration proposal. The case study demonstrated that the ITP Method provides a systematic procedure for the fuzzy front-end and integrates innovation perspectives into a single roadmap, which allows for better alignment of business efforts and communication of product innovation goals. Furthermore, the results indicated that the method may also improve quality, functional integration and strategy alignment.
Abstract:
This paper presents a proposal for a Quality Management System for a generic GNSS surveying company as an alternative for management and service quality improvement. As a result of the increased demand for GNSS measurements, a large number of new or restructured companies were established to operate in that market. Considering that GNSS surveying is a new process, some changes must be made to adapt the old surveying techniques and old-fashioned management to the new reality. This requires a new management model based on a well-described sequence of procedures aimed at Total Quality Management for the company. The proposed Quality Management System was based on the requirements of the ISO 9000:2000 quality system, applied to the whole company and focused on the productive process of GNSS surveying work.
Abstract:
Considering the increasing popularity of network-based control systems and the widespread adoption of IP networks (such as the Internet), this paper studies the influence of network quality of service (QoS) parameters on quality-of-control parameters. An example control loop is implemented using two LonWorks networks (CEA-709.1) interconnected by an emulated IP network, in which important QoS parameters such as delay and delay jitter can be completely controlled. Mathematical definitions are provided according to the literature, and the results of the network-based control loop experiment are presented and discussed.
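A minimal sketch of the kind of effect studied here, not the paper's LonWorks/IP testbed: a discrete-time PI loop in which the feedback sample reaching the controller is delayed by a random number of steps, emulating network delay and jitter. The plant, gains and delay range are illustrative assumptions.

```python
# Hypothetical networked control loop: the controller sees an output sample that
# is 'delay' steps old, with the delay redrawn every step (delay jitter).
import random

def simulate(delay_max_steps, n_steps=200, dt=0.01):
    """First-order plant y' = (-y + u)/tau under PI control with delayed feedback."""
    tau, kp, ki = 0.1, 2.0, 5.0
    y, integ, setpoint = 0.0, 0.0, 1.0
    buffer = [0.0] * (delay_max_steps + 1)     # past outputs awaiting "delivery"
    errs = []
    for _ in range(n_steps):
        delay = random.randint(0, delay_max_steps)   # network delay in samples
        y_meas = buffer[-1 - delay]                  # delayed measurement
        e = setpoint - y_meas
        integ += e * dt
        u = kp * e + ki * integ
        y += dt * (-y + u) / tau                     # integrate the plant one step
        buffer = buffer[1:] + [y]
        errs.append(abs(setpoint - y))
    return sum(errs) / len(errs)                     # mean absolute tracking error

print("no extra delay :", simulate(0))
print("with jitter    :", simulate(8))
```

Comparing the two runs gives a crude quality-of-control indicator (mean tracking error) as a function of the worst-case network delay.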
Abstract:
This work presents a case study on technology assessment for power quality improvement devices. A system compatibility test protocol for power quality mitigation devices was developed in order to evaluate the functionality of three-phase voltage restoration devices. To validate this test protocol, the micro-DVR, a reduced-power development platform for DVR (dynamic voltage restorer) devices, was tested, and the results are discussed with respect to voltage disturbance standards.
Abstract:
There are several ways to model a building and its heat gains from external and internal sources in order to evaluate proper operation, audit retrofit actions, and forecast energy consumption. Different techniques, ranging from simple regression to models based on physical principles, can be used for simulation. A frequent hypothesis for all these models is that the input variables should be based on realistic data when they are available; otherwise the evaluation of energy consumption might be highly under- or overestimated. In this paper, a comparison is made between a simple model based on an artificial neural network (ANN) and a model based on physical principles (EnergyPlus) as auditing and prediction tools for forecasting building energy consumption. The Administration Building of the University of Sao Paulo is used as a case study. The building energy consumption profiles are collected, as well as the campus meteorological data. Results show that both models are suitable for energy consumption forecasting. Additionally, a parametric analysis is carried out for the considered building in EnergyPlus in order to evaluate the influence of several parameters, such as the building occupation profile and weather data, on the forecast.
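A minimal sketch of the data-driven side of such a comparison, assuming hourly weather and consumption records; the data below are synthetic stand-ins, not the USP building measurements, and the feature set and network size are illustrative.

```python
# Hypothetical ANN energy-forecast model: map weather and calendar features to
# hourly consumption with a small MLP (scikit-learn assumed available).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)
n = 2000
hour = rng.integers(0, 24, n)
weekday = rng.integers(0, 7, n)
temp = 20 + 8 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 1.5, n)
humidity = np.clip(60 + rng.normal(0, 10, n), 20, 95)
occupied = ((hour >= 8) & (hour <= 18) & (weekday < 5)).astype(float)
# Synthetic "measured" consumption: base load + occupancy + cooling term + noise
kwh = 50 + 80 * occupied + 3.0 * np.maximum(temp - 24, 0) + rng.normal(0, 5, n)

X = np.column_stack([temp, humidity, hour, weekday])
X_tr, X_te, y_tr, y_te = train_test_split(X, kwh, test_size=0.25, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print(f"test MAPE: {mean_absolute_percentage_error(y_te, model.predict(X_te)):.2%}")
```

A physics-based tool such as EnergyPlus would instead require a full building description; the appeal of the ANN route is that it only needs measured consumption and weather inputs.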
Abstract:
A model predictive controller (MPC) is proposed that is robustly stable for some classes of model uncertainty and for unknown disturbances. The case of open-loop stable systems is considered, where only the inputs and controlled outputs are measured, and it is assumed that the controller will work in a scenario where target tracking is also required. The nominal infinite-horizon MPC is extended here to the output-feedback case. The method considers an extended cost function that can be made globally convergent for any finite input horizon considered for the uncertain system, and is based on the explicit inclusion of cost-contracting constraints in the control problem. The controller handles the output-feedback case through a non-minimal state-space model built from past output measurements and past input increments. The application of the robust output-feedback MPC is illustrated through the simulation of a low-order multivariable system.
Abstract:
A green ceramic tape micro-heat exchanger was developed using Low Temperature Co-fired Ceramics (LTCC) technology. The device was designed using Computer-Aided Design software, and simulations were performed with a Computational Fluid Dynamics package (COMSOL Multiphysics) to evaluate the homogeneity of fluid distribution in the microchannels. Four geometries were proposed and simulated in two and three dimensions, showing that geometric details directly affect the velocity distribution in the micro-heat exchanger channels. The simulation results were quite useful for the design of the microfluidic device. The micro-heat exchanger was then constructed using LTCC technology and is composed of five thermal exchange plates in a cross-flow arrangement and two connecting plates, all stacked to form a device with external dimensions of 26 × 26 × 6 mm³.
Abstract:
Model predictive control (MPC) is usually implemented as a control strategy where the system outputs are controlled within specified zones instead of at fixed set points. One strategy to implement zone control is the selection of different weights for the output error in the control cost function. A disadvantage of this approach is that closed-loop stability cannot be guaranteed, as a different linear controller may be activated at each time step. A way to implement stable zone control is to use an infinite-horizon cost in which the set point is an additional decision variable of the control problem. In this case, the set point is restricted to remain inside the output zone, and an appropriate output slack variable is included in the optimisation problem to assure the recursive feasibility of the control optimisation problem. Following this approach, a robust MPC is developed for the case of multi-model uncertainty of open-loop stable systems. The controller is designed to maintain the outputs within their corresponding feasible zones while reaching the desired optimal input target. Simulation of a process from the oil refining industry illustrates the performance of the proposed strategy.
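A minimal sketch of the zone-control idea only, under strong simplifications: a single linear first-order model, a finite horizon and no multi-model robustness, with the set point as a decision variable constrained to the zone and a slack variable preserving feasibility. All numbers are illustrative, and cvxpy is assumed to be available.

```python
# Hypothetical zone-control MPC step: optimise inputs and the set point jointly,
# keeping the set point inside the output zone and using a slack for feasibility.
import cvxpy as cp

a, b = 0.9, 0.5                      # stable model y[k+1] = a*y[k] + b*u[k]
y0 = 3.2                             # current output, initially outside the zone
y_min, y_max = 0.0, 1.0              # output zone instead of a fixed set point
u_target = 0.1                       # desired input target
N = 10                               # prediction/control horizon

u = cp.Variable(N)
ysp = cp.Variable()                  # the set point is itself optimised
slack = cp.Variable(nonneg=True)     # keeps the problem feasible outside the zone

y_pred, yk = [], y0
for k in range(N):                   # build the output predictions
    yk = a * yk + b * u[k]
    y_pred.append(yk)

cost = (sum(cp.square(yk - ysp) for yk in y_pred)
        + 0.1 * cp.sum_squares(u - u_target)
        + 1e4 * cp.square(slack))
constraints = ([ysp >= y_min, ysp <= y_max, u >= -2, u <= 2]
               + [yk <= y_max + slack for yk in y_pred]
               + [yk >= y_min - slack for yk in y_pred])

cp.Problem(cp.Minimize(cost), constraints).solve()
print("set point:", round(float(ysp.value), 3), "first input:", round(float(u.value[0]), 3))
```

In a receding-horizon implementation only the first input would be applied and the problem re-solved at the next sample; the paper's formulation additionally enforces the constraints for every model in the uncertainty set.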
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend toward Intellectual Property (IP) core integration into complex system-on-chip (SoC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug; however, due to state explosion, their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input-space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although they act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. For this purpose, a tool to automatically generate PD-based stimuli sources was developed, along with a second tool to generate functional coverage models that fit exactly to the PD-based input space. Both the input stimuli and coverage model enhancements resulted in a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with the PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining the stimuli sources with their corresponding, automatically generated, coverage models.
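A minimal sketch of the general principle, not the authors' PD tools or formalism: stimuli are drawn only from declared parameter domains with invalid combinations excluded, and the functional coverage model is built from that same restricted cross, so no coverage bin corresponds to an irrelevant or invalid scenario. The domains and constraint below are hypothetical.

```python
# Hypothetical constrained-random stimulus generator with a matching coverage model.
import itertools, random

# Illustrative parameter domains for a bus-transfer stimulus
DOMAINS = {
    "burst_len": [1, 4, 8, 16],
    "mode":      ["read", "write"],
    "align":     ["aligned", "unaligned"],
}

def is_valid(stim):
    # Constraint removing an invalid scenario from the input space
    return not (stim["mode"] == "read" and stim["align"] == "unaligned")

# Coverage model: one bin per *valid* combination of domain values
coverage = {c: 0 for c in itertools.product(*DOMAINS.values())
            if is_valid(dict(zip(DOMAINS, c)))}

def random_stimulus():
    while True:
        stim = {k: random.choice(v) for k, v in DOMAINS.items()}
        if is_valid(stim):
            return stim

for _ in range(200):                               # constrained-random test loop
    stim = random_stimulus()
    coverage[tuple(stim[k] for k in DOMAINS)] += 1  # record the hit bin

hit = sum(1 for n in coverage.values() if n)
print(f"functional coverage: {hit}/{len(coverage)} bins hit")
```

Because the coverage bins are derived from the same constrained domains as the stimuli, simulation effort is not wasted chasing bins that can never be hit, which is the efficiency gain the abstract reports.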
Abstract:
Riparian forests are important for the structure and functioning of stream ecosystems, providing structural components such as large woody debris (LWD). Changes in these forests will cause modifications in the LWD input to streams, affecting their structure. In order to assess the influence of riparian forest changes on LWD supply, 15 catchments (third and fourth order) with riparian forests at different conservation levels were selected for sampling. In each catchment we quantified the abundance, volume and diameter of LWD in stream channels; the number, area and volume of pools formed by LWD; and the basal area and tree diameter of the riparian forest. We found that riparian forests were at a secondary successional stage with predominantly young trees (diameter at breast height < 10 cm) in all studied streams. Results showed that the basal area and tree diameter of the riparian forest differed between the stream groups (forested and non-forested), but tree density did not. Differences were also observed in LWD abundance and volume, in the frequency of LWD pools with subunits, and in the area and volume of LWD pools. LWD diameter, the diameter of pool-forming LWD, and the frequency of LWD pools without subunits did not differ between stream groups. Regression analyses showed that LWD abundance and volume, and the frequency of LWD pools (with and without subunits), were positively related to the proportion of riparian forest. LWD diameter was not correlated with riparian tree diameter. The frequency of LWD pools was correlated with the abundance and volume of LWD, but characteristics of these pools (area and volume) were not correlated with the diameter of the LWD that formed them. These results show that alterations in the riparian forest cause modifications in LWD abundance and volume in the stream channel, affecting mainly the structural complexity of these ecosystems (a reduction in the number and structural characteristics of LWD pools). Our results also demonstrate that riparian forest conservation actions must consider not only forest extent but also successional stage to guarantee the quantity and quality of LWD necessary for the structuring of stream channels.
Abstract:
The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the lower assessment costs observed when these technologies are applied to large-scale studies, explain the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the number of laser pulses per square meter and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp. with precision equal or superior to conventional methods. The statistics of interest in this case were: volume, basal area, mean height and mean height of dominant trees. The ALS flight for data assessment covered two strips of approximately 2 by 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment, the following statistics were calculated: overall height mean, standard error, five percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the ALS points above ground level are found), and the density of points above ground level in each percentile. The ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume. Conventional forest inventory sample plots provided the reference data. For volume, an exploratory assessment involving different combinations of ALS statistics allowed the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R² = 0.88 and MQE% = 0.0004); the 10% and 90% percentiles to estimate mean height (R² = 0.94 and MQE% = 0.0003); the 90% percentile to estimate dominant height (R² = 0.96 and MQE% = 0.0003); the 10% percentile and the mean height of ALS points to estimate basal area (R² = 0.92 and MQE% = 0.0016); and, to estimate volume, age and the 30% and 90% percentiles (R² = 0.95 and MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90% percentile, the modified Clutter model using age, the mean height of ALS points and the 70% percentile, and the modified Buckman model using age, the mean height of ALS points and the 10% percentile.
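A minimal sketch of the model-fitting step, assuming plot-level ALS height percentiles and stand age as predictors; the values are synthetic stand-ins rather than the study's data, and the Schumacher-type form ln(V) = b0 + b1·(1/age) + b2·p90 is one illustrative variant, not necessarily the exact model fitted in the paper.

```python
# Hypothetical fit of a Schumacher-type volume model from ALS percentiles and age.
import numpy as np

rng = np.random.default_rng(1)
n_plots = 40
age = rng.uniform(2, 7, n_plots)                    # stand age (years)
p90 = 5 + 3.5 * age + rng.normal(0, 1.0, n_plots)   # 90% ALS height percentile (m)
# Synthetic plot volumes roughly following the assumed model, with noise
volume = np.exp(1.2 - 2.0 / age + 0.08 * p90) * rng.lognormal(0, 0.05, n_plots)

# Linearised fit by ordinary least squares on ln(volume)
X = np.column_stack([np.ones(n_plots), 1.0 / age, p90])
coef, *_ = np.linalg.lstsq(X, np.log(volume), rcond=None)

ss_res = np.sum((np.log(volume) - X @ coef) ** 2)
ss_tot = np.sum((np.log(volume) - np.log(volume).mean()) ** 2)
print("coefficients b0, b1, b2:", np.round(coef, 3))
print("R^2 (log scale):", round(1 - ss_res / ss_tot, 3))
```

The same pattern extends to the other response variables (mean diameter, mean height, dominant height, basal area) by swapping in the percentiles reported as best predictors for each.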
Abstract:
Age-related changes in the adult language addressed to children aged 2;0-4;0 years in polyadic conditions were investigated in Australian childcare centres. The language that 21 staff members addressed to these children was coded for multiple variables in the broad social categories of prosody, context, speech act and gesture. The linguistic components were coded within the categories of phonology, lexicon, morphology, syntax and referential deixis. Minimal age-related differences were found. Explanations for the similarity of the adult language input across the age groups within the early childhood educational environment are discussed.