28 results for Pricing Strategy
Abstract:
In practice, robotic manipulators exhibit some degree of unwanted vibration. The advent of lightweight arm manipulators, mainly in the aerospace industry, where weight is an important issue, intensifies the vibration problem. On the other hand, robots interacting with the environment often generate impacts that propagate through the mechanical structure and also produce vibrations. In order to analyze these phenomena, a robot signal acquisition system was developed. The manipulator motion produces vibrations, either from the structural modes or from end-effector impacts. The instrumentation system acquires signals from several sensors that capture the joint positions, mass accelerations, forces and moments, and electrical currents in the motors. Afterwards, an analysis package, running off-line, reads the data recorded by the acquisition system and extracts the signal characteristics. Due to the multiplicity of sensors, the data obtained can be redundant, because the same type of information may be seen by two or more sensors. Given the price of the sensors, this redundancy can be exploited to reduce the cost of the system. On the other hand, the placement of the sensors is an important issue in obtaining suitable signals of the vibration phenomenon. Moreover, the study of these issues can help in the design optimization of the acquisition system. In this line of thought, a sensor classification scheme is presented. Several authors have addressed the subject of sensor classification. White (White, 1987) presents a flexible and comprehensive categorizing scheme that is useful for describing and comparing sensors. The author organizes the sensors according to several aspects: measurands, technological aspects, detection means, conversion phenomena, sensor materials, and fields of application. Michahelles and Schiele (Michahelles & Schiele, 2003) systematize the use of sensor technology.
They identified several dimensions of sensing that represent the sensing goals for physical interaction. A conceptual framework is introduced that allows categorizing existing sensors and evaluating their utility in various applications. This framework not only guides application designers in choosing meaningful sensor subsets, but can also inspire new systems and lead to the evaluation of existing applications. Today's technology offers a wide variety of sensors. In order to use all the data from this diversity of sensors, a framework for integration is needed. Sensor fusion, fuzzy logic, and neural networks are often mentioned when dealing with the problem of combining information from several sensors to get a more general picture of a given situation. The study of data fusion has been receiving considerable attention (Esteban et al., 2005; Luo & Kay, 1990). A survey of the state of the art in sensor fusion for robotics can be found in (Hackett & Shah, 1990). Henderson and Shilcrat (Henderson & Shilcrat, 1984) introduced the concept of the logical sensor, which defines an abstract specification of the sensors to integrate in a multisensor system. Recent developments in micro-electromechanical sensors (MEMS) with wireless communication capabilities allow sensor networks with interesting capabilities. This technology has been applied in several fields (Arampatzis & Manesis, 2005), including robotics. Cheekiralla and Engels (Cheekiralla & Engels, 2005) propose a classification of wireless sensor networks according to their functionalities and properties. This paper presents the development of a sensor classification scheme based on the frequency spectrum of the signals and on statistical metrics. Bearing these ideas in mind, this paper is organized as follows. Section 2 briefly describes the robotic system enhanced with the instrumentation setup. Section 3 presents the experimental results. Finally, Section 4 draws the main conclusions and points out future work.
Abstract:
This paper analyzes the signals captured during impacts and vibrations of a mechanical manipulator. To test the impacts, a flexible beam is clamped to the end-effector of a manipulator, which is programmed so that the rod moves against a rigid surface. Eighteen signals are captured and their correlations are calculated. A sensor classification scheme based on the multidimensional scaling technique is presented.
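The classification step can be illustrated with a minimal sketch: pairwise signal correlations are turned into dissimilarities and embedded in a low-dimensional map by classical multidimensional scaling, where redundant sensors land close together. The signals and sensor count below are synthetic stand-ins, not the paper's data.

```python
import numpy as np

def classical_mds(dissim, k=2):
    """Classical (Torgerson) MDS: embed points so that pairwise Euclidean
    distances in the k-dimensional map approximate the dissimilarities."""
    n = dissim.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (dissim ** 2) @ J          # double-centred Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)      # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:k]       # keep the k largest
    scale = np.sqrt(np.maximum(eigvals[idx], 0.0))
    return eigvecs[:, idx] * scale            # n x k sensor coordinates

# Hypothetical data: six "sensor" signals, where sensors 3-5 are noisy
# copies of sensors 0-2, so correlated pairs should map close together.
rng = np.random.default_rng(0)
base = rng.standard_normal((3, 500))
signals = np.vstack([base, base + 0.1 * rng.standard_normal((3, 500))])
corr = np.corrcoef(signals)
dissim = 1.0 - np.abs(corr)                   # correlation -> dissimilarity
coords = classical_mds(dissim, k=2)           # 2-D map for classification
```

In the resulting map, sensors whose signals carry the same information cluster together, which is what makes the technique useful for spotting redundant (and therefore removable) sensors.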
Abstract:
Solving systems of nonlinear equations is a problem of particular importance, since such systems emerge from the mathematical modeling of real problems that arise naturally in many branches of engineering and in the physical sciences. The problem can be naturally reformulated as a global optimization problem. In this paper, we show that a metaheuristic, called Directed Tabu Search (DTS) [16], is able to converge to the solutions of a set of problems for which the fsolve function of MATLAB® failed to converge. We also show the effect of the dimension of the problem on the performance of the DTS.
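The reformulation can be sketched as follows: a root of the system f(x) = 0 is a global minimizer (with value zero) of the sum of squared residuals. The 2x2 system below is hypothetical, and the crude multistart pattern search merely stands in for a metaheuristic such as DTS; it is not the DTS algorithm.

```python
import numpy as np

def residuals(x):
    # Hypothetical 2x2 system: x1^2 + x2^2 = 1 and x1 = x2
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])

def merit(x):
    # Global-optimization reformulation: any root of the system is a
    # global minimiser of the sum of squared residuals, with value 0.
    return float(np.sum(residuals(x) ** 2))

def multistart_search(n_starts=50, n_iters=200, seed=0):
    # Crude multistart pattern search (illustrative stand-in for DTS).
    rng = np.random.default_rng(seed)
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x = rng.uniform(-2.0, 2.0, size=2)
        step = 0.5
        for _ in range(n_iters):
            moved = False
            for d in (np.array([step, 0.0]), np.array([-step, 0.0]),
                      np.array([0.0, step]), np.array([0.0, -step])):
                if merit(x + d) < merit(x):
                    x = x + d
                    moved = True
            if not moved:
                step *= 0.5               # refine the search step
        if merit(x) < best_f:
            best_x, best_f = x, merit(x)
    return best_x, best_f
```

A merit value near zero certifies that a root of the original system has been found, which is how convergence of the metaheuristic can be checked without any derivative information.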
Abstract:
A good verification strategy should bring the simulation and real operating environments closer together. In this paper we describe a system-level co-verification strategy that uses a common flow for functional simulation, timing simulation, and functional debug. This last step requires a BST (boundary-scan test) infrastructure, now widely available on commercial devices, especially on FPGAs with medium/large pin counts.
Abstract:
The dynamics of international business and its multidimensional nature require an understanding of the complexities of different contexts dictated by cultural differences between countries. The purpose of this paper is to study in depth how small and medium-sized companies develop their international marketing mix strategy in their overseas subsidiaries. We use the theoretical construct of Hofstede (1980, 1991) in the dimensions of Power Distance (PD), Uncertainty Avoidance (UA), Individualism (IND), Masculinity (MASC), and Long-Term Orientation (LTO) to assess the cross-cultural differences between countries and the business practices, in order to analyze the adaptation or standardization of the international marketing mix strategy of foreign Portuguese subsidiaries. Our study uses an exploratory and qualitative methodology. We conducted semi-structured interviews in order to achieve a good understanding of the international marketing mix strategy of four companies from different sectors. Our results show that national cultural differences have a great influence on the marketing strategy of the subsidiary. The business practice adjustments in the subsidiary company that proved to be necessary conditions for its performance are conducted through the products' augmented offerings, concerning the characteristics of the product, design, and brand name, in order to meet the requirements and specificities of the subsidiary's host country.
Abstract:
According to the hedonic price method, the price of a good is related to the characteristics or the services it provides. Within this framework, the aim of this study is to examine the effect on room rates of different characteristics of hotels in and around the city of Porto, such as star category, size, room and service quality, hotel facilities, and location. A hedonic price function was estimated using data for 51 hotels. The results make it possible to identify the attributes that are important to consumers and hoteliers, and to what extent. This information can be used by hotel managers to define a pricing strategy, and can be helpful in new investment decisions.
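A hedonic price function of this kind is typically estimated by regressing the log of the room rate on the attribute vector, so each coefficient reads as an approximate percentage premium for that attribute. The sketch below uses synthetic data; the attributes and the "true" premia are assumptions for illustration, not the study's estimates.

```python
import numpy as np

# Synthetic illustration only: none of these premia come from the study.
rng = np.random.default_rng(1)
n = 51                                    # same sample size as the study
stars = rng.integers(2, 6, size=n).astype(float)      # star category
rooms = rng.integers(20, 300, size=n).astype(float)   # hotel size
seafront = rng.integers(0, 2, size=n).astype(float)   # location dummy

# Assumed "true" attribute premia used to simulate log room rates
log_rate = (3.0 + 0.25 * stars + 0.001 * rooms + 0.15 * seafront
            + 0.05 * rng.standard_normal(n))

# Hedonic price function: OLS of log(rate) on the attribute vector
X = np.column_stack([np.ones(n), stars, rooms, seafront])
beta, *_ = np.linalg.lstsq(X, log_rate, rcond=None)
# beta[1] ~ rate premium per extra star; beta[3] ~ seafront location premium
```

Reading the fitted coefficients back against consumers' willingness to pay is what lets hoteliers translate the regression into a pricing strategy.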
Abstract:
The use of demand response programs enables the adequate use of the resources of small and medium players, bringing high benefits to the smart grid and increasing its efficiency. One of the difficulties in proceeding with this paradigm is the lack of intelligence in the management of small and medium-sized players. In order to make demand response programs a feasible solution, it is essential that small and medium players have efficient energy management and a fair optimization mechanism to decrease consumption without a heavy loss of comfort, making it acceptable to the users. This paper addresses the application of real-time pricing in a house that uses an intelligent optimization module involving artificial neural networks.
Abstract:
The use of renewables has been increasing in several countries around the world, namely in Europe. Wind power is generally the largest renewable resource, with very specific characteristics concerning its variability and the inherent impacts on power systems and electricity market operation. This paper focuses on the Portuguese context of renewables use, including wind power. The work presented here includes the use of a real-time pricing methodology developed by the authors, aiming at reducing electricity consumption at moments of unexpectedly low wind power. A more specific example of the application of real-time pricing is demonstrated for the minimization of operation costs in a distribution network. When facing lower wind power generation than expected from the day-ahead forecast, demand response is used in order to minimize the impacts of such a change in wind availability. In this way, consumers actively participate in regulation up and spinning reserve ancillary services through demand response programs.
Abstract:
Recent changes of paradigm in power systems have opened the opportunity for the active participation of new players. Small and medium players gain new opportunities by participating in demand response programs. This paper explores optimal resource scheduling at two distinct levels. First, the network operator, facing large wind power variations, makes use of real-time pricing to induce consumers to meet wind power variations. Then, at the consumer level, each load is managed according to the consumer's preferences. The two-level resource schedule has been implemented in a real-time simulation platform, which uses hardware for the control of the consumers' loads. The illustrative example includes a situation with a large shortfall of wind power and focuses on a consumer with 18 loads.
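The consumer-level step can be sketched as a simple preference-ordered curtailment: when the operator's price signal asks for a consumption cut, loads are switched off in order of the consumer's priorities until the target reduction is met. The load names, powers, and priorities below are hypothetical, and the greedy rule is a minimal illustration rather than the paper's scheduling algorithm.

```python
def curtail(loads, target_kw):
    """loads: list of (name, power_kw, priority); lower priority = shed first.
    Greedily sheds loads until the requested reduction is reached.
    Returns the names of the loads switched off and the total reduction."""
    shed, total = [], 0.0
    for name, power, _priority in sorted(loads, key=lambda l: l[2]):
        if total >= target_kw:
            break
        shed.append(name)
        total += power
    return shed, total

# Hypothetical household loads: (name, power in kW, comfort priority)
loads = [("water_heater", 1.5, 1), ("ac_unit", 2.0, 2),
         ("fridge", 0.2, 9), ("lighting", 0.3, 5)]
off, reduced = curtail(loads, target_kw=3.0)
```

Keeping high-priority loads (such as the fridge) for last is what bounds the comfort loss while still delivering the reduction the operator requested.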
Abstract:
A backside protein-surface imprinting process is presented herein as a novel way to generate specific synthetic antibody materials. The template is covalently bonded to a carboxylated-PVC supporting film previously cast on gold, left to interact with charged monomers, and then surrounded by another thick polymer. This polymer is then covalently attached to a transducing element, and the backside of this structure (supporting film plus template) is removed as a regular “tape”. The new sensing layer is exposed after full template removal, showing a high density of re-binding positions, as evidenced by SEM. To ensure that the templates had been efficiently removed, this re-binding layer was further cleaned with a proteolytic enzyme and a solution washout. The final material was named MAPS, the back-side reading of SPAM, because it acts as a back-side imprinting of that recent approach. It was able to generate, for the first time, a specific response to a complex biomolecule from a synthetic material. Non-imprinted materials (NIMs) were also produced as blanks and were used as a control of the imprinting process. All chemical modifications were followed by electrochemical techniques. This was done on a supporting film and transducing element of both MAPS and NIM. Only the MAPS-based device responded to oxLDL, and the sensing layer was insensitive to other serum proteins, such as myoglobin and haemoglobin. Linear behaviour between log(C, μg mL−1) and charge transfer resistance (RCT, Ω) was observed by electrochemical impedance spectroscopy (EIS). Calibrations made in Fetal Calf Serum (FCS) were linear from 2.5 to 12.5 μg mL−1 (RCT = 946.12 × log C + 1590.7) with an R-squared of 0.9966. Overall, these were promising results towards the design of materials acting close to natural antibodies and applicable to practical uses of clinical interest.
Abstract:
Stroke is one of the most common conditions requiring rehabilitation, and its motor impairments are a major cause of permanent disability. Hemiparesis is observed in 80% of patients after acute stroke. Neuroimaging studies have shown that real and imagined movements are similar in terms of brain activation, supplying evidence that those similarities are based on the same process. Within this context, the combination of mental practice (MP) with physical and occupational therapy appears to be a natural complement based on neurorehabilitation concepts. Our study seeks to investigate whether MP for stroke rehabilitation of the upper limbs is an effective adjunct therapy. Searches of PubMed (Medline), ISI Web of Knowledge (Institute for Scientific Information), and SciELO (Scientific Electronic Library) were completed on 20 February 2015. Data were collected on the following variables: sample size, type of supervision, configuration of mental practice, setting of the physical practice (intensity, number of sets and repetitions, duration of contractions, rest interval between sets, weekly and total duration), measures of sensorimotor deficits used in the main studies, and significant results. Random effects models were used that take into account the variance within and between studies. Seven articles were selected. There was no statistically significant difference between the two groups (MP vs. control), with a pooled estimate of −0.6 (95% CI: −1.27 to 0.04) for upper limb motor restoration after stroke. The present meta-analysis concluded that MP is not effective as an adjunct therapeutic strategy for upper limb motor restoration after stroke.
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as the optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to the interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make the efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
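The general shape of interference- and power-aware placement can be sketched as scoring each candidate host with an estimated performance deviation plus an energy term, then placing the VM on the lowest-score host. The deviation model, weights, and host fields below are toy assumptions for illustration; they are not the estimator or the scheduling algorithm proposed in the paper.

```python
def placement_score(host, vm, alpha=0.7):
    # Toy interference estimate: CPU-bound VMs suffer on CPU-saturated
    # hosts, network-bound VMs on network-saturated hosts.
    interference = (host["cpu_util"] * vm["cpu_share"]
                    + host["net_util"] * vm["net_share"])
    # Crude proxy for post-placement power draw of the host
    power = host["cpu_util"] + vm["cpu_share"]
    return alpha * interference + (1 - alpha) * power

def place(vm, hosts):
    """Pick the host minimizing the combined interference/power score."""
    return min(hosts, key=lambda h: placement_score(h, vm))

# Hypothetical cluster: h1 is CPU-saturated, h2 is network-saturated.
hosts = [{"name": "h1", "cpu_util": 0.9, "net_util": 0.1},
         {"name": "h2", "cpu_util": 0.2, "net_util": 0.8}]
cpu_vm = {"cpu_share": 0.9, "net_share": 0.1}   # CPU-intensive workload
best = place(cpu_vm, hosts)
```

Under this scoring, a CPU-intensive VM avoids the CPU-saturated host, which is the intuition behind steering dissimilar workloads onto the same hardware to limit interference.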
Abstract:
Innovation is recognized by academics and practitioners as an essential competitive enabler for any company that wants to survive, remain competitive, and grow. Investments in R&D tasks have not always brought the expected results, but that does not mean the outcomes would not be useful to other companies in the same business area, or even in another area. Thus, there is much knowledge already available in the market that can be helpful to some and profitable to others. Ideas and expertise can be found outside a company's boundaries and can also be exported from within. Information, knowledge, experience, and wisdom are already available among the millions of human beings on this planet; the challenge is to use them, through a network, to produce new ideas and tips that can be useful to a company at lower cost. This was the reason for the emergence of the area of crowdsourcing innovation. Crowdsourcing innovation is a way of using Web 2.0 tools to generate new ideas through the heterogeneous knowledge available in the global network of highly qualified individuals with easy access to information and technology. A crowdsourcing innovation broker is thus an organization that mediates the communication and relationship between seekers (companies that aspire to solve some problem or to take advantage of a business opportunity) and a crowd that is prone to give ideas based on its knowledge, experience, and wisdom. This paper presents a literature review of models of open innovation, crowdsourcing innovation, and technology and knowledge intermediaries, and discusses this new phenomenon as a way to leverage the innovation capacity of enterprises. Finally, the paper outlines a research design agenda for explaining the crowdsourcing innovation brokering phenomenon, exploring its players, main functions, value creation process, and knowledge creation, in order to define a knowledge metamodel of such intermediaries.