Abstract:
This paper focuses on the scheduling of tasks with hard and soft real-time constraints in open and dynamic real-time systems. It starts by presenting a capacity sharing and stealing (CSS) strategy that supports the coexistence of guaranteed and non-guaranteed bandwidth servers to efficiently handle soft tasks' overloads by making additional capacity available from two sources: (i) reclaiming unused reserved capacity when jobs complete in less than their budgeted execution time, and (ii) stealing reserved capacity from inactive non-isolated servers used to schedule best-effort jobs. CSS is then combined with the concept of bandwidth inheritance to efficiently exchange reserved bandwidth among sets of inter-dependent tasks that share resources and exhibit precedence constraints, assuming no prior information on critical sections and computation times is available. The proposed Capacity Exchange Protocol (CXP) achieves better performance and lower overhead than other available solutions and introduces a novel approach to integrating precedence constraints among tasks in open real-time systems.
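As a rough illustration of the two extra-capacity sources listed above, the following Python sketch models capacity reclaiming and stealing with a deliberately simplified server model; the `Server` class and function names are hypothetical, and periods, deadlines and the actual scheduler are omitted.

```python
from dataclasses import dataclass

@dataclass
class Server:
    """Illustrative bandwidth server (hypothetical model, not the paper's API)."""
    name: str
    budget: float    # reserved execution capacity still available
    isolated: bool   # isolated servers keep their reserve; non-isolated ones may be stolen from
    active: bool = True

def reclaim_capacity(server: Server, actual_exec: float) -> float:
    """Source (i): recover the residual budget when a job finishes early."""
    residual = max(server.budget - actual_exec, 0.0)
    server.budget -= residual
    return residual

def steal_capacity(servers: list[Server]) -> float:
    """Source (ii): take the reserve of inactive, non-isolated (best-effort) servers."""
    stolen = 0.0
    for s in servers:
        if not s.active and not s.isolated:
            stolen += s.budget
            s.budget = 0.0
    return stolen

# An overloaded soft task can consume reclaimed capacity first, then stolen capacity.
servers = [Server("guaranteed", budget=5.0, isolated=True),
           Server("best_effort", budget=3.0, isolated=False, active=False)]
extra = reclaim_capacity(servers[0], actual_exec=4.2) + steal_capacity(servers)
print(f"extra capacity available: {extra:.1f}")  # 0.8 reclaimed + 3.0 stolen
```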
Abstract:
The dye-sensitized solar cell (DSSC) is a promising solution to global energy and environmental problems because of its cleanness, low cost, high efficiency, good durability, and easy fabrication. However, enhancing the efficiency of the DSSC is still an important issue. Here we devise a bifacial DSSC based on a transparent polyaniline (PANI) counter electrode (CE). Owing to sunlight irradiating simultaneously from the front and rear sides, more dye molecules are excited and more carriers are generated, which results in an enhancement of the short-circuit current density and therefore of the overall conversion efficiency. The photoelectric properties of PANI can be improved by modification with 4-aminothiophenol (4-ATP). The bifacial DSSC with the 4-ATP/PANI CE achieves a light-to-electric energy conversion efficiency of 8.35%, an increase of about 24.6% compared to the same DSSC irradiated from the front side only. This new concept, along with the promising results, provides a new approach for enhancing the photovoltaic performance of solar cells.
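For context, the conversion efficiency of a cell follows the standard photovoltaic relation η = Jsc × Voc × FF / Pin, so a bifacial gain in short-circuit current density Jsc translates directly into a higher η. The sketch below is purely illustrative: the Jsc, Voc and FF values are hypothetical (the abstract reports only the efficiencies), chosen so the numbers land near the quoted 8.35% and ~24.6% gain.

```python
def dssc_efficiency(jsc_ma_cm2: float, voc_v: float, ff: float,
                    pin_mw_cm2: float = 100.0) -> float:
    """Photovoltaic conversion efficiency eta = Jsc*Voc*FF/Pin, in percent
    (standard AM1.5 illumination of 100 mW/cm^2 assumed)."""
    return 100.0 * (jsc_ma_cm2 * voc_v * ff) / pin_mw_cm2

# Hypothetical values: bifacial irradiation mainly raises Jsc, lifting eta accordingly.
print(f"front side only: {dssc_efficiency(14.0, 0.72, 0.665):.2f} %")  # ~6.70 %
print(f"bifacial       : {dssc_efficiency(17.4, 0.72, 0.665):.2f} %")  # ~8.33 %
```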
Abstract:
In practice, robotic manipulators exhibit some degree of unwanted vibration. The advent of lightweight arm manipulators, mainly in the aerospace industry where weight is an important issue, leads to the problem of intense vibrations. On the other hand, robots interacting with the environment often generate impacts that propagate through the mechanical structure and also produce vibrations. In order to analyze these phenomena, a robot signal acquisition system was developed. The manipulator motion produces vibrations, either from the structural modes or from end-effector impacts. The instrumentation system acquires signals from several sensors that capture the joint positions, mass accelerations, forces and moments, and electrical currents in the motors. Afterwards, an analysis package, running off-line, reads the data recorded by the acquisition system and extracts the signal characteristics.

Due to the multiplicity of sensors, the data obtained can be redundant, because the same type of information may be seen by two or more sensors. Given the price of the sensors, this aspect can be exploited to reduce the cost of the system. On the other hand, the placement of the sensors is an important issue in order to obtain suitable signals of the vibration phenomenon. Moreover, the study of these issues can help in the design optimization of the acquisition system. In this line of thought, a sensor classification scheme is presented.

Several authors have addressed the subject of sensor classification schemes. White (White, 1987) presents a flexible and comprehensive categorizing scheme that is useful for describing and comparing sensors. The author organizes the sensors according to several aspects: measurands, technological aspects, detection means, conversion phenomena, sensor materials and fields of application. Michahelles and Schiele (Michahelles & Schiele, 2003) systematize the use of sensor technology. They identified several dimensions of sensing that represent the sensing goals for physical interaction. A conceptual framework is introduced that allows categorizing existing sensors and evaluating their utility in various applications. This framework not only guides application designers in choosing meaningful sensor subsets, but can also inspire new systems and lead to the evaluation of existing applications.

Today's technology offers a wide variety of sensors. In order to use all the data from this diversity of sensors, a framework for integration is needed. Sensor fusion, fuzzy logic, and neural networks are often mentioned when dealing with the problem of combining information from several sensors to get a more general picture of a given situation. The study of data fusion has been receiving considerable attention (Esteban et al., 2005; Luo & Kay, 1990). A survey of the state of the art in sensor fusion for robotics can be found in (Hackett & Shah, 1990). Henderson and Shilcrat (Henderson & Shilcrat, 1984) introduced the concept of the logic sensor, which defines an abstract specification of the sensors to integrate in a multisensor system. Recent developments in micro-electro-mechanical sensors (MEMS) with wireless communication capabilities enable sensor networks with interesting capabilities. This technology has been applied in several areas (Arampatzis & Manesis, 2005), including robotics. Cheekiralla and Engels (Cheekiralla & Engels, 2005) propose a classification of wireless sensor networks according to their functionalities and properties.
This paper presents the development of a sensor classification scheme based on the frequency spectrum of the signals and on statistical metrics. Bearing these ideas in mind, this paper is organized as follows. Section 2 briefly describes the robotic system enhanced with the instrumentation setup. Section 3 presents the experimental results. Finally, section 4 draws the main conclusions and points out future work.
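As a loose illustration of classifying sensors by the frequency content of their signals, the sketch below correlates normalized magnitude spectra to flag redundant channels; the choice of feature (unit-norm FFT magnitude) and the use of plain correlation are assumptions made for illustration, not the exact procedure of the chapter.

```python
import numpy as np

def spectral_signature(signal: np.ndarray) -> np.ndarray:
    """Unit-norm magnitude spectrum: one possible frequency-domain feature."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)

def redundancy_matrix(signals: list[np.ndarray]) -> np.ndarray:
    """Pairwise correlation of spectral signatures; high values flag redundant sensors."""
    features = np.vstack([spectral_signature(s) for s in signals])
    return np.corrcoef(features)

# Two near-identical vibration channels plus one unrelated noise-like signal.
t = np.linspace(0.0, 1.0, 1024)
rng = np.random.default_rng(0)
signals = [np.sin(2 * np.pi * 50 * t),
           np.sin(2 * np.pi * 50 * t) + 0.05 * rng.standard_normal(t.size),
           rng.standard_normal(t.size)]
print(np.round(redundancy_matrix(signals), 2))  # first two rows/cols correlate strongly
```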
Abstract:
This paper analyzes the signals captured during impacts and vibrations of a mechanical manipulator. To test the impacts, a flexible beam is clamped to the end-effector of a manipulator that is programmed so that the beam moves against a rigid surface. Eighteen signals are captured and their correlations are calculated. A sensor classification scheme based on the multidimensional scaling technique is presented.
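A minimal sketch of the multidimensional scaling step, assuming the measured correlations are first converted into dissimilarities; it relies on scikit-learn's MDS, which may differ from the authors' own implementation, and the toy 4×4 matrix merely stands in for the eighteen captured signals.

```python
import numpy as np
from sklearn.manifold import MDS

def mds_map(corr: np.ndarray, seed: int = 0) -> np.ndarray:
    """Embed sensors in 2-D from a correlation matrix: nearby points ~ similar signals."""
    dissimilarity = 1.0 - np.abs(corr)  # one common correlation-to-distance conversion
    model = MDS(n_components=2, dissimilarity="precomputed", random_state=seed)
    return model.fit_transform(dissimilarity)

# Toy correlation matrix: sensors {0, 1} and {2, 3} form two redundant groups.
corr = np.array([[1.0, 0.9, 0.1, 0.2],
                 [0.9, 1.0, 0.2, 0.1],
                 [0.1, 0.2, 1.0, 0.8],
                 [0.2, 0.1, 0.8, 1.0]])
print(np.round(mds_map(corr), 2))
```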
Abstract:
Solving systems of nonlinear equations is a problem of particular importance, since such systems emerge from the mathematical modeling of real problems that arise naturally in many branches of engineering and the physical sciences. The problem can be naturally reformulated as a global optimization problem. In this paper, we show that a metaheuristic called Directed Tabu Search (DTS) [16] is able to converge to the solutions of a set of problems for which the fsolve function of MATLAB® failed to converge. We also show the effect of the dimension of the problem on the performance of the DTS.
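The reformulation mentioned above maps F(x) = 0 to the global minimization of ‖F(x)‖². The sketch below illustrates this on a toy two-equation system; since DTS is not a stock library routine, SciPy's differential evolution serves as a stand-in metaheuristic, and the system and bounds are illustrative rather than taken from the paper's test set.

```python
import numpy as np
from scipy.optimize import fsolve, differential_evolution

def F(x):
    """Toy nonlinear system F(x) = 0 (illustrative, not from the paper)."""
    return np.array([x[0]**2 + x[1]**2 - 4.0,     # circle of radius 2
                     np.exp(x[0]) + x[1] - 1.0])  # exponential curve

def merit(x):
    """Global-optimization reformulation: squared residual norm ||F(x)||^2."""
    r = F(x)
    return float(r @ r)

root = fsolve(F, x0=[1.0, 1.0])                                 # local Newton-type solver
best = differential_evolution(merit, bounds=[(-3.0, 3.0)] * 2)  # metaheuristic stand-in for DTS
print("fsolve        :", root, "residual:", merit(root))
print("metaheuristic :", best.x, "residual:", best.fun)
```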
Abstract:
A good verification strategy should bring the simulation and real operating environments close together. In this paper we describe a system-level co-verification strategy that uses a common flow for functional simulation, timing simulation and functional debug. This last step requires a BST infrastructure, now widely available on commercial devices, especially on FPGAs with medium/large pin counts.
Abstract:
The dynamics of international business and its multidimensional nature require an understanding of the complexities of different contexts dictated by cultural differences between countries. The purpose of this paper is to study, in depth, how small and medium-sized companies develop their international marketing mix strategy in their overseas subsidiaries. We use the theoretical construct of Hofstede (1980, 1991) in the dimensions of Power Distance (PD), Uncertainty Avoidance (UA), Individualism (IND), Masculinity (MASC) and Long-Term Orientation (LTO) to assess the cross-cultural differences between countries and the business practices, in order to analyze the adaptation or standardization of the international marketing mix strategy of foreign Portuguese subsidiaries. Our study uses an exploratory and qualitative methodology. We conducted semi-structured interviews in order to achieve a good understanding of the international marketing mix strategy of four companies from different sectors. Our results show that national cultural differences have great influence on the marketing strategy of the subsidiary. The business-practice adjustments in the subsidiary that proved necessary for its performance are carried out through augmented product offerings, concerning product characteristics, design and brand name, in order to meet the requirements and specificities of the subsidiary's host country.
Abstract:
A backside protein-surface imprinting process is presented herein as a novel way to generate specific synthetic antibody materials. The template is covalently bonded to a carboxylated-PVC supporting film previously cast on gold, left to interact with charged monomers, and then surrounded by another thick polymer. This polymer is covalently attached to a transducing element, and the backside of the structure (supporting film plus template) is removed as a regular "tape". The new sensing layer is exposed after full template removal, showing a high density of re-binding positions, as evidenced by SEM. To ensure that the templates were efficiently removed, this re-binding layer was further cleaned with a proteolytic enzyme and solution washout. The final material was named MAPS, the back-to-front reading of SPAM, because it acts as a back-side imprinting of that recent approach. It was able to generate, for the first time, a specific response to a complex biomolecule from a synthetic material. Non-imprinted materials (NIMs) were also produced as blanks and used as a control of the imprinting process. All chemical modifications, on the supporting film and transducing element of both MAPS and NIM, were followed by electrochemical techniques. Only the MAPS-based device responded to oxLDL, and the sensing layer was insensitive to other serum proteins, such as myoglobin and haemoglobin. Linear behaviour between log(C, μg mL−1) and charge-transfer resistance (RCT, Ω) was observed by electrochemical impedance spectroscopy (EIS). Calibrations made in Fetal Calf Serum (FCS) were linear from 2.5 to 12.5 μg mL−1 (RCT = 946.12 × log C + 1590.7) with an R-squared of 0.9966. Overall, these are promising results towards the design of materials that behave like natural antibodies and can be applied in practical contexts of clinical interest.
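Because the abstract reports the calibration RCT = 946.12 × log C + 1590.7 over 2.5–12.5 μg mL−1, the line can be inverted to estimate an oxLDL concentration from a measured charge-transfer resistance. A minimal sketch, assuming log denotes log10 and that readings outside the calibrated range are rejected:

```python
SLOPE, INTERCEPT = 946.12, 1590.7   # calibration in FCS reported in the abstract
LINEAR_RANGE = (2.5, 12.5)          # validated span, in ug/mL

def concentration_from_rct(rct_ohm: float) -> float:
    """Invert R_CT = 946.12 * log10(C) + 1590.7 to estimate concentration."""
    c = 10.0 ** ((rct_ohm - INTERCEPT) / SLOPE)
    if not LINEAR_RANGE[0] <= c <= LINEAR_RANGE[1]:
        raise ValueError(f"{c:.2f} ug/mL lies outside the calibrated range")
    return c

# Mid-range check: log10(7.5) ~ 0.875, so R_CT ~ 946.12 * 0.875 + 1590.7 ~ 2418.6 ohm.
print(f"{concentration_from_rct(2418.6):.2f} ug/mL")  # ~7.50
```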
Abstract:
This paper describes the TURTLE project, which aims to develop sub-systems with the capability of deep-sea long-term presence. Our motivation is to produce new energy-efficient robotic ascent and descent technologies to be incorporated in robotic vehicles used by civil and military stakeholders for underwater operations. TURTLE contributes to sustainable presence and operations on the sea bottom. Long-term presence on the sea bottom, and increased awareness and operation capabilities underwater, in particular in the benthic deeps, can only be achieved through the use of advanced technologies, leading to automation of operations, reducing operational costs and increasing the efficiency of human activity.
Abstract:
Stroke is one of the most common conditions requiring rehabilitation, and its motor impairments are a major cause of permanent disability. Hemiparesis is observed in 80% of patients after acute stroke. Neuroimaging studies have shown that real and imagined movements produce similar brain activation, supplying evidence that both rely on the same underlying process. Within this context, the combination of mental practice (MP) with physical and occupational therapy appears to be a natural complement based on neurorehabilitation concepts. Our study seeks to investigate whether MP for stroke rehabilitation of the upper limbs is an effective adjunct therapy. Searches of PubMed (Medline), ISI Web of Knowledge (Institute for Scientific Information) and SciELO (Scientific Electronic Library Online) were completed on 20 February 2015. Data were collected on the following variables: sample size, type of supervision, configuration of the mental practice, setting of the physical practice (intensity, number of sets and repetitions, duration of contractions, rest interval between sets, weekly and total duration), measures of sensorimotor deficits used in the main studies, and significant results. Random-effects models were used that take into account the variance within and between studies. Seven articles were selected. There was no statistically significant difference between the two groups (MP vs. control), with a pooled effect size of -0.6 (95% CI: -1.27 to 0.04) for upper-limb motor restoration after stroke. The present meta-analysis concluded that MP is not effective as an adjunct therapeutic strategy for upper-limb motor restoration after stroke.
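The pooling step referred to above can be sketched with the DerSimonian-Laird random-effects estimator, which weights each study by the inverse of its within-study variance plus an estimated between-study variance. The per-study effect sizes and variances below are hypothetical placeholders, not the data of the seven reviewed trials.

```python
import numpy as np

def dersimonian_laird(effects: np.ndarray, variances: np.ndarray) -> tuple[float, float]:
    """Pooled effect and standard error under a random-effects model."""
    w = 1.0 / variances                            # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)         # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max((q - (len(effects) - 1)) / c, 0.0)  # between-study variance estimate
    w_re = 1.0 / (variances + tau2)                # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    return pooled, np.sqrt(1.0 / np.sum(w_re))

# Hypothetical effects/variances for seven studies (illustration only).
effects = np.array([-0.9, -0.2, -1.1, 0.1, -0.7, -0.4, -0.8])
variances = np.array([0.20, 0.15, 0.30, 0.10, 0.25, 0.18, 0.22])
est, se = dersimonian_laird(effects, variances)
print(f"pooled effect = {est:.2f}, 95% CI = [{est - 1.96*se:.2f}, {est + 1.96*se:.2f}]")
```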
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed, due to interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance-deviation estimator and a scheduling algorithm to guide resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
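A heavily reduced sketch of an interference- and power-aware placement decision follows; the scoring function, penalty constants and `Host` fields are assumptions made for illustration, and do not reproduce the paper's performance-deviation estimator or its scheduling algorithm.

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    """Illustrative physical server (hypothetical fields, not the paper's model)."""
    name: str
    cpu_free: float
    vms: list = field(default_factory=list)  # (workload_kind, cpu_demand) tuples

def placement_score(host: Host, kind: str, power_weight: float = 0.5) -> float:
    """Lower is better: assumed interference from same-kind neighbours plus a
    power penalty for waking an idle host (both constants are illustrative)."""
    same_kind = sum(1 for k, _ in host.vms if k == kind)
    interference = 0.2 * same_kind
    power = 1.0 if not host.vms else 0.0
    return interference + power_weight * power

def place(hosts: list[Host], kind: str, cpu: float) -> Host:
    """Greedily place one VM on the feasible host with the lowest score."""
    feasible = [h for h in hosts if h.cpu_free >= cpu]
    best = min(feasible, key=lambda h: placement_score(h, kind))
    best.vms.append((kind, cpu))
    best.cpu_free -= cpu
    return best

hosts = [Host("h1", cpu_free=8.0), Host("h2", cpu_free=8.0)]
for kind in ["cpu", "net", "cpu"]:
    print(kind, "->", place(hosts, kind, cpu=2.0).name)
```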
Abstract:
Innovation is recognized by academics and practitioners as an essential competitive enabler for any company that wants to survive, remain competitive and grow. Investments in R&D tasks have not always brought the expected results, but that does not mean the outcomes would not be useful to other companies in the same business area or even in another one. Thus, there is much knowledge already available in the market that can be helpful to some and profitable to others. Ideas and expertise can be found outside a company's boundaries and can also be exported from within. Information, knowledge, experience and wisdom are already available among the millions of human beings on this planet; the challenge is to use them, through a network, to produce new ideas and tips that can be useful to a company at lower cost. This was the reason for the emergence of the area of crowdsourcing innovation. Crowdsourcing innovation is a way of using Web 2.0 tools to generate new ideas through the heterogeneous knowledge available in a global network of highly qualified individuals with easy access to information and technology. A crowdsourcing innovation broker is thus an organization that mediates the communication and relationship between the seekers - companies that aspire to solve some problem or to take advantage of a business opportunity - and a crowd that is prone to give ideas based on its knowledge, experience and wisdom. This paper reviews the literature on models of open innovation, crowdsourcing innovation, and technology and knowledge intermediaries, and discusses this new phenomenon as a way to leverage the innovation capacity of enterprises. Finally, the paper outlines a research design agenda for explaining the crowdsourcing innovation brokering phenomenon, exploring its players, main functions, value creation process, and knowledge creation, in order to define a knowledge metamodel of such intermediaries.