65 results for Schermi, adattativi, pervasive, kinect, framework, ingegnerizzazione, OpenNI


Relevance:

20.00%

Publisher:

Abstract:

In the light of the Portuguese legal system, the cooperative enterprise may include an enterprise carried out by a subsidiary, provided certain requirements are met. The aim of this paper is to reflect on the legal framework of the relationship between the cooperative and the subsidiary. Several problems are addressed: (i) How should such a relationship be qualified when it corresponds to mere investments made by the cooperative? Should it be classified as non-member cooperative transactions or as extraordinary activities? (ii) How should such a relationship be qualified when it relates to the development of preparatory or complementary activities for the economic activity developed between the cooperative and its members? May we speak, in this situation, of a concept of “indirect mutuality”, as provided in other legal systems? (iii) How should we classify, and what is the regime of, the economic results arising from the activity developed by the subsidiary? We conclude by advocating: (i) that the cooperative enterprise may include an enterprise carried out by a subsidiary if this is deemed necessary to satisfy the interests of the members; (ii) the inadmissibility of the concept of “indirect mutuality”; (iii) the inadequacy of qualifying the legal relationship between the cooperative and the subsidiary; (iv) the application, to the economic results coming from the activity developed by the subsidiary, of the regime provided for in the Portuguese Cooperative Code for the results of non-member cooperative transactions; (v) that the economic results coming from the activity developed by the subsidiary cannot be appropriated by individual cooperator members and should therefore be allocated to indivisible reserves.

Relevance:

20.00%

Publisher:

Abstract:

The last decade has witnessed a major shift towards the deployment of embedded applications on multi-core platforms. However, real-time applications have not been able to fully benefit from this transition, as the computational gains offered by multi-cores are often offset by performance degradation due to shared resources, such as main memory. To efficiently use multi-core platforms for real-time systems, it is hence essential to tightly bound the interference when accessing shared resources. Although there has been much recent work in this area, a remaining key problem is to address the diversity of memory arbiters in the analysis to make it applicable to a wide range of systems. This work handles diverse arbiters by proposing a general framework to compute the maximum interference caused by the shared memory bus and its impact on the execution time of the tasks running on the cores, considering different bus arbiters. Our novel approach clearly demarcates the arbiter-dependent and arbiter-independent stages in the analysis of these upper bounds. The arbiter-dependent phase takes the arbiter and the task memory-traffic pattern as inputs and produces a model of the availability of the bus to a given task. Then, based on the availability of the bus, the arbiter-independent phase determines the worst-case request-release scenario that maximizes the interference experienced by the tasks due to the contention for the bus. We show that the framework addresses the diversity problem by applying it to a memory bus shared by a fixed-priority arbiter, a time-division multiplexing (TDM) arbiter, and an unspecified work-conserving arbiter, using applications from the MediaBench test suite. We also experimentally evaluate the quality of the analysis by comparing it with a state-of-the-art TDM analysis approach, consistently showing a considerable reduction in maximum interference.
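
As a rough illustration of the two-phase structure described above (not the authors' actual analysis), the following Python sketch separates an arbiter-dependent availability model from an arbiter-independent interference bound, assuming a simple TDM arbiter with equal slots and a task issuing a known number of memory requests; the function names and the coarse per-request bound are illustrative assumptions.

# Illustrative two-phase interference bound (hedged sketch, not the paper's analysis).

def tdm_availability(num_cores: int, slot_cycles: int) -> int:
    """Arbiter-dependent phase: for a TDM bus with equal slots, a request issued
    by one core may have to wait for the slots of all other cores, so the
    worst-case stall per request is (num_cores - 1) * slot_cycles."""
    return (num_cores - 1) * slot_cycles

def interference_bound(num_requests: int, per_request_stall: int) -> int:
    """Arbiter-independent phase: with every request released at the worst
    possible instant, total bus interference is the per-request stall
    accumulated over all memory requests of the task."""
    return num_requests * per_request_stall

# Example: 4 cores, 10-cycle TDM slots, a task issuing 2000 memory requests.
stall = tdm_availability(num_cores=4, slot_cycles=10)                 # 30 cycles per request
print(interference_bound(num_requests=2000, per_request_stall=stall))  # 60000 cycles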

Relevance:

20.00%

Publisher:

Abstract:

International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2015). 7 to 9, Apr, 2015. Singapore.

Relevance:

20.00%

Publisher:

Abstract:

IEEE International Conference on Pervasive Computing and Communications (PerCom). 23 to 26, Mar, 2015, PhD Forum. Saint Louis, U.S.A.

Relevance:

20.00%

Publisher:

Abstract:

Currently, due to the widespread use of computers and the Internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test a hardware product under development. Although this is an attractive solution (low cost and an easy, fast way to carry out coursework), it has major disadvantages. As everything is currently being done with or in a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays; when using spreadsheets to build graphics, instead of a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that allows it to be used by different courses in which computers are part of the teaching/learning process, giving students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students can access over Ethernet or connected directly to the student's computer/laptop. The sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the Internet, these tools can also be shared between different schools, allowing sensors that are not available in a given school to be used by obtaining their values from other places that share them. Students in more advanced years, with (theoretically) more know-how, can use courses with some affinity to electronics development to build new sensor modules and expand the framework further. The final solution is very interesting: low cost, simple to develop, and flexible, since the same materials can be reused in several courses, bringing real-world data into the students' computer work.
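
To make the idea of pulling real sensor readings into coursework concrete, here is a minimal Python sketch that queries a central sensor server over HTTP and turns the readings into a CSV dataset for a spreadsheet or numerical-analysis exercise; the server URL, endpoint and JSON layout are purely hypothetical, since the abstract does not specify a protocol beyond Ethernet/serial/USB access.

# Hypothetical client for a classroom sensor server (sketch only; the real
# framework's protocol and endpoints are not specified in the abstract).
import csv
import json
from urllib.request import urlopen

SERVER = "http://sensor-server.example.edu/api/temperature?hours=24"  # assumed endpoint

with urlopen(SERVER) as response:
    readings = json.load(response)  # assumed format: [{"time": "...", "value": 21.3}, ...]

# Save as CSV so students can open the same real data in a spreadsheet
# or load it into a numerical-analysis tool as an array.
with open("room_temperature.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "temperature_celsius"])
    for r in readings:
        writer.writerow([r["time"], r["value"]])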

Relevance:

20.00%

Publisher:

Abstract:

OCEANS, 2001. MTS/IEEE Conference and Exhibition (Volume: 2)

Relevance:

20.00%

Publisher:

Abstract:

A control framework enabling the automated maneuvering of a Remotely Operated Vehicle (ROV) is presented. The control architecture is structured according to the principle of composition of vehicle motions from a minimal set of elemental maneuvers that are designed and verified independently. The principled approach is based on distributed hybrid systems techniques, and spans integrated design, simulation and implementation, as the same model is used throughout. Hybrid systems control techniques are used to synthesize the elemental maneuvers and to design the protocols that coordinate the execution of elemental maneuvers within a complex maneuver. This work is part of the Inspection of Underwater Structures (IES) project, whose main objective is the implementation of an ROV-based system for the inspection of underwater structures.
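
The composition principle can be pictured with a small supervisory sketch: elemental maneuvers are designed and verified independently, and a coordination protocol sequences them into a complex maneuver. The Python below only illustrates that idea, not the IES project's hybrid-systems implementation; the maneuver names and completion tests are invented.

# Illustrative supervisor composing elemental ROV maneuvers (hedged sketch,
# not the hybrid-systems controllers used in the IES project).

class Maneuver:
    """An elemental maneuver, designed and verified on its own."""
    def __init__(self, name, is_done):
        self.name = name
        self.is_done = is_done   # completion predicate over the vehicle state

    def step(self, state):
        print(f"executing {self.name}")
        # an elemental control law would update actuator commands here

def run_complex_maneuver(maneuvers, state):
    """Coordination protocol: run each elemental maneuver until its
    completion condition holds, then hand over to the next one."""
    for m in maneuvers:
        while not m.is_done(state):
            m.step(state)
            state["progress"] += 1   # stand-in for real vehicle dynamics

# Example: dive to depth, then follow the structure wall, then hover for inspection.
state = {"progress": 0}
sequence = [
    Maneuver("dive_to_depth", lambda s: s["progress"] >= 3),
    Maneuver("follow_wall",   lambda s: s["progress"] >= 6),
    Maneuver("hover_inspect", lambda s: s["progress"] >= 8),
]
run_complex_maneuver(sequence, state)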

Relevance:

20.00%

Publisher:

Abstract:

23rd International Conference on Real-Time Networks and Systems (RTNS 2015). 4 to 6, Nov, 2015, Main Track. Lille, France. Best Paper Award Nominee.

Relevance:

20.00%

Publisher:

Abstract:

Article in Press, Corrected Proof

Relevance:

20.00%

Publisher:

Abstract:

This study aims to analyze and compare the organizational culture of four micro-firms, evaluated through the Competing Values Framework (Quinn & Rohrbaugh, 1983). Data were collected in 2011 and 2013 in firms selling the same type of software and providing the same kind of services, focusing on the years 2008-2011. Findings point to somewhat different results for micro-firms when compared to other samples in the literature. Suggestions for future research are given.

Relevance:

20.00%

Publisher:

Abstract:

Presented at the seminar "ACTION TEMPS RÉEL: INFRASTRUCTURES ET SERVICES SYSTÈMES". 10, Apr, 2015. Brussels, Belgium.

Relevance:

20.00%

Publisher:

Abstract:

The world is increasingly a global community. The rapid technological development of communication and information technologies allows knowledge to be transmitted in real time. In this context, it is imperative that the most developed countries devise their own strategies to stimulate the industrial sector to stay up to date and remain competitive in a dynamic and volatile global market, so as to maintain their competitive capacity and, consequently, sustain a peaceful social state that meets the human and social needs of the nation. The path to competitiveness through technological differentiation in industry opens a wide and innovative field of research. We are already facing a new phase of industrial organization and technology that is beginning to change the way we relate to industry, to society, and to human interaction in the world of work. This thesis develops an analysis of the Industrie 4.0 framework, its challenges and perspectives. It also analyses how Germany is approaching this future challenge and the competition it expects to win in future global markets, identifies domestic concerns felt in the national industrial fabric in facing this challenge, and proposes recommendations for a more effective implementation of a national strategy. The research method consisted of a comprehensive review and strategic analysis of the existing global literature on the topic, directly and indirectly related, in parallel with the analysis of questionnaires and data provided by entities representing industry at national and global level. The results of this multilevel analysis lead to the conclusion that building the platform to bring the future Internet of Things into the industrial environment (Industrie 4.0) is only at its beginning. This dissertation seeks to stimulate a more strategic and operational approach within society as a whole, clarifying the existing weaknesses in this area, so that the national strategy can be implemented with effective approaches and planned actions, including a training plan and a more efficient educational path for this theme.

Relevance:

20.00%

Publisher:

Abstract:

The recent technological advancements and market trends are causing an interesting phenomenon towards the convergence of the High-Performance Computing (HPC) and Embedded Computing (EC) domains. On one side, new kinds of HPC applications are being required by markets needing huge amounts of information to be processed within a bounded amount of time. On the other side, EC systems are increasingly concerned with providing higher performance in real-time, challenging the performance capabilities of current architectures. The advent of next-generation many-core embedded platforms offers the chance of intercepting this converging need for predictable high performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures integrating general-purpose processors with many-core computing fabrics. To this end, it is of paramount importance to develop new techniques for exploiting the massively parallel computation capabilities of such platforms in a predictable way. P-SOCRATES will tackle this important challenge by merging leading research groups from the HPC and EC communities. The time-criticality and parallelisation challenges common to both areas will be addressed by proposing an integrated framework for executing workload-intensive applications with real-time requirements on top of next-generation commercial-off-the-shelf (COTS) platforms based on many-core accelerated architectures. The project will investigate new HPC techniques that fulfil real-time requirements. The main sources of indeterminism will be identified, proposing efficient mapping and scheduling algorithms, along with the associated timing and schedulability analysis, to guarantee the real-time and performance requirements of the applications.

Relevance:

20.00%

Publisher:

Abstract:

The complexity of systems is considered an obstacle to the progress of the IT industry. Autonomic computing is presented as the alternative to cope with the growing complexity. It is a holistic approach, in which the systems are able to configure, heal, optimize, and protect by themselves. Web-based applications are an example of systems where the complexity is high. The number of components, their interoperability, and workload variations are factors that may lead to performance failures or unavailability scenarios. The occurrence of these scenarios affects the revenue and reputation of businesses that rely on these types of applications. In this article, we present a self-healing framework for Web-based applications (SHõWA). SHõWA is composed of several modules, which monitor the application, analyze the data to detect and pinpoint anomalies, and execute recovery actions autonomously. The monitoring is done by a small aspect-oriented programming agent. This agent does not require changes to the application source code and includes adaptive and selective algorithms to regulate the level of monitoring. The anomalies are detected and pinpointed by means of statistical correlation. The data analysis detects changes in the server response time and analyzes whether those changes are correlated with the workload or are due to a performance anomaly. In the presence of performance anomalies, the data analysis pinpoints the anomaly. Upon the pinpointing of anomalies, SHõWA executes a recovery procedure. We also present a study about the detection and localization of anomalies, the accuracy of the data analysis, and the performance impact induced by SHõWA. Two benchmarking applications, exercised through dynamic workloads, and different types of anomaly were considered in the study. The results reveal that (1) SHõWA is able to detect and pinpoint anomalies while the number of end users affected is still low; (2) SHõWA was able to detect anomalies without raising any false alarm; and (3) SHõWA does not induce a significant performance overhead (throughput was affected by less than 1%, and the response time delay was no more than 2 milliseconds).
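
The core detection idea (response-time changes that are not explained by the workload) can be sketched with a simple correlation test. The snippet below is an assumption-laden Python illustration, not SHõWA's actual statistical analysis; the thresholds, window split and function name are invented.

# Hedged sketch of workload/response-time correlation analysis in the spirit
# of SHõWA's data analysis (not the framework's actual algorithm).
from statistics import correlation, mean   # requires Python 3.10+

def looks_anomalous(workload, response_time, rise_factor=1.5, min_corr=0.7):
    """Flag a window as anomalous when response time has risen noticeably
    but the rise is not correlated with the observed workload."""
    half = len(response_time) // 2
    baseline = mean(response_time[:half])
    recent = mean(response_time[half:])
    risen = recent > rise_factor * baseline
    explained_by_load = correlation(workload, response_time) >= min_corr
    return risen and not explained_by_load

# Example: load stays flat while response time climbs -> likely a performance anomaly.
load = [100, 102, 98, 101, 99, 100, 103, 97]
rt   = [20, 21, 20, 22, 35, 40, 45, 50]
print(looks_anomalous(load, rt))   # True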

Relevance:

20.00%

Publisher:

Abstract:

A crowdsourcing innovation intermediary performs mediation activities between companies that have a problem to solve or that seek a business opportunity, and a group of people motivated to present ideas based on their knowledge, experience and wisdom, taking advantage of the technology sharing and collaboration emerging from Web 2.0. As far as we know, most of the present intermediaries do not yet have an integrated vision that combines the creation of value through community development, brokering and technology transfer. In this paper we propose a knowledge repository framework for crowdsourcing innovation that enables effective support and integration of the activities developed in the process of value creation (community building, brokering and technology transfer), modeled using ontology engineering methods.