809 results for Real-world process
Abstract:
Modeling Extract-Transform-Load (ETL) processes of a Data Warehousing System has always been a challenge. The heterogeneity of the sources, the quality of the data obtained, and the conciliation process are some of the issues that must be addressed in the design phase of this critical component. Commercial ETL tools often provide proprietary diagrammatic components and modeling languages that are not standard, and thus fail to provide the ideal separation between a modeling platform and an execution platform. This separation, in conjunction with the use of standard notations and languages, is critical in a system that tends to evolve through time and that should not be undermined when a typically expensive tool turns out to be an unsatisfactory component. In this paper we demonstrate the application of Relational Algebra as a modeling language for an ETL system, in an effort to standardize operations and provide a basis for uncommon ETL execution platforms.
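As an illustration of the general idea (the relation names, attributes, and data below are invented for this sketch, not taken from the paper), an ETL conciliation step can be written purely in terms of relational-algebra operators, independent of any execution platform:

```python
# Minimal relational-algebra operators (selection, projection, join) used to
# model a hypothetical ETL conciliation step. Relation and attribute names
# are illustrative, not from the paper.

def select(relation, predicate):
    """sigma: keep tuples satisfying the predicate."""
    return [t for t in relation if predicate(t)]

def project(relation, attrs):
    """pi: keep only the listed attributes (duplicates removed)."""
    seen, out = set(), []
    for t in relation:
        row = tuple((a, t[a]) for a in attrs)
        if row not in seen:
            seen.add(row)
            out.append(dict(row))
    return out

def join(r, s, key):
    """natural join on a shared key attribute."""
    index = {}
    for t in s:
        index.setdefault(t[key], []).append(t)
    return [{**t, **u} for t in r for u in index.get(t[key], [])]

# Source relations from two heterogeneous systems (illustrative data).
crm   = [{"cust_id": 1, "name": "Ana"}, {"cust_id": 2, "name": "Rui"}]
sales = [{"cust_id": 1, "amount": 120}, {"cust_id": 1, "amount": 80},
         {"cust_id": 2, "amount": -5}]

# The ETL step as algebra: pi_{name,amount}( sigma_{amount>0}( crm |x| sales ) )
loaded = project(select(join(crm, sales, "cust_id"),
                        lambda t: t["amount"] > 0),
                 ["name", "amount"])
print(loaded)
```

Because the step is expressed only as a composition of standard operators, any engine that implements those operators (SQL or otherwise) could serve as the execution platform.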
Abstract:
The incorporation of the Spanish university system into the European Higher Education Area has brought about a series of adaptations. Among the recommendations is the inclusion of an external training period in a company, which has resulted in significant changes in the degree syllabus in order to balance the theoretical and practical education required by the students. This new framework has been legally confirmed by the Spanish Government and, in the case of the University of Barcelona, by the publication of internal guidelines. Taking advantage of this new opportunity to adapt the Pharmacy degree to real-world problems in industry, the Dean's team of the Faculty of Pharmacy, with the support of the Faculty administrative staff and the Students Advisory Service, have assumed the challenge of including a new subject in the syllabus entitled Training in Companies. In parallel, a new activity has been set up to ensure that the students choose the most suitable company department/job for them and to help them pass the company interview. Under the name of Passport to a Profession, a series of ten explanatory talks has been scheduled every academic year. These talks deal with a broad range of topics aimed at providing the students with the basic tools they will need to make the most of a company training period and to make headway in the professional world when they finish their degree. In addition, three Faculty of Pharmacy-Pharmaceutical company workshops and two round-table conferences have been held in the last two years in order to bring the university and industry together. Notably, the project to provide students with company training is expanding on an international level, with two to three undergraduate students contracted every year by a United Kingdom-based multinational pharmaceutical company. The statistical data of the whole process have been analysed for a more in-depth understanding of the activity and to improve the programme.
Abstract:
Designing is a heterogeneous, fuzzily defined, floating field of various activities and chunks of ideas and knowledge. Available theories about the foundations of designing as presented in "the basic PARADOX" (Jonas and Meyer-Veden 2004) have evoked the impression of Babylonian confusion. We located the reasons for this "mess" in the "non-fit", which is the problematic relation of theories and subject field. There seems to be a comparable interface problem in theory-building as in designing itself. "Complexity" sounds promising, but turns out to be a problematic and not really helpful concept. I will argue for a more precise application of systemic and evolutionary concepts instead, which, in my view, are able to model the underlying generative structures and processes that produce the visible phenomenon of complexity. It does not make sense to introduce a new fashionable meta-concept and to hope for a panacea before having clarified the more basic and still equally problematic older meta-concepts. This paper will take one step away from "theories of what" towards practice and doing, and will instead take a closer look at existing process models or "theories of how" to design. Doing this from a systemic perspective leads to an evolutionary view of the process, which finally allows us to specify more clearly the "knowledge gaps" inherent in the design process. This aspect has to be taken into account as constitutive of any attempt at theory-building in design, which can be characterized as a "practice of not-knowing". I conclude that comprehensive "unified" theories, or methods, or process models run aground on the identified knowledge gaps, which allow neither reliable models of the present nor reliable projections into the future. Consolation may be found in performing a shift from the effort of adaptation towards strategies of exaptation, which means the development of stocks of alternatives for coping with unpredictable situations in the future.
Abstract:
Classical computer vision methods can only weakly emulate some of the multi-level parallelisms in signal processing and information sharing that take place in different parts of the primates' visual system, parallelisms that enable it to accomplish many diverse functions of visual perception. One of the main functions of primate vision is to detect and recognise objects in natural scenes despite all the linear and non-linear variations of the objects and their environment. The superior performance of the primates' visual system compared to what machine vision systems have been able to achieve to date motivates scientists and researchers to further explore this area in pursuit of more efficient vision systems inspired by natural models. In this paper, building blocks for a hierarchical, efficient object recognition model are proposed. Incorporating attention-based processing would lead to a system that processes the visual data in a non-linear way, focusing only on regions of interest and hence reducing the time needed to achieve real-time performance. Further, it is suggested to modify the visual cortex model for recognizing objects by adding non-linearities in the ventral path, consistent with earlier discoveries reported by researchers in the neuro-physiology of vision.
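The payoff of attention-based processing can be shown with a toy example (the "saliency" below is just pixel intensity and the image is invented; a real attention model is far richer): an attention stage selects a region of interest (ROI), and the recognition stage then runs only inside it, cutting the work roughly in proportion to the ROI's size.

```python
# Toy attention stage: find the bounding box of salient pixels, so that a
# (hypothetical) recognition stage only needs to scan the ROI rather than
# the whole image. Saliency here is simply intensity above a threshold.

def find_roi(image, threshold):
    """Bounding box (r0, r1, c0, c1) of all pixels above the threshold."""
    coords = [(r, c) for r, row in enumerate(image)
              for c, v in enumerate(row) if v > threshold]
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return min(rows), max(rows), min(cols), max(cols)

image = [
    [0, 0, 0, 0, 0, 0],
    [0, 0, 9, 8, 0, 0],
    [0, 0, 7, 9, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
r0, r1, c0, c1 = find_roi(image, threshold=5)
roi_pixels = (r1 - r0 + 1) * (c1 - c0 + 1)
total_pixels = len(image) * len(image[0])
print((r0, r1, c0, c1), roi_pixels, total_pixels)  # 4 of 24 pixels to process
```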
Abstract:
Logistics involves planning, managing, and organizing the flows of goods from the point of origin to the point of destination in order to meet some requirements. Logistics and transportation aspects are very important and represent relevant costs for producing and shipping companies, but also for public administration and private citizens. The optimization of resources and the improvement in the organization of operations are crucial for all branches of logistics, from operations management to transportation. As we will have the chance to see in this work, optimization techniques, models, and algorithms represent important methods for solving the ever new and more complex problems arising in different segments of logistics. Many operations management and transportation problems are related to the optimization class of problems called Vehicle Routing Problems (VRPs). In this work, we consider several real-world deterministic and stochastic problems that are included in the wide class of the VRPs, and we solve them by means of exact and heuristic methods. We treat three classes of real-world routing and logistics problems. We deal with one of the most important tactical problems that arises in the management of bike sharing systems, namely the Bike sharing Rebalancing Problem (BRP). We propose models and algorithms for real-world earthwork optimization problems. We describe the 3D Printing (3DP) process and we highlight several optimization issues in 3DP. Among those, we define the problem related to the tool path definition in the 3DP process, the 3D Routing Problem (3DRP), which is a generalization of the arc routing problem. We present an ILP model and several heuristic algorithms to solve the 3DRP.
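For readers unfamiliar with heuristics for VRP-type problems, the flavour can be conveyed with the simplest constructive heuristic, nearest neighbour (the distance matrix below is invented; this is not the thesis' algorithm for the BRP or 3DRP):

```python
# Nearest-neighbour route construction, one of the simplest heuristics used
# as a starting point for VRP-type problems. Distances are illustrative.

def nearest_neighbour_route(dist, depot=0):
    """Greedily visit the closest unvisited node, starting/ending at depot."""
    n = len(dist)
    unvisited = set(range(n)) - {depot}
    route, current = [depot], depot
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist[current][j])
        route.append(nxt)
        unvisited.remove(nxt)
        current = nxt
    route.append(depot)  # return to the depot
    return route

dist = [
    [0, 4, 9, 7],
    [4, 0, 3, 8],
    [9, 3, 0, 2],
    [7, 8, 2, 0],
]
route = nearest_neighbour_route(dist)
cost = sum(dist[a][b] for a, b in zip(route, route[1:]))
print(route, cost)
```

Exact methods (such as the ILP models mentioned in the abstract) instead prove optimality, at a much higher computational price; in practice the two are often combined.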
Abstract:
A major application of computers has been to control physical processes, in which the computer is embedded within some large physical process and is required to control concurrent physical processes. The main difficulty with these systems is their event-driven characteristics, which complicate their modelling and analysis. Although a number of researchers in the process system community have approached the problems of modelling and analysis of such systems, there is still a lack of standardised software development formalisms for system (controller) development, particularly at the early stages of the system design cycle. This research forms part of a larger research programme which is concerned with the development of real-time process-control systems in which software is used to control concurrent physical processes. The general objective of the research in this thesis is to investigate the use of formal techniques in the analysis of such systems at the early stages of their development, with a particular bias towards an application to high-speed machinery. Specifically, the research aims to generate a standardised software development formalism for real-time process-control systems, particularly for software controller synthesis. In this research, a graphical modelling formalism called Sequential Function Chart (SFC), a variant of Grafcet, is examined. SFC, which is defined in the international standard IEC1131 as a graphical description language, has been used widely in industry and has achieved an acceptable level of maturity and acceptance. A comparative study between SFC and Petri nets is presented in this thesis. To overcome identified inaccuracies in the SFC, a formal definition of the firing rules for SFC is given. To provide a framework in which SFC models can be analysed formally, an extended time-related Petri net model for SFC is proposed and the transformation method is defined.
The SFC notation lacks a systematic way of synthesising system models from real-world systems. Thus a standardised approach to the development of real-time process-control systems is required, such that the system (software) functional requirements can be identified, captured, and analysed. A rule-based approach and a method called the system behaviour driven method (SBDM) are proposed as a development formalism for real-time process-control systems.
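The Petri-net semantics underlying this kind of formalisation can be sketched in a few lines (a toy model of the standard firing rule, not the thesis' actual extended time-related Petri net):

```python
# Minimal Petri-net firing rule: a transition is enabled when every input
# place holds at least one token; firing consumes one token per input place
# and produces one per output place. A sketch of the kind of semantics used
# to formalise SFC firing rules.

def enabled(marking, transition):
    return all(marking[p] >= 1 for p in transition["in"])

def fire(marking, transition):
    assert enabled(marking, transition), "transition not enabled"
    m = dict(marking)
    for p in transition["in"]:
        m[p] -= 1
    for p in transition["out"]:
        m[p] += 1
    return m

# Two sequential SFC steps connected by one transition: step1 -> t -> step2
marking = {"step1": 1, "step2": 0}
t = {"in": ["step1"], "out": ["step2"]}
marking = fire(marking, t)
print(marking)  # the token has moved from step1 to step2
```

Formalising SFC this way makes properties such as deadlock freedom and reachability checkable with standard Petri-net analysis techniques.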
Abstract:
This study describes the pedagogical impact of real-world experimental projects undertaken as part of an advanced undergraduate Fluid Mechanics subject at an Australian university. The projects have been organised to complement traditional lectures and introduce students to the challenges of professional design, physical modelling, data collection and analysis. The physical model studies combine experimental, analytical and numerical work in order to develop students' abilities to tackle real-world problems. A first study illustrates the differences between ideal and real fluid flow force predictions based upon model tests of buildings in a large-size wind tunnel used for research and professional testing. A second study introduces the complexity arising from unsteady non-uniform wave loading on a sheltered pile. The teaching initiative is supported by feedback from undergraduate students. The pedagogy of the course and projects is discussed with reference to experiential, project-based and collaborative learning. The practical work complements traditional lectures and tutorials, and provides learning opportunities that cannot be gained in the classroom, real or virtual. Student feedback demonstrates strong interest in the project phases of the course. This was associated with greater motivation for the course, leading in turn to lower failure rates. In terms of learning outcomes, the primary aim is to enable students to deliver a professional report as the final product, where physical model data are compared to ideal-fluid flow calculations and real-fluid flow analyses. Thus the students are exposed to a professional design approach involving a high level of expertise in fluid mechanics, with sufficient academic guidance to achieve carefully defined learning goals, while retaining sufficient flexibility for students to construct their own learning goals.
The overall pedagogy is a blend of problem-based and project-based learning, which reflects academic research and professional practice. The assessment is a mix of peer-assessed oral presentations and written reports that aims to maximise student reflection and development. Student feedback indicated a strong motivation for courses that include a well-designed project component.
Abstract:
AIM: This work presents detailed experimental performance results from tests executed in the hospital environment for Health Monitoring for All (HM4All), a remote vital signs monitoring system based on a ZigBee® (ZigBee Alliance, San Ramon, CA) body sensor network (BSN). MATERIALS AND METHODS: Tests involved the use of six electrocardiogram (ECG) sensors operating in two different modes: the ECG mode involved the transmission of ECG waveform data and heart rate (HR) values to the ZigBee coordinator, whereas the HR mode included only the transmission of HR values. In the absence of hidden nodes, a non-beacon-enabled star network composed of sensing devices working on ECG mode kept the delivery ratio (DR) at 100%. RESULTS: When the network topology was changed to a 2-hop tree, the performance degraded slightly, resulting in an average DR of 98.56%. Although these performance outcomes may seem satisfactory, further investigation demonstrated that individual sensing devices went through transitory periods with low DR. Other tests have shown that ZigBee BSNs are highly susceptible to collisions owing to hidden nodes. Nevertheless, these tests have also shown that these networks can achieve high reliability if the amount of traffic is kept low. Contrary to what is typically shown in scientific articles and in manufacturers' documentation, the test outcomes presented in this article include temporal graphs of the DR achieved by each wireless sensor device. CONCLUSIONS: The test procedure and the approach used to represent its outcomes, which allow the identification of undesirable transitory periods of low reliability due to contention between devices, constitute the main contribution of this work.
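The per-device temporal view the article argues for boils down to computing the delivery ratio (DR) over time windows rather than as one aggregate figure. A sketch with synthetic packet logs (the timestamps and loss pattern are invented, not data from the study):

```python
# Computing delivery ratio (DR) per time window from a packet log, the kind
# of temporal view that reveals transitory low-reliability periods which a
# single aggregate DR would hide. Data below is synthetic.

def dr_per_window(events, window):
    """events: (timestamp, delivered) pairs; returns DR per window index."""
    buckets = {}
    for ts, ok in events:
        b = buckets.setdefault(int(ts // window), [0, 0])  # [sent, received]
        b[0] += 1
        b[1] += 1 if ok else 0
    return {w: recv / sent for w, (sent, recv) in sorted(buckets.items())}

# 1 packet/s for 6 s; packets at t=1..3 lost (e.g., a hidden-node collision burst)
events = [(t, t not in (1, 2, 3)) for t in range(6)]
result = dr_per_window(events, window=3)
print(result)
```

The aggregate DR over all six packets is 50%, but the windowed view separates a bad window (1/3) from a recovering one (2/3), which is exactly the distinction the temporal graphs in the article make visible.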
Abstract:
The idea behind creating this special issue on real world applications of intelligent tutoring systems was to bring together in a single publication some of the most important examples of success in the use of ITS technology. This will serve as a reference to all researchers working in the area. It will also be an important resource for the industry, showing the maturity of ITS technology and creating an atmosphere for funding new ITS projects. Simultaneously, it will be valuable to academic groups, motivating students for new ideas of ITS and promoting new academic research work in the area.
Abstract:
In a real world multiagent system, where the agents are faced with partial, incomplete and intrinsically dynamic knowledge, conflicts are inevitable. Frequently, different agents have goals or beliefs that cannot hold simultaneously. Conflict resolution methodologies have to be adopted to overcome such undesirable occurrences. In this paper we investigate the application of distributed belief revision techniques as the support for conflict resolution in the analysis of the validity of the candidate beams to be produced in the CERN particle accelerators. This CERN multiagent system contains a higher hierarchy agent, the Specialist agent, which makes use of meta-knowledge (on how the conflicting beliefs have been produced by the other agents) in order to detect which beliefs should be abandoned. Upon solving a conflict, the Specialist instructs the involved agents to revise their beliefs accordingly. Conflicts in the problem domain are mapped into conflicting beliefs of the distributed belief revision system, where they can be handled by proven formal methods. This technique builds on well established concepts and combines them in a new way to solve important problems. We find this approach generally applicable in several domains.
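The resolution pattern described (a higher-level agent using provenance meta-knowledge to decide which belief to abandon) can be illustrated with a toy model; the agent names, proposition, and credibility scores are hypothetical, not the CERN system's actual ontology:

```python
# Toy conflict resolution: agents hold beliefs tagged with provenance
# metadata; when two beliefs about the same proposition disagree, a
# specialist-style arbiter keeps the most credible one and drops the rest.
# All names and values below are invented for illustration.

beliefs = [
    {"agent": "A1", "prop": "beam_valid", "value": True,  "credibility": 0.9},
    {"agent": "A2", "prop": "beam_valid", "value": False, "credibility": 0.6},
]

def resolve(beliefs):
    """Group beliefs by proposition; on conflict, keep the most credible."""
    by_prop = {}
    for b in beliefs:
        by_prop.setdefault(b["prop"], []).append(b)
    kept = []
    for group in by_prop.values():
        if len({b["value"] for b in group}) > 1:  # conflicting values detected
            kept.append(max(group, key=lambda b: b["credibility"]))
        else:
            kept.extend(group)
    return kept

resolved = resolve(beliefs)
print(resolved)
```

In the paper's architecture the arbiter would additionally instruct the losing agents to revise their own belief bases; this sketch only shows the detection-and-selection core.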
Abstract:
Dissertation submitted to obtain the degree of Master in Informatics Engineering
Abstract:
Research and development around indoor positioning and navigation is capturing the attention of an increasing number of research groups and labs around the world. Among the several techniques being proposed for indoor positioning, solutions based on Wi-Fi fingerprinting are the most popular, since they exploit existing WLAN infrastructures to support software-only positioning, tracking and navigation applications. Despite the enormous research efforts in this domain, and despite the existence of some commercial products based on Wi-Fi fingerprinting, it is still difficult to compare the real-world performance of the several existing solutions. The EvAAL competition, hosted by the IPIN 2015 conference, helped to fill this gap. This paper describes the experience of the RTLS@UM team in participating in track 3 of that competition.
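The core of Wi-Fi fingerprinting is matching an observed received-signal-strength (RSSI) vector against a surveyed radio map, commonly with k-nearest-neighbours in signal space. A minimal sketch follows; the radio map and RSSI values are invented, and this is not the RTLS@UM team's actual competition pipeline:

```python
# k-nearest-neighbour Wi-Fi fingerprinting: find the k surveyed positions
# whose stored RSSI vectors are closest to the observed one, and average
# their coordinates. Radio map values (in dBm) are illustrative.

import math

# radio map: (x, y) survey position -> RSSI per access point
radio_map = {
    (0, 0): [-40, -70, -80],
    (5, 0): [-70, -45, -75],
    (0, 5): [-75, -72, -42],
}

def estimate_position(observed, radio_map, k=2):
    """Average the coordinates of the k closest fingerprints in signal space."""
    ranked = sorted(radio_map.items(),
                    key=lambda item: math.dist(item[1], observed))
    nearest = ranked[:k]
    x = sum(p[0] for p, _ in nearest) / k
    y = sum(p[1] for p, _ in nearest) / k
    return (x, y)

estimate = estimate_position([-45, -68, -78], radio_map)
print(estimate)
```

Real systems refine this with distance-weighted averaging, denser radio maps, and filtering over time, which is where competing solutions mainly differ.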
Abstract:
Background: Drug-eluting stents have been used in daily practice since 2002, with the clear advantages of reducing the risk of target vessel revascularization and an impressive reduction in restenosis rate by 50%-70%. However, the occurrence of a late thrombosis can compromise long-term results, particularly if the risk of this event is sustained over time. In this context, a registry of clinical cases gains special value. Objective: To evaluate the efficacy and safety of drug-eluting stents in the real world. Methods: We report on the clinical findings and 8-year follow-up parameters of all patients who underwent percutaneous coronary intervention with a drug-eluting stent from January 2002 to April 2007. Drug-eluting stents were used in accordance with the clinical and interventional cardiologist's decision and stent availability. Results: A total of 611 patients were included, and clinical follow-up of up to 8 years was obtained for 96.2% of the patients. Total mortality was 8.7%, and nonfatal infarctions occurred in 4.3% of the cases. Target vessel revascularization occurred in 12.4% of the cases, and target lesion revascularization occurred in 8% of the cases. The rate of stent thrombosis was 2.1%. There were no new episodes of stent thrombosis after the fifth year of follow-up. Comparative subanalysis showed no outcome differences between the different types of stents used, including Cypher®, Taxus®, and Endeavor®. Conclusion: These findings indicate that drug-eluting stents remain safe and effective at very long-term follow-up. Patients in the "real world" may benefit from drug-eluting stenting with excellent long-term results.